Friday, June 17, 2011

Don't Know means, apparently, Don't Know

In the latest Journal of Politics, Robert Luskin and John Bullock examine whether "don't know" as an available response to questions about political knowledge can affect the results.  The study itself is here (assuming you have the same access as I do).

In survey research, we often talk about DK ("don't know") responses: how to present the option, and whether presenting it in a certain way encourages people to take the easy road and simply say they don't know rather than coming up, with a little more mental effort, with an answer.  This matters, at least to those of us who study the knowledge of the American electorate.  Encouraging "don't know" as a response can lead to a more disappointing portrait of the public's knowledge.  Or, to flip it, discouraging the response should paint a better picture.
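To make the stakes concrete, here's a toy sketch of my own (the numbers are made up, not from the study) showing why the handling of DKs matters: scoring DKs as incorrect versus excluding them can move an aggregate knowledge estimate quite a bit.

# Toy illustration (hypothetical numbers, not the authors' data): how the
# treatment of "don't know" (DK) responses shifts a knowledge estimate.
responses = ["correct"] * 40 + ["incorrect"] * 25 + ["dk"] * 35  # 100 hypothetical respondents

# Scoring DKs as incorrect: the conventional, more pessimistic portrait.
pct_dk_wrong = sum(r == "correct" for r in responses) / len(responses)

# Excluding DKs (as if respondents had been nudged into giving substantive
# answers that split like the others'): the more flattering portrait.
substantive = [r for r in responses if r != "dk"]
pct_dk_dropped = sum(r == "correct" for r in substantive) / len(substantive)

print(f"DK scored as incorrect: {pct_dk_wrong:.0%}")   # 40%
print(f"DK responses excluded:  {pct_dk_dropped:.0%}")  # ~62%

Same answers, very different pictures of the public, which is exactly the kind of gap the authors set out to test.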

The authors used two national survey experiments to explore whether discouraging DKs matters.  As they write:
Discouraging DKs does paint a more comforting picture of the public’s knowledge of politics—but, as the foregoing shows, only slightly so in the open-ended case and spuriously so in the closed-ended one. Anyone searching for large caches of hidden knowledge, it appears, should look elsewhere.
Given that most of the time we rely on closed-ended questions to tap the public's political knowledge, the results basically tell us that fiddling with "don't know" as a response, usually by discouraging it or not offering it as a readily available response alternative, won't really improve the picture.  Indeed, the authors argue that DKs should not be discouraged.  The only question, they argue, is whether to encourage them.

Yes, this gets all manner of methodologically geeky, but for those who study what people know, and who craft surveys or other instruments designed to measure it, these results are important.

Full journal cite: The Journal of Politics, Vol. 73, No. 2, April 2011, pp. 547–557.