Monday, March 31, 2008

Don't Know

The latest issue of Public Opinion Quarterly includes an article asking a simple question: should survey questions about political knowledge include a "don't know" response, or should respondents be forced to guess if they do not know the answer?

A series of articles by Jeffery Mondak and his colleagues suggests that perhaps it is best to create a forced choice. Encouraging "don't knows," they argue, "invites a guessing response set" perhaps linked to personality differences, and thus biases the measure.

Most measures of political knowledge include a "don't know" or "no answer" alternative. Is there a systematic bias? Can this screw up the results in some underlying way? A POQ piece by Patrick Sturgis, Nick Allum, and Patten Smith that showed up in my mailbox today suggests that "don't know" is, at least from what I can tell today, a perfectly fine approach to questions of knowledge. Their experiment finds no obvious distortions, although they recommend further work to tease out potential issues.

My own gut feeling is that "don't know" is a good alternative, since we can't really tell guesses, even good guesses, from bad ones that by sheer dumb luck turn out to be correct. I've also wondered whether the 1 = correct, 0 = otherwise scheme is a good way to code knowledge items. Shouldn't an incorrect response be coded as -1, with 0 reserved for no answer or don't know, and 1 for correct responses? An incorrect response seems to me qualitatively different from no answer at all. And yet, like many others, I use the 1 if correct, 0 otherwise approach.
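For the data-minded, here's a minimal sketch of the two coding schemes side by side. The variable names and response values are hypothetical, not drawn from any real survey.

```python
# A minimal sketch of the two coding schemes for knowledge items.
# Column names and response values are hypothetical.
import pandas as pd

answers = pd.DataFrame({
    "resp_id": [1, 2, 3, 4],
    "q_knowledge": ["correct", "incorrect", "dont_know", "correct"],
})

# Conventional scheme: 1 if correct, 0 otherwise (lumps wrong answers
# in with "don't know").
answers["k_binary"] = (answers["q_knowledge"] == "correct").astype(int)

# Alternative scheme: -1 incorrect, 0 don't know / no answer, 1 correct,
# treating a wrong answer as qualitatively different from no answer.
trichotomous = {"incorrect": -1, "dont_know": 0, "no_answer": 0, "correct": 1}
answers["k_trichotomous"] = answers["q_knowledge"].map(trichotomous)

print(answers)
```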

Old habits, even old methodological habits, die hard.

Friday, March 28, 2008

Screwing the Pooch (NES knowledge questions)

To screw the pooch means to royally screw up. That may be a bit of an overstatement, but there are signs that the ANES seriously had intimate relations with the pooch in how it handled political knowledge questions over several years.

Background: The ANES (American National Election Studies) conducts surveys every two years and is the source of much of the political research published in academic journals.

In a 24-page memo on serious issues concerning the validity of its knowledge questions, the ANES notes that "ANES political knowledge coding from this period may provide misleadingly unflattering portraits of the American public’s possession of factual information on the topics covered by ANES political knowledge questions."

Oops.

This matters. It matters because journal articles have been written based on these data. It matters because whole friggin books have been written based on these data.

In fairness, the ANES folks do a helluva job, and I'm sure they are deeply embarrassed and baffled and probably a bit pissed over all this. And there is a silver lining, or so they argue. After a couple of scholars discovered the problem, the ANES leaders concluded:

We are grateful to Professors Gibson and Caldeira for bringing questions about the political knowledge data to our attention. We also thank the ANES and SRC staffs for facilitating our investigation of past practices. We apologize to the user community for any negative consequences that past ANES practices have for your inquiries and scholarship. We are committed to do our best to avoid such mistakes in current and future ANES endeavors.

We view these discoveries as opening up exciting opportunities for new and important scholarship on political knowledge. Public opinion watchers thought they knew how knowledgeable Americans were about political facts, but now, the extent of public knowledge is less clear. We hope the new data ANES will release in the future will inspire scholars to conduct innovative explorations of public knowledge, perhaps yielding new insights into the extent and role of information in the realm of political behavior.

The report includes a long, somewhat complicated, but useful set of appendices detailing all the factors that played a part in this methodological fiasco. I'm not sure exactly what all this means, how to correct for it, or even whether to use ANES data for political knowledge research at all. Gotta sit, read, think -- and probably drink.

Thursday, March 27, 2008

CNN Knowledge Test

CNN's web site has a knowledge test of 10 questions.

I nailed eight of 'em. I pass!!!

That's all well and good, but why the hell don't they have a way to see how others have done? I want downward comparison, to feel I'm smarter than all the other schmucks who visit cnn.com.

btw, I guessed on a couple -- but guessed correctly. I had no idea which actress had recently posed topless (yeah, yeah ... Hollander should know that), but I had seen her face on CNN one day without catching the audio of the story. Thus, I guessed right.

Monday, March 24, 2008

Interest versus Coverage

The fine folks at Pew often compile numbers on what stories people follow, and sometimes they mesh those with what the news media have actually covered. For you PhDweeb types, this sounds a lot like agenda-setting, the tired idea that the media do a terrible job of telling people what to think but do a helluva job telling people what to think about.

There's a nice match in this report between what the press covered and what people were interested in. Scroll down and see how media coverage is related to personal interest. The 2008 campaign is first in both, and the Spitzer story likewise ranks high in both. It's interesting that only 1 percent of coverage focused on the death of the UNC student (who was from Athens, Ga., my town, a sad story), but 7 percent of Americans were interested. A slight disconnect.

Why does it matter what people think about?

It colors everything. It frames an election. It influences how you consider a candidate or a policy issue. Hell, it affects everything. Political knowledge is one thing, but how a campaign or issue is framed has more persuasive power. The PR folks know this. So do the advertising folks. And the pros who plot political campaigns.

People need to know it too.

Friday, March 21, 2008

Only Tyrants ...

I'm looking at some old books and their mention of political knowledge. Here's a great quote from Lectures on History and General Policy by Joseph Priestley, published in 1791:
Only tyrants, and the friends of arbitrary power, have ever taken umbrage at a turn for political knowledge, and political discourse, among even the lowest of the people.

This is a defense of what people should know at a time when the "low" were considered incapable of playing a role. "Political knowledge, it will be said, is useful only to politicians and ministers of state," Priestley writes, then takes apart the argument.

Not all the old stuff is so thoughtful. Here's a favorite from Sketches of the History of Man by Henry Home, Lord Kames, published in 1774:
The progress of political knowledge has unfolded many bad effects of a great city, more weighty than any urged in the proclamations. The first I shall mention is, that people born and bred in a great city are commonly weak and effeminate.

Gotta love these old guys.

Wednesday, March 19, 2008

Pay for Grades

Below is an abstract of a study I blogged about earlier. Basically, paying people improves how well they do on a test of political knowledge. Clearly they are motivated by the cash.

Okay ... so?

A fad of late has been paying kids to do better in school. This makes sense, especially now that I think of it in terms of the study below. If all the other ways of increasing motivation fail (family pressure, peer pressure, the hopes of a good job, the joy of learning for the sake of learning) then why not pay the little darlings to get good grades? In this era of accountability (i.e., no child left untested), this strikes me as a very American approach. Cold, hard cash. The invisible hand. Money talks.

I'm serious. Why not?

The study abstract follows.


Surveys provide widely cited measures of political knowledge. Do seemingly arbitrary features of survey interviews affect their validity? Our answer comes from experiments embedded in a representative survey of over 1200 Americans. A control group was asked political knowledge questions in a typical survey context. Treatment groups received the questions in altered contexts. One group received a monetary incentive for answering the questions correctly. Another was given extra time. The treatments increase the number of correct answers by 11–24%. Our findings imply that conventional knowledge measures confound respondents' recall of political facts with variation in their motivation to exert effort during survey interviews. Our work also suggests that existing measures fail to capture relevant political search skills and, hence, provide unreliable assessments of what many citizens know when they make political decisions. As a result, existing knowledge measures likely underestimate people's capacities for informed decision making.

Monday, March 17, 2008

Knowing Iraq

In a new report:
Public awareness of the number of American military fatalities in Iraq has declined sharply since last August. Today, just 28% of adults are able to say that approximately 4,000 Americans have died in the Iraq war. As of March 10, the Department of Defense had confirmed the deaths of 3,974 U.S. military personnel in Iraq.

The proportion of people who could correctly name the number of military fatalities in Iraq has been as high as 55% but has never been as low as it is now. Iraq has left the front page; it's left the top of the newscast. We have Obamillary, we have the economy. Hell, we have March Madness.

Republicans dropped from 53% to 26% correct. Democrats dropped from 49% to 30%. In an odd twist, the older you were, the greater the drop in knowledge about military deaths. No idea why. The drop is also bigger among the better educated than among the less educated, probably in part because the less educated had a lower number to begin with. A floor effect.

The good news is, 84% know that Oprah supports Obama. That's news you can use.

Wednesday, March 12, 2008

Spring Break

Due to spring break, I'm not really posting or thinking of posting or doing anything that resembles posting.

Er, except for this post...

Thursday, March 6, 2008

Knowledge vs Emotion

In social science PhDweebdom, we call emotion "affect" just to confuse budding grammarians who struggle with the difference between effect and affect. The relationships among cognition (stuff you know), affect (stuff you feel), and motivation (what moves ya to do something) are complex, baffling, and altogether enough to drive one crazy.

So, why bother?

In my continuing saga of studies I'd like to see, I would love to know what people "know" about Obama versus what they "feel" about him, and which better predicts support. I have no partisan issue here, but I suspect affect plays a bigger role in support than does cognition. I'm not suggesting Obama supporters don't think; I'm just wondering whether emotion is a key factor, overwhelming cognition. I suspect it is.

In general, other than party identification, the key factors in who you vote for at the presidential level are competence and integrity. Some voters lean one way, some the other. And individual elections can frame the race around one versus the other. A good example is 1976 and Jimmy Carter, a post-Watergate election in which integrity became the main factor. How people weigh these two is a fascinating question, and it's my non-scholarly sense that Democrats lean more on competence and Republicans lean more on integrity -- but that's more of a gut feeling than one based on hard data.

A fun test would be asking Democrats where they stand on issues, using the traditional NES 7-point scales, then asking them to place Obama and Clinton on the same scales, and seeing where they fall. There's a neat web site that kinda does this, asking you a lot of issue questions and then telling you whom you should support -- the candidate most fitting your stance.
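For the curious, the logic behind such a site boils down to issue proximity: measure the distance between your placements and each candidate's on the same scales, and recommend the closest. Here's a toy sketch; the issue names and candidate placements are invented for illustration, not real NES data.

```python
# A toy issue-proximity matcher in the spirit of those candidate-match
# sites. All placements below are invented for illustration.

respondent = {"iraq": 3, "health_care": 2, "taxes": 4}  # 1-7 NES-style scales

candidates = {
    "Candidate A": {"iraq": 2, "health_care": 2, "taxes": 3},
    "Candidate B": {"iraq": 5, "health_care": 4, "taxes": 6},
}

def distance(resp, cand):
    # Sum of absolute gaps across the 7-point scales; smaller = closer match.
    return sum(abs(resp[issue] - cand[issue]) for issue in resp)

for name, positions in candidates.items():
    print(name, "distance:", distance(respondent, positions))

best = min(candidates, key=lambda name: distance(respondent, candidates[name]))
print("Closest match:", best)
```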

Didn't work at all for me, at least at the moment. Clearly I'm not cognitive.