Thursday, June 30, 2011

Knowledge and Support for Civil Liberties

It's a longstanding assumption that the more educated and knowledgeable a person is, the greater his or her support for civil liberties.  We see the same kinds of relationships between education/knowledge and tolerance.  The knowledge-civil liberties assumption is rarely tested because, well, it's an assumption.

A study in a recent Journal of Politics (volume 73, April 2011, pp 463-476) actually tests this assumption.  And finds it wanting.

Written by five folks from Yale and one (sympathy?) author from Michigan State, the study uses a field experiment in 59 high school classrooms to find that enhanced civics classes lead to greater constitutional knowledge than conventional civics classes do.  No surprise there.  But unlike previous assumptions and correlational studies, it also found no increased support for civil liberties.  As the authors write:
The findings imply that students can become more knowledgeable about the Constitution and workings of government without experiencing a concomitant shift in their support for free speech, dissent, or due process. This pattern of results is striking both because this experiment has ample power to detect even small effects and because, as noted above, the statistical test is biased in favor of finding a mediating relationship. The theoretical implications of this finding are profound. Evidently, it is possible to increase awareness and understanding of civil liberties without producing an increase in support for those civil liberties. This finding therefore calls into question the longstanding argument that beliefs and attitudes are causally linked in this domain.

To put this in somewhat less theoretical words -- oh crap.

The question of whether knowledge is associated with, or independent of, attitudes has been of some interest lately.  Approaches such as motivated reasoning point to this separation, finding that basically people believe what they want to believe, regardless of the factual nature of, um, the facts.

The results above are more complicated than that, but they do suggest very different avenues of cognition and belief.  Or, as the authors suggest, it may be that knowledge and attitudes are "causally disconnected."  As they say:
One cannot increase support for civil liberties simply by teaching students about the provisions of law that are designed to protect these liberties.

Wednesday, June 29, 2011

Twitter May Not Make Us Stupid, But ...

There is of course a silly argument semi-raging online about whether Twitter makes us stupid.  Of course it doesn't.  Making us dumber is the job of reality television, especially Jersey Shore.

But this does raise the question of whether exposure to Twitter is associated with greater political knowledge.

The answer?  No one knows.

Few if any studies directly examine the micro-blogging site Twitter and any potential political learning.  A number of studies have examined the motivations for reading blogs, Internet use among the young and participation, and even the troubling consequences of online rumors, which lead to misperceptions rather than knowledge.  So far I haven't seen any solid, systematic attempts to explore whether Twitter leads to greater knowledge.  They may be out there, but I haven't found them.

And so, like any good blogging academic, I'm gonna make stuff up. 

Or rather, I'll extrapolate what we know about media consumption and political knowledge to answer the question of whether Twitter makes us politically smarter, politically dumber, or (to not bury the lede) probably makes no difference at all.

The Dumber Argument

Twitter, as social media, automatically gets frowned on, just as television news did when it emerged to overtake print news consumption -- and not without good reason.  TV news does suffer from delusions of adequacy, but we know now that watching news on TV actually leads to greater knowledge -- for those who have little prior knowledge, who are less educated, or who are less interested in public affairs. 

In other words, TV news works best for those who don't know all that much in the first place.  But what about Twitter?  The argument for dumber rests in large part on the billions of stupid posts that litter the Twittersphere.  Never mind the good posts, the 140-characters-or-less comments and pointers to good journalism.  There is a time-wasting aspect to Twitter, certainly, and the dumber argument rests largely on the combination of stupid posts and time spent on a medium that presents itself in fragmented, oddly phrased posts.  It's not a bad argument, but it's wrong.  There are no data suggesting that reading tweets makes you dumber or even shortens your attention span.  Twitter does steal time from more meaningful content, but then again, many tweets point toward compelling content a reader might otherwise have missed.

The Smarter Argument

There is little data to support this argument either.  Some correlational data may suggest greater political knowledge is associated with using Twitter, but that's a function of the kind of people who choose to tweet or read tweets -- people who happen to already be news junkies.  Even statistical controls for various demographics may not capture all the factors that explain what people know better than Twitter use does. 

For the chattering class and those who follow the chattering class, no doubt Twitter is a boon.  But does it lead to greater knowledge?  No.  Those folks are probably learning little they don't already know, which is probably a lot by traditional tests of political knowledge.  Twitter is another tool, but it's not making anyone who likes to use it any smarter.  There's a ceiling effect here.  But what about Twitter users who are not news junkies?  Odds are they're not any smarter, politically, for using the medium, and indeed their social use of tweets would probably drive down the political knowledge scores of Twitter users overall.  Which leads us to the next category ...

Twitter Makes No Difference

This argument rests largely on three things: the lack of data, the discussion above (news junkies hit a knowledge ceiling while social users offset any high scores Twitter users might get on tests of political knowledge), and an appreciation for a finely crafted null hypothesis. 

There are strong methodological arguments that exposure to mainstream news sources may appear unrelated to political knowledge, in part due to measurement error associated with the kinds of exposure questions we ask in surveys.  I don't buy that particular argument, though it's loved by political scientists, who tend to shy away from, or explain away, media factors in understanding politics.  The media have a modest but significant relationship with knowledge, but Twitter -- once you control for other factors, and given its unique audience -- will not have enough variance left to explain. 

In other words, Twitter will be found to be unrelated to political knowledge.  It doesn't make you smarter, it doesn't make you dumber.  At least not politically.  People who are drawn to Twitter to escape the news, those folks fled the news long ago.

Tuesday, June 28, 2011

We're All Brands Now?

I'm not sure I want to be a brand.  But I doubt that matters.

There's a debate mini-raging over Gene Weingarten's original Washington Post column, in which he complains that "branding" is destroying journalism and that j-schools are "urging their students to market themselves like Cheez Doodles."  Or, as he says:
We are slowly redefining our craft so it is no longer a calling but a commodity. From this execrable marketing trend arises the term you ask me about: “branding.”

It's this response by Mathew Ingram that nicely sums up the counter argument.  "We are all brands now," he says.  "So get used to it."  As Ingram says:

I hate to be the one to break it to Weingarten, but the journalism business as a whole is becoming a commodity in many ways. But it’s not journalists and media organizations that are redefining it as such, it is the market itself — and the fact that media is becoming something that anyone can do. The tools for publishing and becoming a “media brand” are available to anyone now thanks to blogs and Twitter and Facebook, and that has made the world of media and journalism a lot flatter, as NBC White House correspondent Chuck Todd noted in a recent interview with the Poynter Institute.

So we're all brands now.  Hell, this blog alone suggests I am a brand, or at least that I try to be a brand, though clearly not much of one.  Just look at my PeerIndex number and you'll see.  No one's buying.

But we've always been brands, if you take a broad definition of "brand" and think of impression formation and maintenance.  Yes, being a "brand" strikes many as unseemly and unsavory.  It suggests shameless self-promotion.  Bob Woodward is a brand.  So, sadly, is Perez Hilton.  Are you a brand even if you refuse to think of yourself as a brand?  I'm not sure.  The argument here is muddled by a lack of precision in what we mean by "brand" and whether brand is in the eyes of the beholder, or holder, of that brand name.

And what's this got to do with what people know?  Hold on, I'm trying to get there.

Given the nature of this blog, I'm not going to dive into the "brand is bad for journalism" argument other than to say that if you spend time building your brand, that's time you've not spent doing good journalism.  There are only so many hours in a day.  I also suspect, but can't prove, there is a negative correlation between academics' attempts to brand themselves via social media (er, kinda like me) and the scholarship they actually produce.  If I had time, I'd test this hypothesis by comparing tweets against peer-reviewed publications.  I think I'm right.

So does the hypothesis above also apply to doing journalism?  Probably, to some degree.  Again, only 24 hours in a day.  But by building your "brand" as a journalist, you also open more doors, you gain more leverage, you gather about you the trust of followers who not only want to hear or read the stories you have to tell, but also want to share those stories with their friends.  Remember, recommendations by "friends" do matter to people, as new research shows, even more so than a person's feelings toward a specific news organization.  In other words, if I'm a conservative I generally ignore MSNBC, but friend recommendations, as one experiment shows, can trump this partisan preference.

Okay, I'm off track a bit.  Does it matter, all this branding stuff, to what people learn from the media?  Absolutely, though I don't know of any research that directly backs up this assertion.  My gut feeling, based on nearly 25 years of being immersed in social science research and more years than that as a journalist or journalism professor, is that trust matters in how people expose themselves to a news source and what they get out of that source.  Never mind the sharing part, which will also matter.  Let's look only at source->receiver relationships.  With more trust comes more careful reading/listening to stories, and with that should come greater knowledge about public affairs. 

In an indirect way, then, branding helps us, the consumers of news, find news we can trust, just as the New York Times is a brand and Fox News is a brand.  Individual journalists, especially columnists, have long been "brands" in every sense of the word.  Beat reporters are also brands, though very localized ones.  Although I'm an old print newspaper guy (who uses the hell out of his iPad), I think we (the royal journalistic we) can balance our branding with our newsgathering and storytelling.  But branding alone -- hollow, shameless self-plugging -- will never replace good reporting and good writing. 

I hope people will see through that, and respond appropriately. 

Monday, June 27, 2011

News with a Partisan Slant?

I wrote a couple of days ago about a new Public Opinion Quarterly study finding that consuming "likeminded news" (news that agrees with one's partisan point of view) actually increases certain aspects of political participation.  In that piece I promised to review the study author's categorization scheme for which television news programs qualified as slanted toward Republicans, Democrats, or neutral.  Here's my take.

First, let's recreate the list from the study's appendix (Table A1, p. 312, if you have access to POQ).  Remember these are 2008 programs, so some have disappeared.  I comment after each category, only briefly, then I'll get into details after the list on how the author assigned programs or networks to the categories.

Programs with a Republican Slant

The Beltway Boys
Fox & Friends
Fox News
The Fox Report with Shepard Smith
Geraldo at Large
etc etc., all of them Fox News programs...

Okay, I stopped listing them here because they're all Fox News shows, and I think most of us might agree with the slant (except for Studio B with Shepard Smith, which to me is one of the more "straight" presentations on Fox, so I quibble with that one).


Programs with a Democratic Slant

ABC Nightline
Anderson Cooper 360
BET News
CNN Headline News
The Colbert Report (really, it's in there)
Countdown with Keith Olbermann
The Daily Show (really)
Good Morning America
Hardball with Chris Matthews
Late Edition with Wolf Blitzer
MSNBC Live
Out in the Open
Situation Room
The View
This Week with George Stephanopoulos

As you can tell, a bunch of major networks and their shows.  Again I have quibbles, especially with Situation Room, which strikes me, as a journalism guy with some small experience, as fairly straightforward in its presentation.  And it's amusing to see Stewart and Colbert in there as news programs, but I think we can all buy into that as well.


And finally, Neutral Programs

ABC World News
CBS Evening News
America This Morning
CBS Morning News
etc etc. all three major networks and morning shows
Frontline
Larry King Live
Lou Dobbs
60 Minutes
McLaughlin Group
20/20
NewsHour
Reliable Sources
Meet the Press

Okay, we can quibble with the list above too.  Lou Dobbs?  Neutral?  McLaughlin Group?  All in all, the list above seems mostly right to me, but of course your first methodological question should be -- how the heck do you systematically decide what program or network goes in which category?  Good question.  The explanation, also found in the article's appendix, is complicated.  Let's walk through it.
  • What candidate does a program favor?  This was a follow-up question to respondents who reported certain media exposure.
  • Mix and Match.  "I took into account respondents' perceptions of partisan slant for both the program and its parent network," says the author, trying to assign a category to unmentioned programs or to further justify the categorization.
  • A statistical threshold of 25 percent was used to make sure the categories were firm.  More on this in the appendix.
  • A Lexis-Nexis search was used for rarely mentioned programs, using various terms like liberal or Republican to assign a slant to a program or its host.
This is some creative work, and I've only skimmed the detailed effort above to assign programs to categories that any reasonable, informed person might (or might not) agree with.  But it's important that some systematic approach be used, not just gut feeling, when doing this kind of work.  So I applaud the effort while quibbling with a few of the assignments.  After all, in large part we're talking about general respondent impressions of a news organization or program and its slant, not the reality of its journalistic approach.  Given the constraints, this is an excellent stab at categorizing programs and networks.

The Power of Social Influence

As I teach a graduate class on social media this summer, I've skimmed or read deeply a large body of work on the effects of Twitter, Facebook, etc., on our social and political perceptions -- deciding whether or not to include them in class.  Today we're discussing a couple of studies that suggest a dramatic shift is taking place in how people organize their news consumption habits, and the consequences for what stories they consume. 

One interesting point is this -- that picking your news by partisan predispositions (conservatives to Fox, liberals to MSNBC, and so on) is being shattered by social media.  A set of experiments shows that, instead, individual recommendations outweigh source preference.  In other words, if I'm conservative and I love Fox News, that's all well and good, but a recommendation from a "friend" via social media will trump that preference.

Given we tend to selectively expose ourselves to likeminded others -- people hang out with people like themselves -- this trumping of news source by friend recommendation may not mean so very much.  Or it may mean everything.  It's too early to tell, but it raises some interesting questions as we watch the media audience fragment along partisan lines.  Perhaps personal recommendations are a way to bring the news consumption universe back into order.  Or perhaps they'll make things worse.  Again, it's too early to tell, but my gut says a growing reliance on friend recommendations will only increase the likelihood that we consume news that tells us what we want to hear.  And that's the bad news of the day.

Friday, June 24, 2011

Like-Minded News and Political Participation

A fascinating study published in the latest Public Opinion Quarterly asks a very simple question -- does consuming news you agree with make you more likely, or less likely, to participate in politics?

The answer?  More likely to participate.  But with limitations.

There is a growing body of work on the migration of partisans to news that generally agrees with their political point of view (as in, conservatives to Fox, liberals to MSNBC).  This study by Susanna Dilliplane uses national panel data and some sophisticated modeling to explore whether these trends lead to greater or less political participation.  It's terrific work (and yes, it cites two of my published studies, another big plus.  End shameless plug).

There are competing theories and findings as to whether partisan media have mobilizing or demobilizing effects.  People generally agree that exposure to competing viewpoints is a good thing (see the work of Mutz, for example).  But does such exposure actually help, or hurt, when it comes to participation?
 
Using the 2008 presidential election as a test, the research finds "substantial support for the proposition that exposure to partisan news affects political participation, particularly behavior during the campaign."  For example, consuming likeminded partisan news increases campaign activity and encourages people to make an early decision about the election.  Consuming news that conflicts with your partisan point of view can have the opposite effect.  So what is generally considered good -- hearing lots of viewpoints -- can demobilize you, making you less likely to participate.


But, and this is a major but that deserves special boldface and italic treatment, such partisan exposure is unrelated to actual voting.  This may be due in part to the use of within-subjects rather than between-subjects analysis.  After all, this is panel data, so you look at change within the same people over time.  That doesn't leave a lot of room, statistically speaking, to find effects on something like voting.  One hopes it's not a methodological artifact and that partisan viewing/reading actually does not influence likelihood to vote.  One hopes.

On a methodological note -- how do you decide whether a television news network is likeminded or not?  Yeah, we can all agree Fox or MSNBC is likeminded for conservatives or liberals, respectively.  Here the author does some nifty combining of data to categorize the media.  Nifty, though some might quibble with relying on public perceptions of a news organization's partisan leanings.  The appendix breaks these down for you.  Republican slant?  Fox News and nearly all of its specific programs (O'Reilly, for example).  Dem slant?  Colbert and Stewart show up here, as do MSNBC and its various programs, and CNN's Situation Room (I'd quibble with that one, a lot).  Neutral?  A bunch of stuff.  I'm going to blog about this categorization scheme another time.

What's the takeaway here?  While we applaud exposure to various viewpoints, it may actually not be good when it comes to people participating in politics.  The theoretical underpinnings of this finding are in the article, so I won't go into detail here.  But it is interesting to note that exposure to more neutral news coverage shows hints here of also mobilizing folks.  That's good news for those of us who prefer a world of multiple viewpoints presented in news stories rather than a single narrative preached by certain (ahem) cable news networks.

Wednesday, June 22, 2011

Playing with Wordle

What's the blog look like as a word cloud?  Here's Wordle's version:

Wordle: What People Know Blog 1

This probably only looks at the blog postings on the first page, but it's kinda fun.  Click on it, I think, to see a better image.

Dumb TV Shows Make You Dumber?

Without a reporter looking to interview someone about this research I would have missed it.  The study's main point? Watching dumb TV, wait for it, makes you dumber.  Shocked?

A story about the study is here.  And the journalism guy in me admires this lede:
Take note, fans of mindless reality shows like "Jersey Shore": New research suggests watching something dumb might make you dumber. In other words, you are what you watch.


That last line, for a lot of folks ... ouch.  This is basically a priming experiment, at least as described in the news story.  I'm having a hard time finding the actual study in Media Psychology, but as described, subjects were randomly assigned to a control group or one of two conditions, read a script (important methodological note -- read, not watched), and then answered some political knowledge questions.  Those who read the "dumb" script, and you can read the story yourself for details, did less well on the political knowledge test. 

Priming is the theoretical underpinning here, and I can buy that.  It's not that people were made dumb by the script so much as they were primed to approach a test in a dumb frame of mind, to slip into a mode in which serious thought is set aside.  Reading a dumb script, and we can assume watching a dumb show, somehow slips our mind into neutral and deeper thought gets shoved aside.  This is not unlike watching a smart program or listening to classical music and then perhaps doing better on a test not associated with either.

Keep in mind, though, that priming effects are short-lived, or as we say in social science, the effect quickly erodes over time.  So you'd need a constant Jersey Shore fix to keep that dumb frame of mind going strong.  Then again, anyone who watches that program in the first place probably already qualifies as not exactly a rocket scientist.

A reporter from Atlanta may call me to discuss this, which will be kinda fun if it happens.  How often do you get the chance to make fun of Jersey Shore?  Not enough, that's my position.

Tuesday, June 21, 2011

Fox Viewers Misinformed?

Jon Stewart visited Chris Wallace of Fox News and said the following:
Who are the most consistently misinformed media viewers?  The most consistently misinformed? Fox, Fox viewers, consistently, every poll.
Love your work, Jon.  And I sympathize with your frustrations.  But it ain't so. Politifact even fact checked Stewart and found out -- it ain't so.

But Politifact gets it kinda wrong too.

There is misinformed.  There is uninformed.  I think Stewart really meant what he said -- misinformed -- while Politifact appears to focus on a different concept -- uninformed.

Allow me to split some conceptual hairs here.

Uninformed is fairly straightforward.  People do poorly on tests of political knowledge.  Politifact notes that while Fox viewers in general don't do particularly well on such tests, viewers of specific Fox programs, such as The O'Reilly Factor, do just as well as Stewart's own audience for The Daily Show.

Misinformed, though, is different.  For example, Fox viewers back in ancient times kept believing there were weapons of mass destruction in Iraq well after it became obvious to the rest of the planet that there weren't any.  I think this is what Stewart meant: Fox viewers are misinformed because of the slant, the angle, the conservative bent and narrative of the network, even in its "straight" news.  That's being misinformed, not uninformed.  Even Politifact admits this by saying: "This study is probably the strongest support we found for Stewart’s claim, in part because the difference between Fox and the other news outlets was so stark, and in part because the questions asked have pretty clear-cut "right" and "wrong" answers."

As much as I respect Politifact, score this one for Stewart.  The truth-o-meter should not read False. 

IQ Tests

For some reason I've never written much about IQ tests.  I vaguely recall taking them as an elementary school student, though of course we never heard our actual score -- though I suspect the behavior of the nuns at Sacred Heart School toward me can be explained by my unimpressive score.

Okay, enough about me.  IQ tests are an interesting subset of what people know, more like how people think, and I point to this Ten Interesting Facts about IQ Tests mainly because the folks there were nice enough to contact me about them. 

A couple of the points really do kinda sorta fit how we learn about politics and the media.  For example, IQ tests measure only "certain intellectual skills."  We talk a lot about the role of cognitive ability in persuasion and information processing.  It's a key factor that often helps decide how we process information, which in turn can influence the consequences of such information (i.e., whether we're persuaded).  Intellectual skills probably play into this.  In a related sense, "environmental factors" can play a role in IQ.  From a media processing perspective, this is absolutely true -- everything from socio-demographics to where you are when you hear/read the news to a host of other factors.

And my favorite: IQ tests are fallible.  I only wish the nuns at Sacred Heart had read that one many years ago.  But on a methodological point, researchers need to keep in mind that our questions are often little better than blunt instruments designed to get at deeper concepts.  Education is often used as a surrogate for cognitive ability, but I think we'd all agree it's far from perfect.  It's just the best -- and easiest -- thing we have in a survey scenario.

Friday, June 17, 2011

Don't Know means, apparently, Don't Know

In the latest Journal of Politics, Robert Luskin and John Bullock examine whether "don't know" as an available response to questions about political knowledge can affect the results.  The study itself is here (assuming you have the same access as I do).

In survey research, we often talk about DK (don't know) responses: how to present this option, and whether presenting it in a certain way encourages people to take the easy road and simply say they don't know rather than coming up, with a little more mental effort, with an answer.  This matters, at least to those of us who study the knowledge of the American electorate.  Encouraging "don't know" as a response can lead to a more disappointing portrait of the public's knowledge.  Or, to flip it, discouraging this response would paint a better picture.

The authors used two national survey experiments to explore whether discouraging DKs matters.  As they write:
Discouraging DKs does paint a more comforting picture of the public’s knowledge of politics—but, as the foregoing shows, only slightly so in the open-ended case and spuriously so in the closed-ended one. Anyone searching for large caches of hidden knowledge, it appears, should look elsewhere.
Given that most of the time we rely on closed-ended questions to tap the public's political knowledge, the results basically tell us that fiddling with "don't know" as a response -- usually by discouraging it or not offering it as an easily available alternative in a survey scenario -- won't really improve the results.  Indeed, the authors argue that DKs should not be discouraged.  The only question, they argue, is whether to encourage them.
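
To see why discouraging DKs flatters closed-ended results only "spuriously," here's a minimal simulation in the spirit of the argument -- every number in it is hypothetical, not from the study.  Knowers always answer correctly; non-knowers either say "don't know" or, when DKs are discouraged, guess among four options.

```python
import random

random.seed(42)

N = 10_000       # simulated respondents (hypothetical)
P_KNOW = 0.40    # share who genuinely know the answer (assumed)
N_OPTIONS = 4    # closed-ended item with four response options

def measured_knowledge(discourage_dk):
    """Mean proportion scored 'correct' on one closed-ended item."""
    correct = 0
    for _ in range(N):
        if random.random() < P_KNOW:
            correct += 1   # genuine knowledge
        elif discourage_dk and random.random() < 1 / N_OPTIONS:
            correct += 1   # a lucky guess, not hidden knowledge
    return correct / N

print(f"DK offered:     {measured_knowledge(False):.3f}")   # ~0.40
print(f"DK discouraged: {measured_knowledge(True):.3f}")    # ~0.55
```

The discouraged-DK condition looks about 15 points "smarter," but every extra point is chance, not the cache of hidden knowledge the authors say you won't find.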

Yes, this gets all manner of methodologically geeky, but for those who study what people know, and who craft surveys or other instruments designed to measure it, these results are important.

Full journal cite: The Journal of Politics, Vol. 73, No. 2, April 2011, Pp. 547–557.

Thursday, June 16, 2011

A few quick mentions on a busy afternoon, just to feed the blog and give you something to consider or to point you in the direction of some interesting stuff.
  • There's a huge new Pew Center report on social media out today.  I'll be reading it this weekend for a discussion Monday in my Consequences of Social Media grad seminar (thank you Pew for the perfect timing).  It's an 85-page pdf, so it'll keep the students busy and off the streets this weekend.  And it does a nice job of summing up the stuff we've been discussing all week and will discuss next week.  I may discuss this more next week if any of it fits my blog theme.
  • More to the point, there was a story in the NYTimes the other day about how poorly U.S. students do on tests of history.  Indeed, as the lede says, "American students are less proficient in their nation’s history than in any other subject."  That alone is enough to make high school history teachers sob themselves to sleep.  Only 12 percent of U.S. high school seniors demonstrated proficiency on the exam, and the story comes complete with the startlingly low correct responses to the traditional questions.  Read it and weep.
  • Here's an interesting opinion piece out of Canada (and how often can you write those words?) in which the author states: "I want to challenge the myth that so-called "real" news is democracy's oxygen while "infotainment" has the deadly effect of carbon monoxide."  The argument is straightforward, and one I'm sympathetic to -- that news must engage the audience, else it exists in a vacuum and becomes, for many, irrelevant.  Some call this dumbing down, some call it marketing.  But there's no doubt the royal we of journalism are damn good at making the important brain-numbingly dull.  It's an old argument: the prescription model of news (here's your medicine, I know it tastes bad, but it's good for you) versus the sugar-filled model of news (all celebrities, all the time).  The days of the former are over.  Now we just need to strike a meaningful balance in order to both make money doing journalism and fulfill our obligations to democracy and an informed electorate.

Tuesday, June 14, 2011

What the Public Knows ...

Only 38 percent of U.S. adults know which house of the legislature the GOP controls, according to a Pew report (full report here).  Graph below.

Monday, June 13, 2011

Changes Over Time

Do people care about politics and public affairs as much as they did years ago?  Has their trust in government changed?  And do they think they can make a difference?  Here are a few graphs, generated thanks to the ANES web site, to get at some of these questions.  In other words, I'm gonna cut and paste.

How about general public interest?  Below is a graph of changes over time:


As you can see above, political interest increased during the 1960s and then dropped, thanks to a Watergate effect, in the 1970s and into the 1980s.  It's remained largely constant, with a bit of a random walk year to year, since about 1980.  So we can't argue that the public's interest has waned.  Then again, it's not all that high either.  These are the folks who responded with very high interest, and the numbers sit at about one-quarter of U.S. adults.

Okay, how about trust?    Glad you asked.

Trust dropped significantly during the 1960s (Vietnam, race) and the 1970s (Watergate), inched up in the 1980s (Reagan, Morning in America), dropped again in 1994 (GOP takeover), began a nice increase during the 1990s and 2000s, and then dropped like a rock in recent years (the Obama effect?).  You can come up with any number of other explanations for these shifts up and down and all around, but my favorite would probably be that pounding against government, from left or right, combined with real-world events, damages the government's reputation with the public.

Another concept we're often interested in is efficacy.  That is, do people think they can make a difference?  The graph below captures that by flipping the data to describe those who do think they can have an effect on politics.
  
As you can see above, the question has to do with politics being "too complicated" for one to understand.  These are the "disagree" folks, so they represent people who feel they can understand politics and make a difference.  The proportion of these folks decreased during the same period that trust went down, but then remained relatively steady until recent years, when there's a bit of jumping up and down.  Smoothing the data by looking only at presidential election years takes some of this randomness out.  In general, efficacy seems to be slightly decreasing of late.

What does all this mean?  Recent years look a lot like previous ones, but there are also significant changes.  If you go to the site, linked above, and play with the graphs, you can see for yourself across any number of variables and concepts, from attitudes about specific issues to voting.  A lot of useful stuff here.

Thursday, June 9, 2011

Obama is Muslim

I've published on the misperception that Barack Obama is Muslim (Journal of Media and Religion, 2010).  While searching the net I came across this paper, much more sophisticated than my own, that attacks the same question using implicit and explicit measures of racism and the statistical magic of structural equation modeling.  I find the following very compelling:
Predispositions such as ideology, partisanship, and even race affect how individuals feel about Obama. This evaluation, in turn, motivates individuals to believe misinformation about the President, which creates implicit associations between Obama and Islam in long-term memory. Finally, these automatic associations increase the likelihood of perceiving and explicitly stating that Obama is likely a Muslim. Interestingly, political sophistication mitigates explicit associations, but it has no effect on implicit ones.
There's a lot to like about this study, the top reason being it cites me, but it's also quite complex in its methodological approach.  Some might quibble with the measure of political sophistication, which to me looks a whole lot like knowledge.  Here's the endnote on this point:

The political sophistication scale (M = 0.57, SD = 0.26; KR20 = 0.70) consisted of correct responses to the following items (correct answers and proportions in parentheses): 1) Responsibility to determine constitutionality of laws (Supreme Court; 74%); 2) Harry Reid’s job (Senate Majority Leader; 28%); 3) majority needed to override presidential veto (2/3; 64%); 4) more conservative party at national level (Republican Party; 92%); 5) current number of Supreme Court justices (9; 49%); 6) Hillary Clinton’s job (Secretary of State; 63%); 7) Constitutional authority to declare war (Legislative branch; 51%); and 8) name of current Supreme Court Chief Justice (John Roberts; 34%).
As you can see above, it's basically a measure of knowledge.  There's some debate, not really touched on in this paper, about how exactly one conceptualizes and measures sophistication.  Knowledge is adequate but, to me, not sufficient -- it fails to capture motivation as well as ability (indeed, a standard surrogate for motivation is political interest, also missing in the model).  Then again, I'm a mass comm guy, so I'd like to see news exposure/attention built into either the sophistication index or as a standalone variable.
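
For the methods-minded: the KR20 = 0.70 in that endnote is the Kuder-Richardson 20 reliability coefficient, the binary-item cousin of Cronbach's alpha.  Here's a minimal sketch of the computation, using made-up toy data rather than anything from the paper:

```python
from statistics import pvariance

def kr20(responses):
    """Kuder-Richardson 20 reliability for 0/1 item scores.

    responses: one list per respondent, each holding k binary items.
    """
    k = len(responses[0])
    n = len(responses)
    # sum of item variances: p (proportion correct) times q = 1 - p
    pq_sum = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        pq_sum += p * (1 - p)
    var_total = pvariance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - pq_sum / var_total)

# toy data: 5 respondents x 4 knowledge items (entirely made up)
toy = [
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(kr20(toy), 2))  # 0.8 for this toy matrix
```

A 0.70 on an eight-item scale is serviceable rather than sterling, which fits my point elsewhere on this blog that our measures are often blunt instruments.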

But political scientists hate media variables.

When it comes to news, however, they're not above discussing the media even if media variables aren't necessarily included in their models.  Near the end of the piece comes this bit:
Many partisans tacitly and sometimes explicitly support misinformation by publicly questioning Obama’s faith, or at a minimum fail to correct those who do not. And since Islam has been portrayed negatively, particularly since the September 11th attacks, the media share at least some of the blame, as many news outlets linked Muslims with terrorism. Some radio and television commentators were direct in drawing this association, while others were more subtle, inferring such a relationship through coverage omissions or photographic selection of terrorists such as Osama bin Laden (Jackson 2010).
So the media can have an impact.  We're just not going to study it.

Yeah, snarky.  I don't mean to sound condescending.  This is an excellent study, methodologically above anything I'd try (then again, I have two brothers-in-law who are world-class PhDs in biostatistics and who call structural equation modeling little more than smoke and mirrors).  It's a great piece of work, saved to the appropriate folder as I wrestle with an Obama-birther study of my own.

And yes, that one includes media variables.  End snark (for the moment).

Wednesday, June 8, 2011

Objective Journalism -- Here to Stay?

The last thing I want to get into is one of those tired debates about objectivity in journalism (it's an approach to information gathering, not an adjective to describe journalists themselves), but this ABC News piece caught my eye today.  I strongly recommend listening to the audio rather than reading the story (and how often will you find me, Mr. Print Guy, suggesting something like that?).  The author of a study and book, Prof. Ron Jacobs, says there is a "push back" against opinion-style stuff that poses as journalism.  One hopes so.

Let's assume for the moment there is a push back against the Foxification of news.  That's a big assumption, but let's go with it.  What are the consequences in terms of what people know versus what people feel or think?  The easy answer is that we'd have more fact-based instead of emotion-based attitudes and opinions.  That's the easy answer, but I don't know that it's necessarily true.  The American public has never been terribly consistent in its political beliefs, and it's hard to imagine the media making all that much difference -- or at least you'd think so looking at most of the political science literature, which tends to view the news and entertainment media as relatively unimportant in the grand scheme of things.  Indeed, there's a strain of political science research aimed at downplaying the role of news exposure/consumption, on either methodological or conceptual grounds, in part because the dominant model still relies heavily on party identification and associated concepts.

Back to the supposed push back. It's hard to identify from which segment of society this might emerge.  The chattering political class, those high in knowledge but also in partisan identification?  No, that seems unlikely.  The great unwashed middle part of America?  Maybe.  One would hope so.  But if that were the case, CNN would be improving in its audience numbers at the expense of MSNBC and Fox.  So far, that's not happening.  Hopeful thinking on the part of certain elites?  Probably yes.  Count me among them, except that I hardly qualify as elite.

To be fair, I'm relying on a brief ABC bit and I've not read the book, so until then I can't really truly comment on the premise in any detail, other than to say -- I hope it's right.

Tuesday, June 7, 2011

What People Read (or Buy, or something)

Thanks to colleague Karen Russell for pointing this out to me on Facebook, and to colleague Karen King for her comments.  In desperate need of something to blog about, I shamelessly lift this from my FB wall and plop it here.  This New Yorker piece briefly discusses Amazon.com's analysis of cities and book buying.  Here's the list below.

1. Cambridge, Mass.
2. Alexandria, Va.
3. Berkeley, Calif.
4. Ann Arbor, Mich.
5. Boulder, Colo.
6. Miami, Fla.
7. Salt Lake City, Utah
8. Gainesville, Fla.
9. Seattle, Wash.
10. Arlington, Va.
11. Knoxville, Tenn.
12. Orlando, Fla.
13. Pittsburgh, Pa.
14. Washington, D.C.
15. Bellevue, Wash.
16. Columbia, S.C.
17. St. Louis, Mo.
18. Cincinnati, Oh.
19. Portland, Ore.
20. Atlanta, Ga.

The article's author, Macy Halford, offers some interesting explanations for the data -- which can in some instances make perfectly good sense (Cambridge, Mass.; Berkeley, Calif.) and at times puzzle to the point of distraction (Miami, Fla.; Orlando, Fla.).  Keep in mind these are per capita results, thus they control statistically for population size.  There are any number of good explanations for why Atlanta makes the Amazon book-buying list but New York City does not.  The availability of great bookstores is a good one.

Here's where the confusion comes in:
  • When we think of books we automatically think of fiction, perhaps even serious fiction, but this Amazon list covers all books.  Boulder, Colo., for example, leads in the Cooking, Food, and Wine category.  Alexandria, Va., is tops in children's books, thereby pushing it high on the overall purchase list.  For all we know, Atlanta may make the list thanks to Christian fiction or nonfiction books.
  • Measuring per capita makes good sense.  It does control for population size.  We do this in crime statistics too, such as murders per 10,000 people, so apples are compared to apples, oranges to oranges, and murders to murders (see the quick sketch after this list).  But you can screw the pooch here.  Where you draw the lines for "Atlanta" makes all the difference in computing crime (and book sales) statistics.  Even the pros who do this for a living make mistakes.
  • This is a direct measure only of buying some type of book (good or crap) from Amazon.com -- not of reading books.  So the good-bookstore argument above may confound the results.  The only way to get at "reading" versus "buying on Amazon" is survey data that asks specifically how often you read a book.
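
The per-capita arithmetic itself is trivial; the trouble lives almost entirely in the denominator.  A quick sketch with made-up numbers (none of these figures come from Amazon):

```python
# Hypothetical purchase counts and populations -- illustrative only.
purchases  = {"Cambridge, MA": 52_000, "New York, NY": 900_000}
population = {"Cambridge, MA": 105_000, "New York, NY": 8_400_000}

for city in purchases:
    # per-capita rate, expressed per 10,000 residents
    rate = purchases[city] / population[city] * 10_000
    print(f"{city}: {rate:.0f} purchases per 10,000 residents")
# Cambridge, MA: 4952 purchases per 10,000 residents
# New York, NY: 1071 purchases per 10,000 residents
```

Redraw the "New York" boundary, or move a big chunk of book buyers in or out of the census denominator, and the ranking shuffles.  Same data, different lines, different list.
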
All in all, I love lists.  They give us something to argue about, or to make fun of (Madison, Wisc., where are you?  And where's Athens, Ga.?).  But as I often preach on this blog when talking about media and political knowledge, it's the methodology that gets you most in trouble. 

Amazon calls this a measure of the "best read cities."  It's not.  It's a measure of "most Amazon-buying cities."

Monday, June 6, 2011

Summing Up

Here's a nice blog post that sums up research about what people know.  It's quite insightful about the standard text, how its data are getting a bit old, and the significant questions about how we measure political knowledge.  On a day when I'm actually running out the door and out of town, it's a good item to point to rather than coming up with something useful to say myself.

I might, if time allows, quibble tomorrow with this graph:
One final point. People who read about politics in the newspaper know more about politics than others, but only when it comes to a particular area: the people and political parties involved in politics. In other words, reading the newspaper doesn't seem to help people understand how politics works or the substance of political issues. As Delli Carpini and Keeter point out, knowledge and understanding of politics depends not only on the will to learn about politics and the resources to achieve that goal, but also the opportunity to learn about it in the first place. 
 But that's for another day, when I'm not traveling.

Friday, June 3, 2011

Sarah Palin -- History Buff

When we speak of what people know (or don't know), Sarah Palin has her own special category.

Women Don't Know No Politics

It's a fairly standard, and often undiscussed for obvious reasons, finding in political knowledge research -- women do poorly as compared to men in tests of what people know.  I've blogged about this before.

I bravely return to this topic.

Why?  Because there's a study published in the Journal of Politics that attempts to explain why gender differences occur.  It uses a strategy I've blogged about before, one previous studies have used, to help explain why women tend to do less well than men.  A number of theories have been posited to explain this gap.  Men guess more than women on such questions, and the way we construct a political knowledge index tends to score "no answer" and an incorrect answer the same.  In other words, a few of men's guesses will be right by chance alone (on a four-option question, about one in four), giving men an advantage over women, who, if they don't know, will simply say they don't know.  But a more plausible explanation is the kinds of questions we ask.

Or, as Kathleen Dolan writes in her JOP piece:
When we include political knowledge measures that ask for information on the present state of women in American politics, we see women’s traditional gender disadvantage wiped out. Women in the survey reported here hold as much information about women’s place in politics as do men. This offers support for the notion that knowledge levels are, in part, a reflection of the content of the items we employ.

To put it simply, methodology matters.  Unfortunately there are no media variables in the study, and I'd be curious to know whether exposure to the news plays any role in creating the differences, or in evening them out, depending on the kinds of questions one asks of men and women.  It's an interesting question, one worth exploring.

Wednesday, June 1, 2011

That Facebook Guy vs the Speaker of the House

We all use Facebook, and apparently more young people can name the guy who created it than can name the Speaker of the U.S. House of Representatives, so says this new Pew survey.

In all fairness, Mark Zuckerberg did have a movie made about him.  John Boehner?  Well tanned, he runs the people's house, and he tears up occasionally -- not the same as a major film or 500 million "friends."

The results are interesting, especially the age breakdown (see below).  Among 18-29 year olds, 63 percent correctly identified the founder of Facebook (Zuckerberg).  Only 21 percent in that age bracket correctly identified the Speaker (Boehner).  So let's jump to the 50-64 year olds.  Among that illustrious group (um, mine), 58 percent knew Zuckerberg (not half bad) and 58 percent knew Boehner (respectable).  In other words, older respondents have a more well-rounded knowledge, or at least that's my interpretation and I'm sticking to it.

A Washington Post blog discusses the results, with writer Scott Clement suggesting it has to do with education and interest.  As he writes:
Young people, on average, have lower levels of education, are more likely to identify with the Democratic Party and are generally less interested in politics — all of which are linked to lower levels of political knowledge generally and, at least for education and partisanship, knowledge of Boehner in particular. Wide awareness of Zuckerberg among young people may be due to the fact that 86 percent of Internet users ages 18-29 use social networking sites.
Plausible hypotheses, to be sure.  Ability (education) and motivation (interest) play key roles in predicting political knowledge, and, in general, young people tend to score lower on tests of political knowledge.  A less likely hypothesis, also raised by Clement, is the role of party affiliation.  Yes, young people may be more likely to be Democrats, but I doubt they'd do any better identifying the Senate majority leader (a Dem) than they did identifying the Speaker of the House (a Republican).

The full table is reproduced below.