Sunday, January 31, 2010

Republicans Smarter than Democrats?

Republicans outscore Democrats in political knowledge, according to this Pew report.  However, this comes with some caveats:
Republicans, on average, answered one more question correctly than Democrats (5.9 vs. 4.9 correct). These differences are partly a reflection of the demographics of the two groups; Republicans tend to be older, well educated and male, which are characteristics associated with political and economic knowledge. 
Okay, so maybe if you control for age, education, and sex, then all will be equal?  Not necessarily, the Pew folks say:
Still, even when these factors are held constant, Republicans do somewhat better than Democrats on the knowledge quiz.
In other words, Republicans are smarter even if you hold age, education, and sex constant.  But what if you hold constant, statistically, a host of other factors that may also explain what people know?  There are some socio-demographic factors you can toss in the model (race being one politically incorrect one, income being another) that will also explain some of the variance, probably enough to even out partisan differences.  But you have to ask, what's the point?  If these political and social differences are what make up partisan alignment, then controlling for those very same factors to "even out" the differences doesn't make a lot of sense.  Let's just agree that the people who consider themselves Republicans score higher on tests of political knowledge than those who consider themselves Democrats.
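For the statistically inclined, "holding constant" here just means a regression with the demographics entered as controls. Below is a minimal sketch of the idea -- hypothetical file and variable names, not Pew's actual model:

```python
# Sketch of "holding constant" via OLS (hypothetical file/variable names).
# knowledge = quiz score; party = 1 for Republican, 0 for Democrat.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pew_quiz.csv")  # hypothetical data file

# Bivariate gap: roughly the one-question difference Pew reports.
print(smf.ols("knowledge ~ party", data=df).fit().params["party"])

# With controls: if the party coefficient stays positive and significant,
# the knowledge gap survives adjustment for age, education, and sex.
model = smf.ols("knowledge ~ party + age + education + sex", data=df).fit()
print(model.summary())
```

The same logic applies to the race and income point above: toss them into the formula and watch what happens to the party coefficient.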

Below is one table from the Pew report.  Go to the link above to see the entire report.

Thursday, January 28, 2010

State of the Union Polling

President Barack Obama apparently did well in Wednesday night's State of the Union address, at least according to the polls.

For example, a CNN poll had 48 percent of those who watched saying they had a "very positive" reaction to the speech and 10 percent a "somewhat positive" reaction.   Twenty-one percent had a negative reaction.

Ya gotta figure more Obama supporters than detractors watched, but probably not a whole lot more, so the 21 percent seems a reasonable negative number for what seemed to be a good speech. He could have given the Gettysburg Address and that 21 percent would still have responded negatively, so let's put all of this in perspective. What people think about a speech is informed largely by what they think about the guy giving the speech. Predispositions are difficult to overcome, but Obama was aiming this speech not at the Limbaughites and Hannitites of the world but rather at that huge middle group of "independents" he -- and the Democrats -- so desperately need to hold onto to continue governing (and not get trounced in the 2010 elections).

In a different poll by CBS News, eight out of ten Americans approved of the proposals Obama floated in the speech.  This was an "instant" poll online, so take it for what it's worth.  Full text of poll results here.

Want evidence of a speech effect on what people perceive? Fifty-seven percent before the speech thought Obama shared their priorities for the country. After the speech? Seventy percent thought so. You find the same increase in agreement with, or support for, Obama among speech viewers.
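A 13-point shift is well outside sampling error at typical poll sizes. A back-of-the-envelope check -- my arithmetic with assumed sample sizes, and note that CBS re-interviewed the same panel, which strictly calls for a paired test:

```python
# Rough two-proportion z-test for the 57% -> 70% shift. Sample sizes are
# assumed (n = 500 per wave); the real design re-interviewed one panel,
# so treat this as a back-of-the-envelope check only.
from math import sqrt

p1, p2, n1, n2 = 0.57, 0.70, 500, 500
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
print(f"z = {z:.2f}")  # ~4.3, far beyond chance at these sizes
```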

Will this short-term gain hold?  Usually not, but the Obama folks have to be happy with the results.

What People Know about Science ... kinda

There's an abstract out describing a massive study of 10,000 or so undergrads over 20 years, what they know about science, and the relationship that knowledge has with attitudes. Unfortunately the full study is not available, and the abstract doesn't tell us a helluva lot.

Here's a bit of interesting methodology, or analysis strategy if you like:
Responses to belief questions were categorized, using theoretically derived categories, remodeled and confirmed through factor analysis, into five main categories; belief in life on other planets, faith-based beliefs, belief in unscientific phenomena, general attitude toward science and technology, and ethical considerations.
Okay, I can buy this.  And then:
Analysis revealed that demographic information explained less than 10% of the overall variance in students’ forced-answer scientific literacy scores.
That's a bit surprising. In plain English, it means basic demographics like sex or age or race or whatever they included didn't really separate those who know a lot from those who know a little. Then again, we're talking not about a general population, where there will be huge demographic differences, but rather about students, who may differ, but let's face it -- they're not real people.
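For anyone curious what "remodeled and confirmed through factor analysis" looks like in practice, here's a minimal sketch with invented item names -- the generic technique, not the study's actual code:

```python
# Sketch of grouping belief items into factors (invented item names,
# not the study's actual instrument).
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("belief_items.csv")  # hypothetical Likert responses
fa = FactorAnalysis(n_components=5, rotation="varimax").fit(items)

# Which items load on which of the five factors?
loadings = pd.DataFrame(fa.components_.T, index=items.columns)
print(loadings.round(2))
```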

And the big result:
We present how students’ beliefs in these categories relate to their scientific literacy scores. 
Argh!  I'd love to know how they relate, but apparently you have to either wait for the movie or attend the conference where this is being presented.

And I've never seen this before in an abstract:
Stop by our poster and fill out a new survey that will give us important parallel information to help us continue to analyze our valuable data set. 
Weird. It makes sense, I've just never seen someone do this in a research abstract. May have to try it myself some time.

Wednesday, January 27, 2010

America's Most Trusted Name in News -- Fox News?

A survey reports that Fox News is the most trusted of the television news sources.
Our newest survey looking at perceptions of ABC News, CBS News, CNN, Fox News, and NBC News finds Fox as the only one that more people say they trust than distrust. 49% say they trust it to 37% who do not.
Not sure why MSNBC is excluded.  Or PBS.  Or Comedy Central with Jon Stewart or Stephen Colbert.

For years CNN has run that "most trusted name in news" tagline, so now I have to wonder whether they'll rethink it based on this survey.

But, the methodology says:
PPP conducted a national survey of 1,151 registered voters on January 18th and 19th. The survey’s margin of error is +/-2.8%. Other factors, such as refusal to be interviewed and weighting, may introduce additional error that is more difficult to quantify.
Not sure why you'd want registered voters when asking about trust in news, since many people watch the news who are not registered voters. Curious, or not so curious, given how "registered" voters tend to skew a bit toward the Fox News demographic. Even so, in fairness, Fox is kicking butt in the ratings, so given its audience size you'd expect it to win this contest. A better way would be to collapse all three broadcast networks and see how they stack up.
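The +/-2.8 percent, at least, checks out; it's just the standard margin-of-error formula at work:

```python
# The reported margin of error is the usual 95% formula at p = 0.5.
from math import sqrt

n = 1151
moe = 1.96 * sqrt(0.5 * 0.5 / n)
print(f"+/-{moe:.1%}")  # ~2.9%, close to the +/-2.8% PPP reports
```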

The actual news release, in annoying pdf format, is here. It's an automated poll, as you can see if you scroll down the pdf file I listed above, so the findings may unravel in a more traditional, systematic survey. You'll also find some interesting crosstabs. Key points I found:
  • CNN wins among moderates. It also does better among women than men, but Fox beats 'em in both categories.
  • Fox beats CNN among whites but not in any other racial/ethnic category.
  • In the major age groups, Fox beats CNN among 18-29 year olds and 46-55 year olds, but CNN wins among those in the middle group (excluding here the major networks).
JUST ADDED -- a really good analysis of this poll by the ABC polling guy, Gary Langer, is here.  Worth the read.  He raises all kinds of points I wish I'd thought of, damn it.

    Tuesday, January 26, 2010

    Online Learning

    Online enrollment in higher education (college) has increased 17 percent in a year, according to a new report. I find this interesting for a number of reasons: first, I teach at a traditional land-grant university, and second, I spend a lot of time online and thinking about how online will change journalism and what people know. This study is about learning on demand, part of the larger entertainment-on-demand culture that the Internet has made so much a part of our lives.

    People want it -- it being pretty much anything these days, from information to entertainment to pizza -- when they want it and not when we want to give it to them.  More and more, this will include education.

    Earth Exploding?

    My favorite quote of the day so far. Context: The National Enquirer of supermarket tabloid fame is submitting its John Edwards stories for a Pulitzer Prize. Thus spoke the executive editor of that fine publication:
    I think the members of the mainstream media would rather see the earth explode first than to reward us with a Pulitzer Prize. 
    Yeah, that about sums it up.  Hell freezing over.  Earth exploding.  Glenn Beck awarded sainthood.  Any of these seem reasonable alternatives.

    Monday, January 25, 2010

    Census Redux

    I discussed a few days ago a "census" document I received in the mail from the Republican Party. Below is how the envelope looked:

[image of the envelope]

    Obviously misleading, suggesting it's a Census document when really it's a mailing to people who have voted in the past for Republican candidates (yes, I have). The survey itself is pretty much what you'd expect. Below are small images of the four-page survey. You can click on any page and see a better, but not great, image.

    In terms of quality -- setting aside the ethically questionable and legally tricky matter of posing as a real "census" in a Census year -- the survey is okay. I blogged before about Question 7 and how it collapses media and asks about "Internet Blogs" as if there are blogs other than on the Internet. There are no questions that strike me as terribly loaded given the source of the survey. You'd expect this point of view from one party about the other. And of course the final page includes all the ways I can give money, if I so choose (er, I don't, thank you very much, given six days of furloughs by the State of Georgia).

    Is it illegal to pose as a Census document? Yes. Does this mailing cross that line? Nah, probably not. Sleazy, perhaps, but not illegal.

    The entire questionnaire, for your enjoyment:

[images of the four-page questionnaire]

    Saturday, January 23, 2010

    Wow

    So I blogged a week or so ago about this semi-bogus "census" mailing I received that was really a survey by the Republican Party (yeah, I vote GOP sometimes, Democrat sometimes, I'm basically a radical moderate).

    Someone on AAPORNet mentioned Friday a similar mailing and I posted the image of the one I received on the blog and pointed to it on the listserv.  Wow.  I just checked the Google Analytics for Friday and my number of hits for a given day increased 20-fold.  Freaked my analytics out.  Here's a map that shows where all the hits originated for that page.
    Again, wow.  Too bad I didn't have Google Adsense turned on.  Might have made some money.

    This is a good demonstration of the power of interconnectedness and networks.  If I'd put it on my Facebook and Twitter accounts as well, I might have generated even more traffic.

    Friday, January 22, 2010

    Hell Freezes Over

    The National Enquirer as a Pulitzer Prize contender? Yep, says Howard Kurtz of the Washington Post in today's column, for its groundbreaking coverage of the John Edwards mistress mess. In an ironic twist, this week Edwards admitted the woman's kid was his. People were shocked! shocked!

    Is it a likely winner?  Hell might freeze over, but global warming will come to the rescue. 

    The rag won't even earn a finalist spot, in part because of all the crud they run, in part for their reporting methods, in part because frankly this is not the kind of story that wins a Pulitzer.  Dick Cheney destroyed this guy ages ago in their VP debate.  The Enquirer was just piling on.

    Knowledge about the Census

    The fine folks at Pew have looked at what people know about the upcoming Census. It's one of those good news-bad news results. At first blush, the findings include the following:
    The survey also probed knowledge of some basic facts about the census. Most Americans know that the census is used to decide states’ representation in Congress (64%) and that the census is not used to locate illegal immigrants so they can be arrested (68%). But just 31% know that participation in the census is required by law.

    So fewer than a third realize that the law requires their participation, but the other two knowledge questions demonstrate pretty good numbers.

    Thursday, January 21, 2010

    Campaign Finance Limits -- Killed

    The U.S. Supreme Court just overturned a law that kept corporations from spending money on campaign ads. NYTimes story here. Pdf of decision here.

    Why connect this to a blog about political knowledge and what people know?

    As I've discussed at length, as people flee news in a preference for more entertaining fare, it becomes more difficult to inform them of the issues of the day, or those in a political campaign. Advertising, therefore, becomes even more important in "informing" voters, and now with this decision, corporations can flood the airwaves with advertising to swing elections. A more misinformed voter is certain to emerge. The Supreme Court's take may be legally correct here, but the outcome is not going to be as pretty as one might hope.

    Flacks and Flak and Journalism

    I love The Atlantic. I get an actual hard copy in the mail every month -- ya know, the kind with ink on paper. A brief piece is out on what'll happen to journalists fleeing a dying biz, often into PR, and whether this is a good thing.

    Wednesday, January 20, 2010

    Blogging as "Intellectual Contribution"

    Just for the hell of it, I mentioned my writing of this blog on my annual report to my department head and dean of stuff I accomplished in 2009.  I listed What People Know under intellectual contributions.  False advertising?  Oh yeah, you bet.  Then again, last year under my plans for the year I listed my resolve to exercise more, which tells you how seriously I take these kinds of bureaucratic tasks.  On a more positive note, I did get to list a manuscript I just had accepted at the Journal of Media & Religion (with only minor revision).  More on this later.

    Skimmers and Scanners

    Further evidence people skim and scan the news?  According to this E&P piece, new research shows that an increasing number of people visit news aggregators like Google News and 44 percent rely just on that bit of info and don't visit the actual source of the news -- usually a newspaper-based site.

    Most of this is a business issue between those who produce the news and those who do little more than point to it, but from a what-people-know perspective it is further evidence that we simply do not process the news as deeply as we once did. This can have profound effects on political knowledge and a deeper understanding of local, national, and world events. Or you can take this away -- it's all part of our shift from knowing, to knowing how to find stuff. Two different takes, but neither is promising when it comes to social and political judgments.

    Saturday, January 16, 2010

    Veggie Tales

    Like 4 million or so other suckers, I tuned in to Iron Chef a couple of weeks ago when four chefs went at it using veggies from the White House garden.

    Except, they didn't actually use those veggies.

    According to this story, the Iron Chef folks used stunt veggies and misled the audience.  What we thought were radishes and stuff pulled from Michelle Obama's garden were instead probably pulled from the produce aisle of the nearby Kroger.

    Throughout the show the chefs talked about how great the veggies were from the WH garden. Yeah, maybe, but for some reason they had to mislead us throughout the commercials for the competition and the program itself. I feel like I've been had -- with a really big squash.

    Below, their own YouTube bit:

    Friday, January 15, 2010

    Haiti

    When it comes to what people know about the disaster in Haiti, you have to hope they're not getting their info from crazies like Rush "Don't Help The Suffering" Limbaugh and Pat "Everyone but me is the Anti-Christ" Robertson.  An article in the LATimes discusses how CNN has nine journalists on the scene while that cutting-edge "talk about news but don't actually report any" Fox News has a single guy.

    Thursday, January 14, 2010

    Census Mail -- Sorta

    In my mail yesterday was a white envelope with black letters on the right side that said

    DO NOT DESTROY
    OFFICIAL DOCUMENT

    Holy crap?  What official document?  Through a piece of cheap clear plastic I can read:

    Census Document Registered To:


    Mr. Barry Hollander
    blah blah address blah

    Holy crap? The Census has already started? Yeah, it's 2010, but no way they're far enough along to be sending me mail.

    Look harder:

    SPECIAL NOTICE: You have been selected to represent Republican voters in Georgia's 10th Congressional District.  Enclosed Please find documents registered in your name.

    And larger:

    DELIVER EXCLUSIVELY TO Mr. Barry Hollander

    In other words, a Republican survey because, yes, I occasionally vote Republican so I'm on their list.  The survey asks how often I vote, what party I prefer, my age, and "from what media source do you regularly receive your political news?"  Bunch of choices here, but how they're grouped is fascinating.

    • NBC/CBS/ABC are grouped together.  Okay, I can buy that.  The differences are subtle, but it's a perfectly good way to collapse your data.
    • But CNN/MSNBC together? That's just dumb, unless of course you believe that CNN's lack of a conservative bias equals a liberal bias. The two are very different.
    • Fox News. Gets its own special category. Gee, wonder why?
    • And a bunch of online stuff, like Facebook/MySpace (never mind how different they are), and Twitter, and Internet Blogs (as opposed to non-Internet blogs?).  
    • And of course there are magazines and radio and other stuff, but NOT talk radio.  Interesting.

    The rest of the survey is straightforward political stuff and the wording is not so skewed as to make the results unreliable.  But best of all, at the end, it asks for money.  Unmarked bills are acceptable, best I can tell, as long as they are between $25 and $500.

    Plus I have to certify that the answers are my own.  Oh, please.

    Wednesday, January 13, 2010

    What's Up With Cognitive Mobilization?

    Scholarly research has its fads, just like music or fashion.  Often, research chases the money -- grant money, that is.  Other times it chases the new hot thing, like Twitter or The Daily Show.  But I'm baffled by my blog statistics that consistently show people searching for cognitive mobilization.

    Yes, I've written about it here and here, and yes, my first ever published piece was on cognitive mobilization (in a forgettable academic journal that doesn't really exist any more, called Mass Comm Review). But now it seems the topic has, if not exploded on the scene, at least generated some new interest.

    There are recent pieces in:
    • Electoral Studies (link here)
    • Communication Research (link here)
    • Political Science Review (link here)
    • The Comparative Study of Electoral Systems (link here)
    • The International Journal of Public Opinion Research (link here)
    And those are just the first ones from 2009 I happened to find in a quick-and-dirty search. I'm sure I could have kept going with cute little dots and journal titles and links that may or may not be helpful, but you get the point. Cognitive mobilization is one of the theories designed to explain partisan dealignment, the detachment of voters from political party affiliations. It's tied to changing values. As information costs decline, so goes the theory, people will be less likely to follow simple rules of thumb, like party identification, and will consider messages and politicians more carefully.

    So goes the theory.

    It's an odd increase -- or seemingly an increase -- given that people, at least in the U.S., seem to be growing ever more partisan, dragged kicking and screaming by a fragmented partisan cable news environment. But we can't forget that a whole bunch of other people have grown apolitical. This group doesn't fit the theory. They're not carefully considering messages and dealigning themselves; they're simply opting out of the political process. So I'm a bit baffled by this, at least from a U.S. perspective, but it's nice to see as well, since that old Mass Comm Review article of mine from 1991 or so looks prescient now.

    Tuesday, January 12, 2010

    Two Ways to Feel Neutral

    Now here's a clever idea: neutrality, as in feeling neutral about political candidates or parties or whatever. What's clever is the author argues there are two kinds of neutrality, and that the differences really matter. There is:
    • ambivalence, which is basically a balance between positive and negative attitudes about some object (a political actor or institution or issue)
    • indifference, a complete lack of affect but apparently not brain-dead
    So what?  There's a cool angle here, because one is truly different from the other, and the political ramifications are potentially significant.  This study in the latest issue of The Journal of Politics explores the two and why they matter in political participation.  One of the major findings is a duh moment, that ambivalent people are more likely to participate than indifferent ones.  At least they have some affect, even if by some magical algebraic formula it more or less comes out a tie.  While this is not surprising, the argument that neutrality is more than a single concept has theoretical merit and may explain other findings.

    The real question for me is not who are the indifferent -- they resemble the apolitical or chronic know-nothings -- but who are the ambivalent.  The author here uses the traditional like/dislike questions from the American National Election Studies to measure this, but I have some concerns.  Often respondents pony up likes and dislikes in odd ways and I'm not completely comfortable with calling a raw count ambivalence.  Unfortunately I don't have a better solution, and it's not my study, so let's go with it for now.  Maybe buried in the text I missed an explanation of who is indifferent versus ambivalent.  If someone finds it, let me know, but I'd love a breakdown by SES and political factors, including knowledge.
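    For the record, the standard way to score ambivalence from like/dislike tallies is the Thompson-Zanna-Griffin "similarity-intensity" formula. Whether this author uses it I can't say without the full text, but it shows why a raw count alone bugs me:

```python
# Thompson-Zanna-Griffin "similarity-intensity" ambivalence score:
# high when reactions are both numerous and balanced. Whether this
# particular study uses it is my assumption, not a fact from the paper.
def ambivalence(likes: int, dislikes: int) -> float:
    """(P + N) / 2 - |P - N|"""
    return (likes + dislikes) / 2 - abs(likes - dislikes)

print(ambivalence(3, 3))  # 3.0  -- lots of both: ambivalent
print(ambivalence(0, 0))  # 0.0  -- no affect at all: indifferent
print(ambivalence(4, 0))  # -2.0 -- one-sided: neither
```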

    Monday, January 11, 2010

    How People Know about Local Stuff

    When it comes to covering local, that bad old MSM (mainstream media) remains a force.  Most local news continues to originate with the traditional media, according to this new report that closely examines the "news ecosystem" of one market -- Baltimore.

    According to the report:
    And of the stories that did contain new information nearly all, 95%, came from traditional media—most of them newspapers. These stories then tended to set the narrative agenda for most other media outlets.
    Where the hell are the bloggers? 

    Friday, January 8, 2010

    Knowledge Gap Research

    The knowledge gap has been one of those long-standing mass comm research areas, not unlike agenda-setting, that refuses to die (and I wish agenda-setting would, but that's a different issue).

    The knowledge gap hypothesis "proposes that the media can increase gaps in knowledge": as information increases, those of higher SES tend to acquire it faster than those of lower SES. A meta-analysis published in the latest issue of Journalism and Mass Communication Quarterly (volume 86, #3, Autumn, 2009, 515-532, for the citationally inclined) confirms the existence of a knowledge gap and reports a consistent moderate correlation between education and level of knowledge.

    A meta-analysis is basically a test of lots of tests.  In this study, it's 71 effect sizes published in 46 studies.  I did a meta-analysis once for a book chapter.  I'll never do another.  It's statistical hell.
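    The hell is in the collecting and coding; the arithmetic at the core is just a precision-weighted average of effect sizes. A toy version with invented numbers, not the study's 71 effects:

```python
# Toy fixed-effect meta-analysis of correlations on the Fisher z scale.
# The rs and ns below are invented for illustration.
import numpy as np

r = np.array([0.25, 0.31, 0.18, 0.40])  # hypothetical education-knowledge rs
n = np.array([300, 150, 800, 220])      # hypothetical sample sizes

z = np.arctanh(r)                 # Fisher r-to-z transform
w = n - 3                         # inverse-variance weights (var z = 1/(n-3))
z_bar = (w * z).sum() / w.sum()   # precision-weighted mean
print(f"mean r = {np.tanh(z_bar):.2f}")  # back-transform to a correlation
```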

    So knowledge levels differ across "social strata," as the study reports. But the study failed to find changes over time, even with increases in publicity, which is rather interesting and "does not offer strong support for the knowledge gap hypothesis," which rests solidly on the idea of gaps changing as time progresses and information becomes more available.

    Now for a cool methodological moment.

    A smaller gap was found when studies used "belief-type" measures, as "compared to awareness-type and factual-type measures." Fact-based questions are harder, thus increasing the knowledge gap between those of lower and higher SES. This makes sense. "Belief-type" measures are apparently when respondents were asked to list arguments and these were simply counted, regardless of the quality of those responses. "Awareness-type" measures seem to be the ability to answer questions, even evaluations of candidates. If an answer is provided, then you are "aware" of that person regardless of the quality of that response. It's an interesting approach, and one made necessary by the wide range of studies examined in the meta-analysis. So when people can just say stuff, SES matters less. When they have to be accurate, SES matters more.

    Thursday, January 7, 2010

    Was It That Bad?

    The 2000s were the worst decade in 50 years?  Apparently so, according to a Pew report, which states that "By roughly two-to-one, more say they have a generally negative (50%) rather than a generally positive (27%) impression of the past 10 years."

    Now that I look back, I'd have to agree.  Just look at the table below, from the report.  You've got 9/11, you've got Katrina, and you've got The Great Recession.  Good riddance, indeed.


    Wednesday, January 6, 2010

    If We Just Ask More Questions . . .

    A study in the latest issue of Public Opinion Quarterly examines the nature of political ideology, throwing at the reader a mix of cool advanced statistical methods to come to a simple conclusion -- for a lot of folks the standard single-item ideology question on a traditional 7-point scale, from liberal to conservative, just doesn't work.

    Yes, the political elite have become more partisan and ideological.  And yes, they've dragged a lot of the chattering class and even everyday people farther to the left or the right.  What this study finds is a whole lot of people still answer liberal on some policy questions and conservative on other policy questions, so for them the single-item ideological placement question fails to capture who they really are, what they really think.

    Or, as the authors note in their conclusions:
    Our results show that failing to account for the multidimensional nature of ideological preferences can produce inaccurate predictions of voting behavior for the plurality of Americans who do not call themselves liberal or conservative.
    Their recommendation?  Ask more questions.  Don't rely only on that good old fashioned liberal-to-conservative question to classify people, because for those in the middle, it fails to capture what people really think and believe.  Add policy questions since people can sometimes be fiscal conservatives and social liberals, or some other mix.
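    Here's a quick sketch of the underlying point: score people separately on fiscal and social items and crosstab them. The off-diagonal cells are exactly the folks a single 7-point item misses. File and column names are invented:

```python
# Two-dimensional ideology sketch (invented file and column names).
import pandas as pd

df = pd.read_csv("anes_subset.csv")  # hypothetical extract

df["fiscal"] = pd.cut(df["fiscal_scale"], 3, labels=["lib", "mod", "con"])
df["social"] = pd.cut(df["social_scale"], 3, labels=["lib", "mod", "con"])

# Off-diagonal cells (fiscal-con/social-lib, etc.) are the cross-pressured
# respondents a single liberal-conservative item would misclassify.
print(pd.crosstab(df["fiscal"], df["social"]))
```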

    What's the media angle here?  I think the fragmentation of the news media (see Fox vs MSNBC) is dragging a lot of people, kicking and screaming, into the same partisan divide elites began sharing many years ago.  Listen to Sean Hannity, for example, and you'll hear him push for a whole slate of consistent ideological beliefs across the spectrum.  I'd love to know if "talk radio conservatives" are more consistent in their ideological beliefs than "non-talk radio conservatives."  Ooooh, future study. 

    Study specs: Clarke, H.D. & McCutcheon, A.L. (2009).  The nature of political ideology in the contemporary electorate.  Public Opinion Quarterly, 73, 679-703.

    Tuesday, January 5, 2010

    Why Question Order Matters

    In a case of studies I wish I had done, the latest Journalism and Mass Communication Quarterly includes a research paper that finds that question order can influence how interested survey respondents say they are -- the good old political interest variable.

    What decreases political interest?  Asking them political knowledge questions beforehand.

    That seems kinda obvious, now that I see it on paper in a major journal, but I'd never given it much thought until now. The study, by Dominic Lasorsa, is nice and tight: it examines not only how asking political knowledge questions might affect political interest, but also neatly examines the placement of questions that might excuse one for not being very knowledgeable. It all boils down to priming and context effects.

    The placement of political knowledge items in a survey before a political interest question had its greatest impact among women, the less educated, those of lower income, and younger respondents.  These are folks who typically, for various reasons, do less well on political knowledge tests.  So being hit by questions that make you feel less knowledgeable leads to you saying you're less interested in politics.  There's some priming going on, but also probably some self-esteem.  Blowing some knowledge questions makes you a little more likely, I'd think, to then say politics doesn't really matter to you.  Ace 'em, and sure, politics matters.
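    Mechanically, this is a split-ballot experiment: randomize who gets the knowledge items first, then compare mean interest across conditions. A sketch with invented file and column names, not Lasorsa's actual analysis:

```python
# Split-ballot order experiment: do respondents who got knowledge items
# first report lower political interest? (Invented file/column names.)
import pandas as pd
from scipy import stats

df = pd.read_csv("order_experiment.csv")  # hypothetical data
first = df.loc[df["knowledge_first"] == 1, "interest"]
later = df.loc[df["knowledge_first"] == 0, "interest"]

t, p = stats.ttest_ind(first, later)
print(f"means: {first.mean():.2f} vs {later.mean():.2f} (p = {p:.3f})")
```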

    This latest issue of JMCQ (Autumn 2009, Volume 86, Number 3) includes "Political Knowledge" on the cover and six studies inside that touch in some way on political knowledge. I'll try to discuss them as the week progresses, assuming I actually get prepared for my Spring Semester classes.