Wednesday, March 28, 2012

Political Knowledge as a Component

Often political knowledge is treated as a dependent variable, something you're trying to explain or predict (as in, does exposure to the news lead to greater knowledge?).  Sometimes political knowledge is treated as an independent variable (as in, does greater knowledge lead to voting?).

And sometimes, political knowledge is treated as a component for some other measure. 

This is a story of that third use: a paper I stumbled across that examines religiosity, political ideology, and political engagement.  Formally, the title is The Association of Religiosity and Political Conservatism: The Role of Political Engagement.  You can find a PDF of it here.  On page 11 of the paper, the authors tell us:
We conceptualize political engagement as overall involvement with political information, as manifested by a) high (vs. low) subjective importance of politics and b) high (vs. low) objective political knowledge ... These indicators tend to be correlated, but they are conceptually distinguishable.

In other words, political knowledge is like political interest, only different, but together they make for something called engagement. I could quibble with this (engagement, after all, should also include behavior such as news consumption and perhaps participation), but my point today is that knowledge can be used in creative ways beyond serving merely as an independent or dependent variable.
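For illustration, the kind of two-indicator composite the authors describe could be scored something like this. A minimal sketch, assuming a 1-to-5 subjective-importance rating and a short knowledge quiz; the names, scales, and equal weighting here are mine, not the paper's:

```python
# Hypothetical sketch of a two-part "engagement" index combining
# subjective importance of politics with objective political knowledge.
# Scales and weighting are illustrative, not the authors' actual method.

def engagement_score(importance, knowledge_correct, knowledge_total):
    """Average two indicators, each first normalized to the 0-1 range."""
    importance_norm = importance / 5.0                    # 1-5 rating -> 0-1
    knowledge_norm = knowledge_correct / knowledge_total  # share of quiz items correct
    return (importance_norm + knowledge_norm) / 2

# A respondent who rates politics 4 out of 5 in importance and answers
# 6 of 8 knowledge items correctly:
print(engagement_score(4, 6, 8))  # → 0.775
```

Because the two indicators are correlated but distinct, a respondent can score high on one and low on the other, which is exactly why the authors keep both.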




Monday, March 26, 2012

What People Know ... About Animals

I'm fairly good at stumbling across odd surveys or tests of what people know about politics, about health, about social security, about booze, pretty much about anything at all -- and now I can add to the list what people know about animals.  This one is more about debunking urban myths about animals.  It's less a test than a list.  Check it out.

Wednesday, March 14, 2012

Pew = Quiz Central

I love the Pew Center.  Great reports, terrific data for secondary analysis, and friendly folks if you have a question.  And I like their occasional news quiz.  But today I happened to go to their main page and glance at the far right column to find -- Quiz Central.

Maybe they've gone a little overboard?  Let's look.

There's:
  • Family Trends Test:  Compare your views with others on how families are structured.
  • Religion IQ:  Fun one in which we discover atheists know more than others about, of all things, religion.
  • News IQ:  The granddaddy of 'em all, your basic current events quiz.
  • How Millennial Are You?  This one is a 0-to-100 scale in which, I'm sure, it's better to have a low score because I really don't wanna be millennial. I've heard their music.
  • Media Tech Quiz:  Haven't done this one yet, but basically it's a news IQ quiz about technology stuff.
  • Science IQ:  Perhaps the scariest given the wingnuts out there who think evolution (like gravity) is only a theory.
  • Community Comparison: Not done this one either, but then again why would I want to compare Athens, Ga., to anywhere else?  It'd just make them embarrassed.
  • Internet Typology:  I love typologies, even if they're not validated or tested with rigorous methods.  In this one, find out your category.  Not done it yet.
  • Couples Quiz:  Another scary one.  Who calls the shots in your home?  I have the basic husband answer for this one:  "Yes dear."
Why are these here?  Because quizzes and typologies, like lists, are enormously popular.  And Pew has all this data lying around so it's good to make use of it.  

Thursday, March 8, 2012

Your News IQ. Sorta.

Lots of folks put news quizzes out there, most famously the Pew Center.  Here's one by Rasmussen College on "navigating political terminology," designed to test your true political geekiness.  Miss one or more and, it recommends at the bottom, "you may want to brush up on your politics."  Sounds tough?  Not if you look at the questions.  Almost all include a joke as one of the two possible responses, so really this is more about entertainment than seeing how much you know or don't know about campaign politics.  Then again, what's wrong with a little fun?

Wednesday, March 7, 2012

Skepticism is Not Cynicism

There's a very interesting study in the latest Journalism and Mass Communication Quarterly (and how often do you get to write those words?) that examines the differences between cynicism and skepticism, and how each plays a role in the public's apathy and in feelings that the government will respond to their needs (efficacy).

Simply put, skepticism good | cynicism bad.

So being skeptical leads to feeling government is more responsive, cynicism to feeling it's less responsive.  Being skeptical also leads to less apathy.  In other words, the cynics have tuned out; the skeptics are, well, skeptical, but in a good way -- that is, they're looking for more information and feel change can come.  Skeptics help fuel a democracy; cynics are a drag on it.

Okay, but what I want to get into here is the methodology.  Let's look at the two key variables.

Cynicism was measured by asking whether respondents agreed or disagreed with statements such as (1) candidates are interested only in votes, not people's opinions, and (2) government is run by a few big interests.  That sort of thing, all fairly generic.  No questions about media.

Skepticism was measured by agreement with statements such as (1) I think about news stories before I accept them and (2) I seek out additional information to confirm statements by politicians.  A mix of statements about media and politicians.
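A quick side note on scoring: agreement items like these are typically averaged into one scale score per respondent. A minimal sketch, assuming 1-to-5 agreement ratings; the item wordings are paraphrased and the data invented, and the study's exact item count and scoring may differ:

```python
# Average a respondent's Likert-type agreement ratings into a scale score.
# (1 = strongly disagree, 5 = strongly agree; data below are invented.)

def scale_score(responses):
    """Mean of a respondent's agreement ratings across a scale's items."""
    return sum(responses) / len(responses)

# One hypothetical respondent:
cynicism_items = [4, 5]    # "candidates only want votes", "run by big interests"
skepticism_items = [5, 4]  # "I think about news stories", "I seek confirmation"

print(scale_score(cynicism_items))    # → 4.5
print(scale_score(skepticism_items))  # → 4.5
```

Note that identical scores can come from very different item content, which is the whole point of the critique below: what the items target matters as much as how they're averaged.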

See the difference above?  One set of questions, measuring cynicism, doesn't mention the media at all.  The other set, measuring skepticism, has at least three of the six items touching directly on the media and one doing so indirectly.  Is this a problem?  Cynicism in this study is conceptualized as a lack of confidence in, or distrust of, the political system.  That's pretty global.  Here's a key quote from the study (p. 26, italics mine): "An alternative disposition to cynicism is skepticism, which also assesses individuals' critical evaluation of public affairs information sources, including elected officials and news media, but which serves a beneficial purpose rather than a debilitating one."

The quote suggests cynicism is also about information sources, but if you read the questions above used to measure cynicism, you see it ain't so, at least not as compared to the measures used to tap skepticism.  Why am I making a big deal about this?  In part because apparently the JMCQ reviewers didn't.  You can't treat the two concepts as equivalent when the measures differ so much in their targets.  One is global, about the system; the other is part global and part about the media.

Another useful line to examine in that quote is that one concept (skepticism) is thought to serve "a beneficial purpose" while the other (cynicism) serves "a debilitating one."  We generally see cynicism as bad and skepticism as good, with the latter suggesting further cognitive effort to resolve a problem.  The questions do get at this, but let's be clear: only one set involves the media.  Not both.  Despite this, we get this line from the discussion section:  "Cynicism and skepticism both represent a negative posture toward public affairs and relevant media."  I believe you can say this about one but not the other, at least not based on the items used to measure them.

Yeah, I'm nitpicking.  It's a good study, and it includes an interesting path analysis that incorporates satisfaction with the media.

Friday, March 2, 2012

Where People Get Retirement Info

I wrote yesterday (at some length) about an AARP survey on what people know about Social Security benefits.  No need for me to repeat that info; just click on the link above if you're interested.  Some of the results are neat and include not only knowledge about retirement benefits but also perceived knowledge.

Instead, I want to briefly point to a media question in the survey that asks how people learn about social security benefits.  The results probably won't surprise you, but let's cover them nonetheless.

Sources of Information about Social Security Retirement Benefits
  1. The Social Security Administration.  No surprise there, with 53 percent naming it as a source. I've done this and I'm many years away from retirement.
  2. Friends or family.  Makes sense, especially if you have older friends or family who have gone through the process.  I would not have been surprised to see this at #1.
  3. Newspaper articles.  Yes, this makes me happy, at least for a moment, until I realize that it's older Americans reading papers that bumps this category to #3 -- and that's going to change as the years go on.  Sigh.
  4. AARP.  Makes sense.  They're the retirement folks.
  5. Current or former employer.  Again, makes sense.
  6. Financial magazine or book.  I'm surprised this isn't higher, but then again, maybe not.
  7. Financial shows on television.  Yeah, can buy this one too.
  8. Professional financial advisor.  On this one I have mixed feelings.  I kinda expected it to be even lower.  How many people have a professional financial advisor?  Hell, I don't.
  9. Financial services firm.  This, I expected to be higher than the one above. 
  10. Current or former labor union.  This one will drop, as we all know, as unions lose membership over time.
  11. Public library.  Happy to see this, would like to see it higher.
  12. Class or seminar at a local college.  This one surprised me.  Now that I think about it, yeah, makes sense.  Just never thought about it before.
What's missing from above?  The Internet.  It's possible the Net falls into several of the categories.  For example, as a prof I belong to TIAA-CREF, and they email me stuff or point me to their website for retirement info, so that counts as a financial firm but also as the medium (the Internet).  Same is true for the SSA or even AARP.  So my guess is that use of the Net is being folded into the responses above.
 

Thursday, March 1, 2012

What People Know ... About Social Security

As folks near retirement age, they know less about social security benefits than they should, according to a survey by the people who know a little something about retirement (AARP).

The survey says that while people know the basics, they "remain unaware of different claiming strategies that could have a significant impact on their income throughout retirement."

In other words, people nearing retirement don't know as much about Social Security as they should.  Only 29 percent, for example, knew that waiting until age 70 would get them the highest monthly retirement benefit.  Even worse, 1 out of 5 thought they could get the maximum benefit amount before retirement.

So yeah, yet another survey that shows people don't know as much as they should about insert favorite topic here.  I'm shocked, shocked to find this out.

A few words about methodology.  Unlike a lot of these "surveys" done by special interest groups, this one seems legit.  According to this page, the survey was of 2,000 adults ages 52-70.  It's not immediately obvious how they found and surveyed these lucky 2,000 respondents.  But don't worry, I'm nothing if not annoying about methodological details, so there's this fat PDF with more info.

There's some cool stuff buried in this file.  For example, the survey asked respondents not only questions about knowledge but also about perceived knowledge -- that is, what they think they know about the topic, to compare with what they actually know.  Nearly half considered themselves either "very" or "somewhat" knowledgeable.  As you may know, perceived knowledge and actual knowledge tend to be only imperfectly correlated.  Simply put, a lot of people are good at judging their own knowledge, but a lot of them aren't.  It's that latter group that has always interested me.
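That imperfect relationship is just a correlation well short of 1.0 between self-rated and quiz-based knowledge. A toy illustration with invented numbers (AARP's actual data are not reproduced here):

```python
# Correlate perceived knowledge (self-rating) with actual knowledge
# (quiz items answered correctly). All data below are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

perceived = [5, 4, 4, 3, 3, 2, 2, 1]  # self-rated knowledge, 1-5
actual    = [4, 4, 3, 3, 2, 3, 1, 2]  # quiz items correct, 0-5

print(round(pearson(perceived, actual), 2))  # → 0.74: related, but far from perfect
```

The respondents who drive that correlation below 1.0 -- high self-rating, low quiz score -- are exactly the "think they know but don't" group mentioned above.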

There is some neatly done methodology in how they create a "knowledge score" here, so I applaud that.  Indeed, the report is full of interesting graphs if you happen to study this sort of thing.  I don't, but the report is a good, solid example of reporting findings in a non-academic format.  They even include oversamples of blacks and Hispanics to examine those racial/ethnic groups more closely.

Finally ... there is a media question or two here, and I'll report on those results later.