A couple of days ago I blogged about a new POQ study finding that survey respondents overestimate how much news media they consume (in this instance, TV news). Young people in particular significantly overestimated their use of TV news.
So what?
From a data analysis standpoint this causes all kinds of problems. Imagine you're studying how media exposure is related to political knowledge. Now imagine you're particularly interested in how young people use the news media to keep up with politics and public affairs. For young people, though, suppose you find no significant relationship between their media consumption and what they know.
If young people do overestimate their media use, the usual relationship between news exposure and political knowledge may, for them, appear to vanish when it actually exists. Their self-reports say they use the news media more than they really do, so high reported exposure gets paired with the knowledge levels of people who in truth watch very little, and that drags the measured association toward zero. Push the overreporting far enough and the usual positive association between news media exposure and political knowledge could, for young people, even turn negative. And that's bad news.
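To make that concrete, here's a minimal simulation sketch. Nothing in it comes from the POQ study: the sample size, the exposure distribution, the knowledge slope, and the "light viewers round up" rule are all numbers I invented for illustration. It just shows how overreporting concentrated among light viewers drags down a perfectly real exposure-knowledge correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True weekly hours of TV news (unobserved): right-skewed, most watch little.
true_hours = rng.gamma(shape=2.0, scale=1.5, size=n)

# Political knowledge genuinely rises with true exposure.
knowledge = 0.5 * true_hours + rng.normal(0.0, 1.0, size=n)

# Self-reports: suppose most light viewers round up to a "respectable"
# figure (social desirability), while heavier viewers answer truthfully.
light = true_hours < 2.0
inflates = light & (rng.random(n) < 0.7)
reported_hours = true_hours.copy()
reported_hours[inflates] = rng.uniform(4.0, 8.0, size=inflates.sum())

r_true = np.corrcoef(true_hours, knowledge)[0, 1]
r_reported = np.corrcoef(reported_hours, knowledge)[0, 1]
print(f"true exposure vs knowledge:     r = {r_true:.2f}")
print(f"reported exposure vs knowledge: r = {r_reported:.2f}")
# The measured association shrinks sharply; with heavier inflation
# concentrated among the least knowledgeable, it can shrink toward
# zero or even flip sign.
```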
Can we correct for this statistically? I don't think so (see the sketch below). Can we preface our news exposure questions with wording that makes it okay for people to admit they consume less news than would make them sound good? Yup. Can we go to some measure other than mere exposure? Attention helps some, but attention items ride on top of exposure items, so they may inherit the same problems.
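On the "correct it statistically" question, extending the sketch above shows why I'm pessimistic. The textbook fix, Spearman's disattenuation formula, assumes the measurement error is random noise; overreporting is systematic, so test-retest reliability can look perfectly healthy while the "corrected" correlation stays wrong. (This snippet continues the one above and reuses its variables; the second wave is again invented.)

```python
# A second "wave" where the same light viewers inflate again on a retest
# (fresh draws from the same "respectable" range).
report_wave2 = true_hours.copy()
report_wave2[inflates] = rng.uniform(4.0, 8.0, size=inflates.sum())

# Test-retest reliability looks healthy because the inflation is stable
# across waves, not because the reports are valid.
rel = np.corrcoef(reported_hours, report_wave2)[0, 1]

# Spearman disattenuation, treating knowledge as error-free:
# the "corrected" correlation barely moves and stays far below the truth.
r_corrected = r_reported / np.sqrt(rel)
print(f"test-retest reliability: {rel:.2f}")
print(f"'corrected' r:           {r_corrected:.2f}  vs true r = {r_true:.2f}")
```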
Is time spent with the media even meaningful? In our ADD culture of skimming, grazing, and picking up bits and pieces of news from here and there, I'm not sure time spent with the media works as a measure anymore. ANES tried some experimental media exposure items in its 2008 pre- and post-election surveys; I'm saving those for another post, but my overall sense is they're no better than the original items. I'll do an actual analysis of the two and report back.