I mostly write here about research or how surveys were done, that sort of thing, but I have to give credit to the Obama folks. Remember Benghazi? Fox News and conservatives kept talking and talking about it, but the Obama folks moved on. "But," sputtered Republicans. "Wait," they tried again.
Remember the Obamacare rollout debacle? Sure you do. It was only a week or two ago.
And boom ... we have a historic deal with Iran.
"But," sputter Republicans. "Wait," they try again.
Friggin brilliant.
I write this not as a partisan, not as criticism of the Obama Administration or to poke fun at Republicans -- both are too easy. You can argue this is the natural flow of events, that there's always something new to talk about, some new event to discuss, some new disaster to confront. I'd say it's also a nice example of agenda-setting and framing the conversation around what's new. The public has a fairly short attention span. Fox News hasn't quite figured this out yet and is still going on about the health care thing, but people will move on quickly enough, except for that partisan slice that never moves on.
What's new, that tends to be news. That works to an administration's advantage if what's new turns out not to be bad, or if you can successfully spin it your way.
Plus the holidays approach. News gets lost anyway.
The research on agenda setting (which I hate, by the way) and the research on partisan media suggest Fox will have very little success in keeping the health care debacle in the public mind, except of course for its core partisan audience. If the Obama folks are smart, they'll talk about Iran and only Iran, until next week when they'll talk about something else. Congress will hold hearings about health care or who knows what else, but by then the national conversation will have moved elsewhere. Again, assuming no new health care debacle.
In other words, time is on the administration's side. As is the research on how public opinion ebbs and flows on these kinds of topics.
Monday, November 25, 2013
Friday, November 22, 2013
UGA Freshman Names
I rarely get students interested in data stories. Here's one I've always liked: top freshman first names at UGA. I just got these for the new freshman class. A few high points:
- Either Sarah or Emily has been the #1 name since 2008 (my earliest data). Emily is #1 in 2013, Sarah #2.
- To give context, in 2013 there are 76 freshmen named Emily, 75 named Sarah. So about 3 percent of all freshman students have one of those two names.
- Male names are less consistent. William was the most popular, then Michael, now it's John (#3 in 2013).
- No surprise, since we have more female than male students: of the Top 5 names every year, three are female.
- In 2013, Anna and Hannah make a first-ever appearance in the Top 10 (#4 and #5, respectively).
- My daughter Erin is in the 2013 freshman class. She shares that name with 19 other kids.
- There are 524 students in the 2013 freshman class who are the only ones with their first name.
There are a lot of ways to play with these data. One is to compare them to Social Security birthname data to see if UGA is unique or follows the trend in popularity of first names. Another is to, of course, analyze whether we have a greater number of unique names, proportionally, in each successive freshman class. That would suggest a diversity, of sorts. You'd of course then go out and find people with these names. I'd interview an Emily, for example, and ask her how often she runs into other students with the same name. How often does a prof say "Emily?" and three kids raise their hands? You get the idea. Journalism 101.
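If you wanted to poke at this yourself, here's a minimal sketch of that comparison: match the UGA counts against Social Security baby-name counts for the class's rough birth year, and compute the share of one-of-a-kind names. The file names and column names are hypothetical placeholders, not an actual dataset.

```python
import pandas as pd

# Hypothetical files: UGA freshman first-name counts for 2013, and SSA
# baby-name counts for roughly the class's birth year (about 1995).
uga = pd.read_csv("uga_freshman_names_2013.csv")  # columns: name, count
ssa = pd.read_csv("ssa_names_1995.csv")           # columns: name, count

# Rank names in each source and line them up for comparison.
uga["uga_rank"] = uga["count"].rank(ascending=False, method="min")
ssa["ssa_rank"] = ssa["count"].rank(ascending=False, method="min")
compare = uga.merge(ssa[["name", "ssa_rank"]], on="name", how="left")
print(compare.sort_values("uga_rank").head(10)[["name", "uga_rank", "ssa_rank"]])

# Proportion of the class who are the only ones with their first name.
unique_share = (uga["count"] == 1).sum() / uga["count"].sum()
print(f"Share of class with a one-of-a-kind first name: {unique_share:.3f}")
```

Run that for each freshman class and you can see whether that 524-students-with-unique-names figure is growing over time.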
Thursday, November 21, 2013
Learning from Political Humor
There's been loads of work on whether or not people learn from the humor of The Daily Show with Jon Stewart or The Colbert Report. I've published stuff on this. But how about humor from the candidates themselves?
This study suggests humor can distract people if it's unrelated to the theme of the discussion. It can increase learning if thematically related. The study ties learning (elaboration) to differences in people's Need for Humor.
What's Need for Humor? It's an individual difference, measured not unlike similar constructs like the time-tested Need for Cognition (how much ya like to think about stuff). The humor variable (NFH) is defined thusly:
"Whereas individuals with high levels of NFH may be motivated to process the humorous information, relevant or not, those with low NFH are not likely to bother elaborating on humorous information."
I love a good personality variable. Basically they find this:
- High NFH folks, thematically related humor helps them learn.
- Low NFH folks, thematically related humor hurts learning.
Monday, November 18, 2013
By Radio News I Mean ...
In research we often ask what I call global media exposure questions. You know, how often (or how many days) do you watch television news or read a newspaper. Generic questions, used for decades, and questions growing less and less useful. A lot of surveys now use "paper newspapers" and ask about Internet news sites. That's a nice improvement, but more and more surveys also ask about specific network use (Fox, MSNBC, etc.) or even down to the program level (The O'Reilly Factor, etc.).
Here, I'm talking about radio news.
When we ask someone how often they get their news from radio, consider what might be going through their minds. NPR? Those bits of news between music? Talk radio? The generic question is too, well, generic. It fails to capture the highly segmented nature of radio news.
Allow me to prove my point with some analyses I'm in the middle of and should be finishing instead of writing this. Let's look at belief in conspiracy theories. I'm looking at four -- two from the left and two from the right.
- Theories from the Right: The "birther" belief that Obama was born outside the U.S. and that the health care law contained "death panels."
- Theories from the Left: That the government knew in advance of 9/11 and that the feds directed Hurricane Katrina flood waters into poor New Orleans neighborhoods on purpose.
Radio Exposure: Generic exposure is mostly not related to any of the theories except the one about the death panels.
NPR: Listening to NPR makes you less likely to believe in any of the theories, be they from the left or the right.
Rush Limbaugh: Listening to Limbaugh makes you more likely to believe the anti-Obama conspiracy theories and less likely to believe the leftish theories.
So when we ask about generic radio, we're missing a lot. Look at the NPR listeners: they're less likely to believe in any of this stuff. Of course the audience for NPR is better educated, but I controlled for the impact of education (and age, and income, and other stuff). Limbaugh's effect, even beyond these controls and even controlling for ideology and partisanship, still makes listeners more likely to believe the anti-Obama stuff and less likely to believe the anti-Bush stuff. Radio itself? Not much going on, which suggests that by relying on a generic measure you can lose a lot of explanatory power.
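For the methods-minded, here's roughly what that kind of model looks like. This is a sketch, not my actual code, and the variable names (believes_death_panels, npr, limbaugh, and so on) are hypothetical stand-ins for the survey items.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical survey extract

# Logistic regression: belief in one theory (0/1) on three separate radio
# measures, plus the usual demographic and political controls.
model = smf.logit(
    "believes_death_panels ~ radio_news + npr + limbaugh"
    " + age + education + income + ideology + party_id",
    data=df,
).fit()
print(model.summary())
```

If the argument above holds, the generic radio_news term mostly washes out while the NPR and Limbaugh terms pull in opposite directions.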
What's different about these audiences, even after statistical controls? The Limbaugh audience is more anxious about their financial situation, the NPR audience less so. Anxiety and uncertainty can play a direct role in believing in conspiracy theories of any kind, not necessarily partisan ones. That may be the clue here, one that deserves further study.
Saturday, November 16, 2013
Researching the Obvious
"In mass comm," I tell students, "we dare research the obvious."
I'm halfway joking, of course. Just halfway. But here's a wonderful real-world example of researching the obvious, well pointed out by The Washington Post:
The results of a seven-month survey that cost Fairfax County schools $180,000 released this week did not surprise anyone familiar with teenage eating habits: Students hate the food served at school cafeterias.
Well, duh.
The survey isn't without its interesting findings. There's a wide gap in perception, for example:
Only 22 percent of students said the cafeteria food was nutritious compared to 94 percent of the schools' food and nutrition services workers.
Of course this is misleading because (1) kids have no objective way of evaluating the nutrition level of the food, (2) a lot of them can't even spell nutrition, and (3) their attitudes about the taste are affecting their judgments of nutrition. In other words, misleading is putting it kindly, because it really isn't measuring what you think it's measuring -- unless you're measuring misperception.
Thursday, November 14, 2013
What Makes a Conspiracy Believer?
Who loves a conspiracy theory?
Obviously, those on the left love a good theory that makes those on the right look bad, and vice versa. But who loves a conspiracy theory regardless of which side gets hammered?
I'm in the initial stages of building a paper on this topic, looking also at the role of media use, but here's the one consistency I've found when it comes to believing in conspiracy theories -- economic uncertainty.
Folks who feel uncertain now or in the future about their economic situation are more likely to believe in conspiracy theories from the right-wing crazies (Obama born outside U.S., death panels) and conspiracy theories from the left-wing crazies (government knew in advance of 911, aimed Katrina waters at poor neighborhoods). This stands up even in regression models, which means if you statistically control for the influence of age and education and ideology and party identification and a bunch of other stuff, those who are uncertain about their economic condition are more likely to believe conspiracy theories of the left and the right. This is important, as most people tend to believe one side or the other -- but not both.
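A rough sketch of that kind of test, with made-up column names: average the four belief items into a single index, then regress it on economic uncertainty alongside the controls. This is an illustration of the approach, not the actual analysis.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical survey extract

# Additive index across the two left-leaning and two right-leaning theories.
items = ["birther", "death_panels", "knew_911", "katrina_aimed"]
df["conspiracy_index"] = df[items].mean(axis=1)

model = smf.ols(
    "conspiracy_index ~ econ_uncertainty + age + education"
    " + income + ideology + party_id",
    data=df,
).fit()
print(model.params["econ_uncertainty"], model.pvalues["econ_uncertainty"])
```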
This actually fits some theory I'm reading while putting together this paper, that uncertainty can result in more belief in conspiracy theories. Another aspect of that theory is that uncertainty pushes people to attend more heavily to the moral aspects of a person or institution and, thus, makes them even more likely to believe such hokum. I've not built the moral part into my analysis yet.
There's some damn good potential in this paper both as an academic journal article and in that place known as the real world.
Update
Even better, and logically, a measure of neuroticism (anxiety) acts as an even better predictor of belief in all of the conspiracy theories, regardless of their partisan bent. Interesting.
Monday, November 11, 2013
Conspiracy Theories ... Still Out There
Politico ran a story today about conspiracy theories that's worth a read if you're into them (as I am). It also got me thinking, so I dipped into my own data for some quick analyses. Who believes in a particular conspiracy theory? Basically, those who want to believe in a particular conspiracy theory.
So let's look at four popular theories below, two that you might call getting oxygen from the right, two from the left. The wacky right has Obama born outside the U.S. (birther thing), and the so-called "death panels" of Obamacare. The wacky left has the notion the government knew in advance about 911, and the idea Hurricane Katrina's flood waters were purposely aimed at poor sections of New Orleans.
This is what I did (nerd warning). I constructed quick-and-dirty regression models, one for each of the four conspiracies above. That means I tossed a kitchen sink full of things into a model to let them fight it out and see which ones remain statistically significant and which are posers that drop out. Let's look at the results:
- Obama born outside U.S. -- Believers tend to be older, less educated, of lower income, female, read newspapers less, but are more politically conservative and Republican. Also, even after all these statistical controls, watching Fox News made you more likely to believe this.
- Death Panels -- Believers tended to be younger, less educated, of lower income, Republican and conservative. Also less likely to read a newspaper but, yes, Fox News watchers. Even after all these controls.
- Government Knew of 9/11 -- Believers were younger, less educated, of lower income (seeing a trend here?), non-white, less likely to read newspapers, less likely to watch Fox News, more liberal and more likely Democrat.
- Government Directed Katrina at Poor -- Believers were younger, less educated, lower income (sigh, yes), more likely to be women, black, liberal, used Internet news less but read newspapers more, and were less likely to watch Fox News.
Fox News deserves special mention. Even after a host of statistical controls, some of which were not significant in the models after all this other stuff is considered, watching Fox News is consistently related to believing the anti-Obama stuff and not believing the anti-Bush stuff. Other media factors come and go, but Fox News is a special case.
Labels:
barack obama,
conspiracy theories,
fox news,
george w. bush,
politico
A Grad Class -- Maybe
I have a slot for Summer 2014 to teach a special topics graduate-level class at UGA. It can be about anything. In the past I've done summer seminars on the effects of social media, the intersection of humor and politics, and religion and media. This time I'm mulling over something less fun, a course in secondary analysis. As Wikipedia notes, secondary analysis is the analysis of "data collected by someone other than the user." The entry continues:
Common sources of secondary data for social science include censuses, organisational records and data collected through qualitative methodologies or Qualitative research. Primary data, by contrast, are collected by the investigator conducting the research.
There are a lot of reasons to like secondary analysis. Someone else has done most of the work, there are no pesky IRBs to deal with, and you get these great national or international samples and therefore terrific generalizability. So what's not to like? You don't always have access to the concepts you'd like, measured in the ways you'd prefer, and there can be a learning curve when it comes to identifying, downloading, cleaning, and analyzing the data. A steep curve.
Hence ... my possible class, if it can attract students.
There is so much data available. Interested in health? Kids and health? Try getting IRB approval for that. But there's lots of data out there, some collected by the feds, some by others, that even dips into such sensitive topics as drug use. Interested in lifestyle stuff? Internet use? And politics, interested in politics? Oh man, we got your data right here. Well, not here, but at places like GSS, ANES, Pew, Census, and a bunch of federal data best found via ICPSR.
I'm still plotting this class, but basically it'd look something like this:
- What is secondary analysis? How it differs from other methods.
- Read studies that use secondary analysis. Break 'em down.
- Limitations and strengths of the method.
- All the different places data exist.
- Identify concepts students are interested in.
- How to find your data.
- Downloading and cleaning.
- Your friend, SPSS, for analysis.
- Recoding variables, getting it right (see the sketch after this list).
- Running analyses.
- Writing up your results as a publishable paper.
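To give a feel for the downloading-and-recoding step, here's a small sketch. It assumes a hypothetical CSV extract pulled from one of the archives above; the variable names and codes are invented for illustration, so check the real codebook.

```python
import pandas as pd

df = pd.read_csv("gss_extract.csv")  # hypothetical extract from a data archive

# Treat "don't know" / "refused" codes on this item as missing before analysis.
df["partyid"] = df["partyid"].replace({8: pd.NA, 9: pd.NA})

# Collapse a 7-point party ID item into three categories.
def party3(x):
    if pd.isna(x):
        return pd.NA
    if x <= 2:
        return "Democrat"
    if x >= 5:
        return "Republican"
    return "Independent"

df["party3"] = df["partyid"].map(party3)
print(df["party3"].value_counts(dropna=False))
```

Getting these recodes right, and documenting them, is most of the battle before any actual analysis happens.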
Oh, and the dirty little secret we don't teach students in research methods -- sometimes you skim an interesting dataset and the study will jump out at you. This has happened to me a number of times, including a study that will soon be published in the Journal of Broadcasting & Electronic Media, or the one I'm messing with now where I found a bunch of questions asking what language Latinos prefer to get their news in. Having fun with that one and even found a good theoretical base having to do with ties to the ancestral country and political participation. Shhhh, don't mention this to our research methods instructors.
Will the class make? For the academically uninitiated, that means enough students sign up to justify holding the class. All you usually need is five butts in seats, and it's possible it might attract folks from other colleges (political science, maybe, or sociology, or public health).
And of course it all depends on my own single vocal cord getting repaired next month so I can croak out a lecture. Hell, everything rides on that. Luckily, I don't have to make a decision on the class until early Spring.
Labels:
ANES,
Census,
grady college,
GSS,
pew,
research methods,
secondary analysis,
university of georgia
Wednesday, November 6, 2013
Fox News: Career Maker
Talk radio got me tenure. I published lots of studies in the 1990s about its influence, made a small name for myself in that narrow field.
Fox News, it's ripe for the plucking.
I sent off a piece today that features exposure to Fox in a complicated model of trust in government and perceptions of the election outcome. Simply put, supporters of Romney in 2012 were of course more likely to predict he'd win the election. Supporters who also watched Fox News -- even more so. By a significant margin.
And here's a recent study in Public Opinion Quarterly that also finds Fox is a special case among media. From the abstract:
Fox News viewers are particularly likely to support voter ID laws, though no other forms of media use are significantly related to support.
So it's time I devoted all my attention to Fox studies. Seems ya can't go wrong.
Saturday, November 2, 2013
Poll Irony Alert
A new Virginia poll has the gubernatorial race there at almost a dead heat. I bring this up not because I care, but because there's an ironic twist. This poll, sponsored by the Emerson College Polling Society, appears to be a robo-poll. You can see the pdf of it here. Check the bottom for some methodological details.
Data was collected on October 25-30 using an automated data collections system. The Virginia sample consisted of 874 registered, likely voters with a margin of error of +/-3.24% at a 95% confident level.
First, it's data were, not was. Second, it's confidence level, not confident level. Now, the irony. This is a student organization, yet it used what appears to be a robo-poll, which by definition cannot call cell phones -- the very key aspect of college students' very existence.
Plus, robo-polls -- those annoying kind over the phone where it's not a real person -- also tend to skew older, Republican, conservative. Hence, the idea that in this poll the gap is smaller (favoring the GOP candidate in this case) raises concern, especially compared to a different poll that had a wider gap.
Friday, November 1, 2013
Sex@UGA
The Red & Black, our independent student newspaper, did a big piece about the hookup culture at UGA that includes a survey of students about their sexual habits.
As I often analyze media surveys here -- the good and the bad -- it seems only fair I turn my attention to the good folks at The R&B. First, I recommend you follow the link above and read the story yourself and then continue with my analysis. I'm not talking here about the story, but the underlying survey. I could go for days about the story itself.
Okay, done? Let's move on. First, allow me to copy-and-paste the opening graph. It's important.
Editor's note: For this study, 2,130 freshmen and sophomores at UGA were sent surveys via email. Of this pool of UGA students, 146 people responded to various questions regarding sexual orientation, alcohol, health and metrics regarding number of sexual partners and sexual encounters. Answers were received anonymously. For the purpose of this survey, "hooking up" was defined for survey takers as vaginal, anal and/or oral sex.
When analyzing a survey you often begin with the sample -- its size, its quality, how the survey was conducted. Then you turn your attention to question wording, question order, and a host of other factors I explain to my journalism students in JOUR 3410.
Note the info above, that 2,130 freshmen and sophomores were sent surveys via email. Why only this many? By my count, there are 5,197 freshmen and 5,892 sophomores in Fall 2013 (yes, I can look this up in about 10 seconds). Are these 2,130 a random sample of that larger pool? If not, you've already taken a wrong turn. It's possible this is how many students had not "restricted" access to their data, and hence their email addresses. If so, you've skewed the sample again. Who knows what differences may exist between students who choose to restrict access, but certainly they bias the sample in some way.
Now note in the info above that of 2,130 students surveyed, only 146 replied. That's not a terrible response rate, but typically we survey a lot more folks so we have enough completed interviews. An N=146 gives us a margin of error of 8 percent, give or take. Most surveys shoot for 3 percent, at worst 4 percent. That's why you often see surveys with the magic number of around 1,000 completed interviews, meaning we often call or email 10,000 or more -- randomly.
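For anyone who wants to check that math, the standard worst-case margin of error (p = 0.5, 95 percent confidence) works out like this:

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    # z times the standard error of a proportion; the usual back-of-envelope formula.
    return z * sqrt(p * (1 - p) / n)

print(round(margin_of_error(146) * 100, 1))   # about 8.1 percent
print(round(margin_of_error(1000) * 100, 1))  # about 3.1 percent, the usual target
```

This ignores the finite population correction, which with roughly 11,000 freshmen and sophomores in the pool changes the number only trivially.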
Even anonymous surveys have issues with sensitive questions. People tend to over-inflate positive behaviors (attending worship services, voting, watching PBS), and under-report negative behaviors (drug use, sexual habits, drinking light beer). There's a graphic at the bottom of the article that sums up the results. Someone said they had 12 sex partners. I'd probably toss that as bad data because, frankly, I don't believe it.
The questions themselves seem straightforward, but that's only on the surface. With sensitive questions you want to preface them in such a way as to encourage an honest response. With voting, for example, we'll often preface the question of whether someone voted by saying something like "some people get busy or sick and can't vote, and some people do. How about you? Did you vote on election day?" That sort of thing. Make it easier for someone to answer honestly. Here, I don't see that. Indeed, they could have spent some time researching how to ask sensitive questions in surveys. I can't say anything about the question order -- which can have a huge effect on results -- because I don't see the questionnaire itself. In the news biz, you make all this available so we can judge it ourselves.
I could go on, but it's the Friday of Fall Break and honestly I have other things to do (er, I'm in my office, but I'm trying to finish a manuscript).