As I often analyze media surveys here -- the good and the bad -- it seems only fair I turn my attention to the good folks at The R&B. First, I recommend you follow the link above and read the story yourself, then continue with my analysis. I'm not talking here about the story, but the underlying survey. I could go on for days about the story itself.
Okay, done? Let's move on. First, allow me to copy-and-paste the opening graph. It's important.
Editor's note: For this study, 2,130 freshmen and sophomores at UGA were sent surveys via email. Of this pool of UGA students, 146 people responded to various questions regarding sexual orientation, alcohol, health and metrics regarding number of sexual partners and sexual encounters. Answers were received anonymously. For the purpose of this survey, "hooking up" was defined for survey takers as vaginal, anal and/or oral sex.

When analyzing a survey you often begin with the sample -- its size, its quality, how the survey was conducted. Then you turn your attention to question wording, question order, and a host of other factors that I explain to my journalism students in Jour 3410.
Note the info above, that 2,130 freshmen and sophomores were sent surveys via email. Why only this many? By my count, there are 5,197 freshmen and 5,892 sophomores in Fall 2013 (yes, I can look this up in about 10 seconds). Are these 2,130 a random sample of that larger pool? If not, you've already taken a wrong turn. It's possible this is how many students had not "restricted" access to their data, and hence their email addresses. If so, you've skewed the sample again. Who knows what differences may exist between students who choose to restrict access, but they certainly bias the sample in some way.
Now note in the info above that of 2,130 students surveyed, only 146 replied. That's not a terrible response rate, but typically we survey a lot more folks so we end up with enough completed interviews. An N=146 gives us a margin of error of 8 percent, give or take. Most surveys shoot for 3 percent, at worst 4 percent. That's why you often see surveys with the magic number of around 1,000 completed interviews, meaning we often call or email 10,000 or more -- randomly.
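If you want to check those numbers yourself, here's a quick back-of-the-envelope sketch using the standard formula for the margin of error of a proportion, assuming a 95 percent confidence level and the worst-case split of p = 0.5 (the usual assumptions behind the figures pollsters quote):

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Margin of error for a proportion at 95% confidence, worst-case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

def sample_size_needed(moe, z=1.96, p=0.5):
    """Completed interviews needed to hit a target margin of error."""
    return math.ceil((z / moe) ** 2 * p * (1 - p))

print(round(margin_of_error(146) * 100, 1))  # -> 8.1 (percent, for N = 146)
print(sample_size_needed(0.03))              # -> 1068 (the "magic" ~1,000)
```

So N=146 really does work out to roughly plus-or-minus 8 points, and hitting 3 points takes a bit over a thousand completed interviews -- which is why that ~1,000 figure keeps showing up.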
Even anonymous surveys have issues with sensitive questions. People tend to over-report positive behaviors (attending worship services, voting, watching PBS) and under-report negative behaviors (drug use, sexual habits, drinking light beer). There's a graphic at the bottom of the article that sums up the results. Someone said they had 12 sex partners. I'd probably toss that as bad data because, frankly, I don't believe it.
The questions themselves seem straightforward, but that's only on the surface. With sensitive questions you want to preface them in a way that encourages an honest response. With voting, for example, we'll often preface the question of whether someone voted by saying something like "some people get busy or sick and can't vote, while others do. How about you? Did you vote on election day?" That sort of thing. Make it easier for someone to answer honestly. Here, I don't see that. Indeed, they could have spent some time researching how to ask sensitive questions in surveys. I can't say anything about the question order -- which can have a huge effect on results -- because I don't see the questionnaire itself. In the news biz, you should make all this available so we can judge it for ourselves.
I could go on, but it's the Friday of Fall Break and honestly I have other things to do (er, I'm in my office, but I'm trying to finish a manuscript).