Except it's not safety; it's based on an SGA survey asking students about safety. You can see it yourself in this news story, complete with a couple of graphics breaking down the differences in perceived safety on campus and in Athens.
It's a good story idea. And it's an example of how not to do a poll story.
Here's what we don't know:
- Do the 1, 2, 3, 4, 5 (we call these "response alternatives" in the survey biz) reflect anything? Was a 5 "very safe" and a 1 "not safe at all"? We need that context. Indeed, I'd need to look at the whole questionnaire, because I also worry about question order. In a real survey, you'd randomize the order of these two questions; if not, one response could affect the other.
- And this lede: "A survey from Student Government Association rated off-campus safety 3 out of 5." What the hell is a 3 out of 5? I'm staring at that graphic, and I still don't get it. Three out of five? If you compute a mean score of all the responses to that question, you do get a 2.9 (round to 3). If you're curious, the mean for the "on campus" question is 3.8. But I had to do this by hand. And I shouldn't have. (The first sketch after this list shows the arithmetic.)
- How was this survey done? Yeah, I see there were 334 responses, but responses to what? People hanging out in Tate Plaza annoying hungry students? An online poll? A random sample, a convenience sample, or, even worse, a SLOP (a self-selected opinion poll)? This matters, and journalists are supposed to ask these questions about polls and make the answers available to readers. (The second sketch below shows why: the usual margin-of-error math only works for a random sample.)
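For the curious, here's the "3 out of 5" arithmetic as a minimal sketch. The response counts below are hypothetical, since the story never published the actual distribution; I picked numbers that sum to 334 and reproduce the 2.9 mean. The point is just that the figure is a weighted mean of the 1-to-5 scale points, which the story should have said.

```python
def likert_mean(counts):
    """Mean of a 1-5 scale, given a dict of {scale_point: n_responses}."""
    total = sum(counts.values())
    return sum(point * n for point, n in counts.items()) / total

# Hypothetical breakdown of 334 responses to the off-campus question
# (not the real data -- chosen only to illustrate the calculation):
off_campus = {1: 30, 2: 80, 3: 130, 4: 70, 5: 24}

print(f"Off-campus mean: {likert_mean(off_campus):.1f} out of 5")  # 2.9
```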
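And here's why the how-was-it-done question matters. If, and only if, those 334 responses came from a simple random sample, you could attach a margin of error to the results with the standard formula. For a convenience sample or a SLOP, this calculation is meaningless, which is exactly the point.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.
    p=0.5 is the conservative (worst-case) assumption."""
    return z * math.sqrt(p * (1 - p) / n)

# Valid only under the random-sample assumption the story never confirms:
print(f"+/- {margin_of_error(334):.1%}")  # roughly +/- 5.4%
```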