Thursday, April 20, 2017

Survey Says ...

There's a survey-based story in today's The Red & Black about how students pay for college. It's a good idea for a story, and it includes lots of useful information and interviews, so I'm not nitpicking the story so much as the survey itself, which is what I do, given I also teach Grady's graduate public opinion class. Here's a key graf:
The Red & Black completed a survey in which 100 UGA students shared their own encounters with payments throughout college and found over half of respondents have their rent and utilities paid for by their parents.
First, let's go with the information provided. A sample size of 100 means the margin of error is roughly 10 percentage points (at the usual 95 percent confidence level, the margin of error is about 1 divided by the square root of the sample size). That means the 49.5 percent who do not consider themselves financially independent could actually be anywhere between 39.5 and 59.5 percent, and a 10-point MOE means some of the results are actually statistical ties. And that assumes it's a random sample, which is the only kind to which you can truly apply a margin of error. We don't get a lot of methodological detail here. How was the survey conducted? When? How were respondents selected? Is this a convenience sample? A SLOP?
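Here's a minimal back-of-the-envelope sketch of that math in Python, assuming a simple random sample and the usual 95 percent confidence level, using the numbers from the story:

    import math

    def margin_of_error(p, n, z=1.96):
        """Approximate 95 percent margin of error for a proportion from a simple random sample."""
        return z * math.sqrt(p * (1 - p) / n)

    p, n = 0.495, 100   # 49.5 percent of 100 respondents
    moe = margin_of_error(p, n)
    print(f"MOE: +/- {moe * 100:.1f} points")                        # about +/- 9.8 points
    print(f"Range: {(p - moe) * 100:.1f} to {(p + moe) * 100:.1f}")  # roughly 39.7 to 59.3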

In fairness to the audience, you should make clear in a sentence or two how the survey was conducted, and if it's non-scientific, say so high in the story so the reader can approach it with healthy skepticism. If that's in the story and I missed it, please let me know.

Finally, a journalism point. A survey of 100 is really more of a man-on-the-street approach, but on steroids, given we often interview at most six people for that sort of thing.






Monday, April 17, 2017

Face-To-Face vs Online Surveys

The general idea is that controversial or sensitive survey questions get very different results when asked face-to-face or on the phone than they do via a more impersonal approach, such as online. Let's see if that's the case using fresh ANES 2016 election data, which records whether a respondent took the survey online or face-to-face.

  • Is Obama a Muslim? Percent who say Yes:
    • F2F: 31.1%
    • Online: 30.3%
  • Voted for Trump
    • F2F: 41.3%
    • Online: 39.2%

OK, let's look at the first two above. Clearly the Obama Muslim question has no mode effect; in other words, asking it face-to-face versus online makes no difference, given 31.1 versus 30.3 percent think he's Muslim. On voting for Trump there's no real difference either. We're talking a couple of percentage points, nothing significant, or at least not substantive. OK, let's try another.
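If you want to put a number on "nothing significant," a two-proportion z-test is the quick way to check whether a gap like 41.3 versus 39.2 percent is bigger than sampling noise. This is just a rough sketch; the group sizes below are illustrative placeholders, not the actual ANES mode sample sizes:

    import math

    def two_prop_z(p1, n1, p2, n2):
        """Two-proportion z-test: is the gap between two sample percentages bigger than chance?"""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # Voted for Trump, face-to-face vs. online, with made-up group sizes for illustration
    z = two_prop_z(0.413, 1200, 0.392, 3000)
    print(round(z, 2))  # about 1.26, well under the 1.96 needed for significance at the .05 level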

  • Should transgender people have to use their birth sex bathroom?
    • F2F: 50.2%
    • Online: 52.6%
Slight difference above, but again it's slim, too slim to have any real meaning. How about something more mundane?
  • Do you attend religious services?
    • F2F: 64.5%
    • Online: 58.2%
There's something going on in the question above, and it fits theory. Usually a socially desirable response (attending church, etc.) gets more "yes" responses in a phone or face-to-face survey, and that's the case here by several percentage points, enough that I'd argue survey mode matters. Here's another that's kinda interesting below:
  • Favor a wall on Mexican border
    • F2F: 29.5%
    • Online: 33.4%
That's a decent spread above, enough that I'd argue we have a small mode effect. Respondents were a little more willing to favor building a wall in the online group versus the face-to-face group.
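For anyone who wants to reproduce this kind of breakdown, below is a rough pandas sketch of how a weighted percentage by survey mode gets computed. The file and column names are placeholders, not actual ANES 2016 variable names, so you'd have to map them to the real codebook:

    import pandas as pd

    # Placeholder file and column names -- the real ANES release uses its own coded variable names.
    df = pd.read_csv("anes_2016.csv")

    def weighted_pct_yes(sub, item):
        """Weighted percent answering 'yes' (coded 1) on an item within a subset of respondents."""
        return 100 * (sub["weight"] * (sub[item] == 1)).sum() / sub["weight"].sum()

    for mode, sub in df.groupby("mode"):  # e.g., face-to-face vs. web
        print(mode, round(weighted_pct_yes(sub, "favor_wall"), 1))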

I can do this all day, but I'm running out of time. As we can see above, there are mode effects, and sometimes they matter, but sometimes they don't matter at all.

Wednesday, April 12, 2017

In My Mail

I got some lovely mail the other day, a big white envelope that said:

DO NOT DESTROY
OFFICIAL DOCUMENT

It's tax time and there's nothing scarier than official documents that you should not destroy. I opened it to find a 2017 Congressional District Census and a questionnaire that is, of course, designed to measure objective opinions on the issues of our time. Here are some of the questions:

  • The first asks how I identify myself politically. The choices are conservative Republican, Independent voter who leans Republican, Democrat, Moderate Republican, Liberal Republican, or Tea Party Member. Notice there's one choice for Democrats but all kinds of flavors for Republicans.
  • Media use is great. They offer all kinds of possible responses to how I regularly receive my political news. So, for example, there's NBC/CBS/ABC as one choice. That's okay, the three broadcast networks lumped together. As someone who seriously researches this stuff, I can live with that. But then there's CNN/MSNBC together, which is silly. FOX News gets its own category, as do newspapers, radio, blogs, etc. And it wouldn't be Republicans if they didn't include a category called, and I'm not making this up, "Social Network." Not Facebook or Twitter, not social networks, but in all caps and singular, as if there's only one.
  • On the question of which five issues should immediately be acted on, every single option is written toward a conservative response. My favorite? "Cancel unconstitutional executive orders issued by Barack Obama." Clearly irony escapes these folks, given present circumstances.
  • Here's a question that's not leading at all -- "Do you agree that President Donald Trump and our Republican leaders in Congress should be aggressive in working to pass legislation to create jobs, cut taxes and regulations, end economic uncertainty and make America more competitive?" Really? That's a survey question?
  • Here's another winner: "The Democrats' fixation on "climate change" has led to costly regulations that are negatively impacting our nation's economy. Do you think climate change is a major threat to our nation?" Ya know, let's put that "climate change" in quotes. Because why not?
  • There is some good stuff here, like whether I support sending ground troops to Iraq and Syria, but most of them are complete bullshit.
Lemme be clear: this is not a real survey, it's really a way to raise money. They ask all these questions, and when you finally get to the bottom there's a pitch to "better deliver our message as we fight to Make America Great Again."


Tuesday, April 4, 2017

Mode and Vote Expectations

In addition to asking presidential candidate preference, we also often ask survey respondents to predict the election outcome. Indeed, there's some evidence that the second question is more accurate than the first, at least in terms of gauging an electoral result. Obviously that didn't happen this past presidential election year -- both preference and expectation called it wrong.

A lot of what we're looking at, when it comes to Donald Trump, is whether survey mode (face-to-face, phone, online, etc.) affects his results. The hypothesis is that on the phone or face-to-face, respondents are a little less willing to voice their Trump support. That's the hypothesis, but Pew just published a big study on this and found no mode effect. Here I'm looking at predictions of who will win and survey mode, based on freshly released 2016 ANES data. Caveat -- this is an early advance release, and a cleaner version of the data will be released soon.

Here's what I've got so far, hacking away between classes.

Using weighted data, we find that more people anticipated a Hillary Clinton win than a Trump win: 61.3 percent predicted Clinton would win and 34.8 percent predicted Trump would win (the rest are scattered across "other," refusals, and so on).

OK, how about mode?

In this case we're comparing face-to-face surveys with web-based (online) surveys. Glancing at the results, I don't see much of a mode effect on predicting the winner. Turns out, in both face-to-face and web-based surveys, the same percentage of people predicted Trump would win (34.8 percent). Clinton's share was slightly higher in face-to-face (64.3 percent) than online (60.3 percent), but that's not all that big a gap.
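One way to formalize "not much of a mode effect" is a chi-square test of independence between survey mode and predicted winner. The counts below are made-up placeholders just to show the mechanics, not the actual ANES cell sizes:

    from scipy.stats import chi2_contingency

    # Rows: face-to-face, web. Columns: predicted Clinton win, predicted Trump win.
    # These counts are placeholders for illustration only.
    table = [[770, 415],
             [1860, 1075]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(round(chi2, 2), round(p, 3))  # a large p-value here would mean no detectable mode effect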

Simply put, survey mode made no substantive difference in who people predicted would win the 2016 presidential election. What's fascinating, at least to me, is that, best I can tell, this is the largest "miss" on this ANES question going all the way back to 1952. I'll write more on that another day, when I can dig up my data from 1952 to 2012.

Monday, April 3, 2017

Georgia Obesity

First, the good news. Athens-Clarke is the third least obese county in Georgia, with 25 percent of residents listed as obese. Who's better than my home county? Two metros, Fulton and Cobb, and the differences are 1 percentage point, so consider it all within the margin of error.

The most obese counties? The Top Five are below, with the percentage of folks considered obese according to fresh 2017 health data.
  1. Clayton (38%)
  2. Baldwin (37%)
  3. Worth (37%)
  4. Macon (36%)
  5. Emanuel (36%)
Those counties also are in the top quartile for smoking, just as an added punch to their ever-expanding gut.
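If you want to crank the same numbers yourself, here's a rough sketch of pulling the top and bottom counties from a county-level CSV with pandas. The file and column names are placeholders for whatever the 2017 health data download actually uses:

    import pandas as pd

    # Placeholder file and column names -- match these to the actual 2017 county health file.
    df = pd.read_csv("georgia_county_health_2017.csv")

    most_obese = df.sort_values("adult_obesity_pct", ascending=False).head(5)
    least_obese = df.sort_values("adult_obesity_pct").head(3)

    print(most_obese[["county", "adult_obesity_pct"]])
    print(least_obese[["county", "adult_obesity_pct"]])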





Driving Alone

Another post based on my playing with 2017 health data, today's installment looking at "driving alone" on your long commute, which is considered -- believe it or not -- unhealthy. So in Georgia, what counties lead the way in driving alone? Glad you asked, since I cranked the data and need to do something with it. Below, our Top Ten, with the percentage who report driving alone on long commutes in parentheses.

  1. Paulding (63%)
  2. Heard (60%)
  3. Bryan (58%)
  4. Jasper (58%)
  5. Brantley (58%)
  6. Talbot (57%)
  7. Crawford (55%)
  8. Pike (55%)
  9. Effingham (55%)
  10. Hancock (55%)
By the way, the county with the fewest folks, percentage-wise, driving alone is Dougherty. Clarke County, where I live, is 148th.

So what can we tell about the counties above? Folks who live there have to drive elsewhere to work.

Sunday, April 2, 2017

Guns on Campus

Back in 2016 when the Georgia legislature was considering "campus carry" I wrote about the irony of rules for faculty and staff that prohibit weapons on campus. I am updating that post as the latest "campus carry" bill awaits the governor's signature or veto.

(My money is on him signing it, as the exceptions address some of the concerns that led him to veto it last year.)

Quite simply, if the governor signs into law "campus carry," UGA will have to think about its own rules that ban employees from having weapons on campus. There's probably a story here. Just saying.

UGA, in its employment section on workplace violence, says it's against the rules to "possess, use, or threaten to use an unauthorized weapon as defined by the Policy." What's a weapon? At the end of the page it defines a weapon as "any objects that may be used to intimidate, attack, or injure another person or to damage property" and then points you to a page that doesn't exist: http://www.police.uga.edu/weapons.html. Don't bother, it's a dead link. Here's an archived version or here is an updated version with a different URL that sums it up for you. I did all the work. Enjoy.

Take a look at that list of weapons. It includes bats. So, you baseball and softball players, are you in technical violation? Nope. There's an exception for "legitimate athletic purposes" and a whole long list of other cases it doesn't apply to.

But the real point here is that if the governor signs "campus carry" into law, I would think UGA will have to alter its rules somewhat to give faculty and staff the same access to concealed carry that the law would give students who are over 21 and meet all the other requirements. That's a good question to ask the administration, I suppose, since if the governor signs the law it will take effect (I think) July 1.