Monday, September 30, 2013

Science Quiz

Pew has a science quiz (to go with its jillion other quizzes) that lets you test your knowledge and see how you stack up against others.

By the way, I got 12 of 13 right. Fear me.

Don't cheat, but here's the report based on a random sample. Scroll down and you'll find different ways they break down the results (gender, education, etc.). There are also questions asking people where they think kids in the U.S. stand in terms of science knowledge. Interestingly, more get that wrong (44 percent say we're at the bottom in the world) than right (35 percent say we're in the middle). An overly optimistic 7 percent think we're at the "top."

Thursday, September 26, 2013

Surprised Losers -- First Look at the Data

My present research project really has two primary questions:
  • Are losers in prez elections who were surprised they lost (expected their guy to win) less trusting of government and democracy than losers who expected to lose, or winners?
  • Two competing media hypotheses: (1) people selectively expose themselves to news that supports their beliefs, thus enhancing the "surprised loser" effect above; or (2) news consumption exposes people to polls that make them less surprised they lost, thus depressing the "surprised loser" effect above.
So those two bullet points above, that's the gist of my study. It's a lot more complicated than that, and I'm using data from the 2000, 2004, 2008, and 2012 U.S. presidential elections, so it's a lot of variable recoding and wrestling and cursing. Especially the cursing part.

Okay, so how about a first blush of the results?

These are rough results. Really rough. And all they do is look at how winners in the elections (2000 thru 2008 so far, sorry) differ from two kinds of losers -- the expected losers, and the surprised ones. The dependent variables are trust in government, the perceived responsiveness of government (whether government cares about people like you), and satisfaction with democracy. Oh, and whether the election was fair.

Political Trust -- Not much here so far. In 2004 winners had higher trust than surprised losers (but not expected losers). That's it.

Responsiveness -- Mostly working here. Winners higher than losers, but not a lot of difference between surprised losers and expected losers. Damn.

Satisfaction with Democracy -- Yes, working here, kinda. In 2004 surprised losers far below expected losers and winners. Woo hoo! In 2008, both kinds of losers below winners. Damn.

Election Fair -- Only two times was this asked, 2000 and 2004. In general surprised losers much lower than expected losers and winners. So a good start.

I've only begun toying with media variables and so far it's too early to say which hypothesis, if either of them, wins.

I'm putting off the 2012 data because it's still messy and I'm working to get an updated version, somewhat cleaner, before tackling the initial analyses. Plus I'm stumped by my media variables and exactly how to test my competing hypotheses above. 

Wednesday, September 25, 2013

The Tedium of Research

I'm working on this big research project spanning four U.S. presidential elections and I'm hip deep in that tedious yet nerdishly appealing stage of writing scripts to extract data, recode it, and run initial analyses. If you have no idea what that means, you're among the few, the lucky few.

I have a basic analysis of only 2000. I'm working on the same analysis of 2004. It's damn slow going -- and that's coming from someone who is really good at SPSS and wrestling with large datasets. Do not try this at home. No, really. Don't. UGA has a neat virtual lab setup that includes accessing SPSS from your home box, but after test driving the thing with a modestly-sized set of data, it's my recommendation you do not try this at home. Too slow, too clunky. Reminds me of my days on a 300-baud modem accessing the mainframe.

Monday, September 16, 2013

Reporting, Opinion, and Al Jazeera

There's a neat Pew analysis out of how Al Jazeera America differed -- or actually didn't -- from the other cable news outlets in how it covered Syria, the first major story since the network premiered. The other outlets are BBC America, CNN, Fox News, and MSNBC.

Mostly, Al Jazeera looks a lot like the other networks. One columnist made a big deal of this in USA Today, but he misses the point: on a huge story like this there are only so many ways you can report it, and where Al Jazeera tends not to look like its cable brethren is in its other programming -- fewer talking-head shows, less partisan hackery. Give him points for stating the obvious, lifted from the Pew report, but no points for nuance and journalistic understanding.

You can even see a neat breakdown of reporting vs. opinion on the networks (it's the final graf of the study, reproduced below).

This is telling. CNN continues to dominate when it comes to reported news and Fox tends to come in last in the same category, or perhaps a statistical tie with MSNBC, but if so it's not by much. To say Al Jazeera looks like the rest is kinda missing yet another point. In the reported vs. opinion category, it actually does not look like CNN, and it is positioned to out-CNN CNN. That's the lede.

An N of 74 is a Survey?

I like to pick on bad surveys. Today's victim comes from a newspaper just down the road -- Gainesville, Ga. -- that reports on a survey with an N of 74. In other words, 74 respondents, collected over a month and a half, from a website.

Yes, 74. As in, if you do the math, at best an 11 percent margin of error. As in, useless. Because the 74 aren't even a random sample. It's a SLOP -- a self-selected opinion poll.
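For the curious, that 11 percent figure comes from the standard worst-case margin of error formula at 95 percent confidence. A quick sketch (the n of 74 is from the story; the rest is textbook):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error at ~95 percent confidence (p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(74) * 100, 1))  # ~11.4 percentage points
```

And that formula assumes a random sample, which this isn't, so even the 11 percent flatters it.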

Here's the lede:
Two of six key projects in the Gainesville master transportation plan have solid support, while residents are divided over a project involving a new bridge over Lake Lanier, according to online survey results released by the city’s consultant in the effort.
So you're thinking okay, a dull but straightforward story. The lede's a bit long but not terrible. I'd grade it a B-, maybe a C+. Until the second graf, that is:
The survey involved 74 responses to questions posted July 1-Aug. 15 on the city’s website.
A whopping 74 responses? Over such a long period, from a city website? The lede tells me it's an online survey, and I might be forgiven for thinking it's a real survey, a scientific survey, one with a decent sample and credibility. Basically, the second graf would be improved by saying "a completely useless piece of crap."

Indeed, read what the consultant says in the next couple of grafs. "Take it for what it is,” said a consultant. “It’s folks who chose to get on the Web page to give you a response. It is another form of input.”

That's consultant-speak for "this survey is complete bullshit, but it's kinda interesting bullshit, just another data point, but not a good one."

Journalists -- if it's bullshit, don't report it. Or if you report it, raise immediate critical concern in your lede that anyone would rely on this in deciding how to spend millions of hard-earned tax dollars. Instead the story goes to great lengths to report various percentages from the "survey" results. Sigh.


Thursday, September 12, 2013

Are Religious People Patriotic?

Fun with 2012 data. Here's the question:
How well does the word 'patriotic' describe most non-religious people [Extremely well, very well, moderately well, slightly well, or not at all /  not at all, slightly well, moderately well, very well, or extremely well]?
The answer? No surprise, really. Those non-religious people, they ain't patriotic, at least according to a national sample of U.S. adults. In fact, 2-out-of-5 see non-religious people as slightly or not at all patriotic.  Below, the numbers (data weighted to be representative of the U.S. population).
  • Extremely Well -- 3.8 percent
  • Very Well -- 13.8 percent
  • Moderately Well -- 35.7 percent
  • Slightly Well -- 23.6 percent
  • Not at All -- 13.5 percent
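If you're wondering what "weighted" means in practice: each respondent counts by his or her survey weight rather than as 1 when you tally percentages. A minimal sketch with made-up responses and weights (none of these numbers come from the actual dataset):

```python
from collections import defaultdict

# Hypothetical answers (1 = extremely well ... 5 = not at all) and case weights
responses = [3, 4, 3, 5, 2, 3, 1, 4]
weights = [0.8, 1.2, 1.0, 0.9, 1.1, 1.3, 0.7, 1.0]

totals = defaultdict(float)
for r, w in zip(responses, weights):
    totals[r] += w  # each case adds its weight, not 1

total_weight = sum(weights)
pcts = {r: 100 * t / total_weight for r, t in sorted(totals.items())}
print(pcts)
```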
Okay, but how about the patriotism of other groups? Glad you asked. We've got Protestants (7.9 percent "extremely well"), we've got Mormons (6.1 percent "extremely well"), we've got Catholics (8.5 percent "extremely well") and best of all, we've got Muslims (7.9 percent "extremely well," the same as Prods). 

That last one is interesting and heartening except, as you'll see below, not as much as you'd like.

The real differences seem to emerge in the "not at all" classification, the bottom of the patriotic pile. The folks seen not at all patriotic.  For Muslims, it's 33.8 percent. As a comparison, for Catholics, it's only 10.5 percent and for non-religious, it's 13.5 percent.  In other words, one-third of U.S. adults see Muslims as not at all patriotic, a number far and away above the other religious categories or even those damned-to-Hell non-religious folks.

By the way, on a separate question of all the above groups asking how violent they are, 7.9 percent see violent as describing Muslims "very well." About 1 percent see violent as describing Protestants "very well."

Al Jazeera

With a big splash, but with less audience, Al Jazeera premiered recently as the kinda sorta next big thing in U.S. cable news. The network vowed to focus on hard news, as opposed to the crap often seen on other cable networks (I'm talking about you, CNN, Fox, and MSNBC).

But what do people think about the network? Not much, if you believe this survey of over 8,000 respondents, and I especially like how they broke it down by state, including red states versus blue states, which you can pull off when your N is so big. Scroll down the page to find the maps.

Warning -- best I can tell, this is not a random sample. I can't tell at all how they got their respondents and that is, indeed, worrisome. That said, the numbers make sense.

Oh, you can see the raw data yourself on a Google docs page. Download it, if ya like, and crank away, though there's nothing there they don't show you on their graphs and maps.

Wednesday, September 11, 2013

University Rankings

The buzz this week is how my employer, the University of Georgia, cracked the Top 20 list of public universities, according to U.S. News & World Report. And rightly so. It's great news, looks good on a brochure, and UGA Prez Jere Morehead can slip it into Rotary Club speeches across the state. All well and good, but for fun let's look at a different list that also came out this week -- the QS World University Rankings.

First off, MIT is the top university in the world. Lemme say that again. In. The. World.

While being tops in the world looks even better on a brochure, I'm fairly certain our football team can take their football team. So where does UGA rank in the world? Glad you asked. Number 411. That doesn't work quite as well, screaming out "We're Number 411! We're Number 411!" at the top of your lungs in Sanford Stadium. Plus we're in a tie for that spot with a bunch of other places, a few of which I can find on a map.

Here's my point -- methodology matters.

In the U.S. News list, Princeton is the top university, but in the world list, the top spot belongs to MIT. Princeton finished 10th in the world. Lemme say that again: 10th. In. The. World. Not too shabby, but it does point out how two lists can differ because of methodology. The QS survey provides some details, the most "objective" of the lot being the quality of research citations, the number of citations per faculty member, and the H-index. As an aside, UGA's best world rank comes from citations per faculty (241st in the world) and its lowest from international faculty (760th). That's interesting in and of itself, admin folks. Something to think about.
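Since the H-index comes up: it's the largest h such that a researcher (or here, a faculty) has h papers with at least h citations each. A toy implementation, with made-up citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))  # → 3
```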

The U.S. News list provides few interesting methodological details and for the good stuff you've gotta pony up some cash. Needless to say, we're talking about two somewhat different ways to measure stuff. Still, we all love a good list, even the lists we hate. A list organizes the world, gives us something to argue about, and makes for a quick story so journalists make it on time to Happy Hour.

Finally -- the QS world list is -- of course -- deeply flawed. How else could the University of Tennessee be ranked slightly ahead of us?

Tennessee? Puhlease.

Monday, September 9, 2013

Public Opinion About SCOTUS

In honor of the SCOTUSblog symposium today on the UGA campus, I give you a few public opinion tidbits. Turns out, some folks have little patience with the U.S. Supreme Court.

Take this 2012 ANES question:

If the U.S. Supreme Court started making a lot of decisions that most people disagree with, would you favor, oppose, or neither favor nor oppose doing away with the Supreme Court altogether?
  • About 14 percent favor doing away with the court.
  • About 44 percent oppose such an action.
  • And 34 percent are undecided.
For those who can add to 100, note that the rest of the survey respondents are scattered among categories such as "don't know," "refused," or an incomplete interview.

So if I were in newswriting mode -- me being the pessimist I am -- I might say the lede is:
Fewer than half of Americans oppose doing away with the U.S. Supreme Court should it start turning out opinions unpopular with most people.
These data are not easy to use (I'm a pro, don't try this at home), but I plucked this one from the most basic of descriptions buried deep in a codebook that's 1,870 pages long.  I can go even further with this question, as in how much do respondents "lean" toward doing away or preserving an unpopular court. Results below are rounded, with the same caveats as before in terms of adding to 100.
  • Strongly favor doing away: 9 percent.
  • Favor, not strongly: 5 percent
  • Lean toward doing away: 4 percent
  • Not lean either way: 25 percent
  • Lean against doing away: 5 percent
  • Oppose doing away, not strongly: 9 percent
  • Oppose doing away, strongly so: 34 percent
I suppose there's some comfort in the fact that the bulk of people fall either in strong opposition to doing away with an unpopular court or in the "neutral" response.
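For the recoding-curious, collapsing that seven-point favor/oppose branch back into the three-way split is a one-line mapping. A sketch, assuming codes run 1 (strongly favor doing away) through 7 (strongly oppose), with the two "lean" codes folded into the middle alongside "neither" -- the actual ANES codes may differ:

```python
def collapse(code):
    """Map a hypothetical 1-7 favor/oppose code to the three-way split."""
    if code in (1, 2):      # strongly favor, favor not strongly
        return "favor"
    if code in (3, 4, 5):   # lean favor, neither, lean oppose
        return "neither"
    if code in (6, 7):      # oppose not strongly, strongly oppose
        return "oppose"
    return "missing"        # don't know, refused, breakoff

print([collapse(c) for c in (7, 4, 1, 6, 3, 7, 2, 9)])
```

Folding the leans into "neither" is what makes the earlier three-way numbers (14, 44, 34) line up with the seven-way ones.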

If I took more time, I could identify the socio-demographic or political factors that predict who is more likely to favor axing an unpopular court. I'd guess more conservative respondents, radically so, but that's only a guess. Perhaps with some prodding I'll run the analyses.

Thursday, September 5, 2013

Kooky Conspiracy Theories

I love conspiracy theories, and I've published research that attempts to explain who are the folks who believe Barack Obama is Muslim (spoiler alert -- conservative Christians).

I thought you'd like this, then: a quick compilation of the percentage of Americans who believe in the kookier theories. These are the percentages of U.S. adults, based on a national survey, who are certain -- definite -- in their belief. Add in the "probably" responses and the numbers increase; those are in parentheses. Yup, I've got the raw data.
  • Feds directed Katrina flooding into poor areas -- 2.5 percent (12.8)
  • Obama born outside the U.S. -- 5.6 percent (16.7).
  • Death panels are in Obamacare -- 8.1 percent (25.9).
  • Feds knew of 9/11 in advance -- 9.5 percent (26.4).
Look at those last two entries: roughly 1-in-4 Americans believe it's either definitely or probably true that Obamacare includes death panels and that the feds knew in advance about 9/11. Wow.

I expect to do a paper on this in the next few months, one that gets into who believes in this (demographics, political factors, media use, etc.). It's an area so ripe for research, I half wish I didn't have two other things in the queue to finish first.

Note: Corrected the second-to-last graf to reflect that the 1-in-4 figure refers to 9/11, not Katrina.

I'm Not So Smart

I took the Pew Research Center's latest news quiz and got 10 of 13 questions correct, better than 85 percent of the public, tied with 6 percent. Not so smart. It's tough and tricksy.

My Fiction Phase

I used to write fiction. Specifically, short stories, because tackling a novel scares the crap out of me. This essay, a blog hijack moment from its usual theme, is to explain why I started writing fiction, why I stopped writing fiction, and why it matters.

Why I Started

Face it, we all have a novel inside. Me too. But hacking out 75,000 or so words with almost no chance at publication never appealed to me. There's less investment in short fiction -- though a good short story is harder in some ways than a novel. There are three reasons why I started my fiction phase:
  1. To let off creative steam after a long time writing non-fiction (journalism) and (even worse) academic writing.  
  2. To re-learn the craft of writing from a different perspective to help in how I teach journalistic style writing.
  3. To get rich and meet chicks. Actually, neither of these is true. You don't make any real money writing short fiction and I'm already married.
Why I Stopped

The main reason I stopped is it's too damn hard. Oh, and at my very best, I'm a mediocre fiction writer. And the pay sucks. And I never met any chicks.

I did okay with my writing. I actually sold about 50 short stories over several years, almost all of them genre stuff (horror, fantasy, science fiction). I make no apologies for writing in the genre ghetto -- it interested me at the time and was a particular challenge. I even sold a few stories at what were considered "pro rates," and all of them brought in a little bit of money for my troubles. One story sold three different times as it got picked up by anthologies and the like. Remember this -- money from writing should always flow to the author. Never pay someone to publish your work.

During this period I read a lot of books on writing, very helpful books. My favorite is By Cunning and Craft, and although it focuses on fiction it's helpful for good narrative non-fiction as well. I learned a lot about writing during this stage.

Here's another aside. Publishing fiction is harder than publishing academic research. The top journals in my field accept maybe 10 percent of the manuscripts submitted, and I've landed a number of research articles in them, but the top genre magazines (never mind The New Yorker or The Georgia Review) accept maybe 1 percent of the manuscripts submitted. I never cracked the top fantasy or science fiction mags.

What's it all mean? I think I'm a better writer for having suffered through this stage, and believe me, writing fiction is all about suffering. I'm still a hack. I'm just a better hack than I was a few years ago. If writing comes easy to you, you're not doing it right.

It's also made me a better teacher of writing, or at least it's fooled me into thinking I'm better at teaching it than I was before my fiction phase. If I'm wrong, I don't wanna know. Leave me this crumb.

Wednesday, September 4, 2013

Obama is Still a Muslim

While messing with 2012 election data I visited an old friend, the question that asks respondents the religion of Barack Obama. I've published research on the topic, been quoted by major news orgs on it, such as this NYTimes piece. So I like to revisit it when time allows.

In the 2012 election, you'll be happy to know, misinformation thrives. Obama is Muslim, say 23.1 percent of U.S. adults in a national survey. But what about a face-to-face versus a web-based survey? Respondents were randomly assigned to one or the other. I'd expect the F2F version to produce fewer kooky responses. Turns out, I'm right.
  • F2F: 19.9 percent say he's Muslim
  • Web: 24.7 percent say he's Muslim
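Whether that gap clears significance depends on group sizes I haven't pulled, so here's a hedged sketch: a standard two-proportion z-test, with hypothetical ns of 2,000 per mode (the percentages are real, the ns are not):

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_prop_z(0.199, 2000, 0.247, 2000)
print(round(z, 2))  # |z| > 1.96 means p < .05 (two-tailed)
```

At ns anywhere near that size, a five-point gap is comfortably significant; with only a few hundred per mode it gets dicier.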
Five percentage points. That's probably statistically significant, if I took the time to test it. But it gets better. Among the choices was "not religious" as a description of Obama.
  •  F2F: 9.2 percent
  • Web: 16.8 percent
Interesting, except there's this -- "don't knows" on this question are significantly greater on the face-to-face version (29.5 percent) than the web-based version (3.6 percent) of the same question. I don't know quite what to make of this, and it's difficult to say whether the survey mode made any real difference given this odd "don't know" distribution. Perhaps people were uncomfortable with the question face-to-face and mumbled an "I dunno" more often, deflating the F2F numbers above. That's a reasonable hypothesis, and actually testable by seeing how often they did or did not respond to similar questions.

Maybe it's time for me to do another "Obama is a Muslim" study after all.

Tuesday, September 3, 2013

Word Order in Surveys

I was messing with data and came across this very simple wording experiment in the 2012 ANES time series survey. Respondents were randomly assigned to get one of two versions of the same question in which they're asked if voting is "mainly a duty" or "mainly a choice," the difference being the order in which these responses were presented. Did it make a difference? Kinda. And it kinda matters depending on whether respondents were also randomly assigned to the face-to-face or web versions of the survey. What I present below are weighted results.

First, here's the question itself:
Different people feel differently about voting. For some, voting is a duty - they feel they should vote in every election no matter how they feel about the candidates and parties. For others voting is a choice - they feel free to vote or not to vote, depending on how they feel about the candidates and parties. For you personally, is voting mainly a duty, mainly a choice, or neither a duty nor a choice?
That's the "duty first" version above (obviously). The "choice first" simply changes the order. Now, the results in which I combine them from both kinds of surveys:

                       Duty First   Choice First
Voting is a Duty          22.1          23.6
Voting is a Choice        21.2          19.7
As you can see, there's a recency effect of sorts. Offering the "voting is a choice" response first, with "duty" second, results in a slightly higher "voting is a duty" answer.

Okay, but what about F2F survey versus one on the web? One argument might be that in a face-to-face survey, social desirability might lead one to be more likely to choose the "duty" answer regardless of response order. That does happen, and mostly on the web version, and especially on the web version in which "duty" is offered second.

Our takeaway? If there is one, it's that subtle differences do occur depending on response order, which is why you randomize key questions -- especially the order in which responses are presented.
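Randomizing response order costs almost nothing in a scripted survey. A toy sketch of the ANES-style rotation, where "duty" and "choice" swap per respondent and "neither" stays last (labels paraphrased from the question above):

```python
import random

def present_options(rng=random):
    """Rotate the duty/choice order per respondent; 'neither' stays last."""
    first_two = ["mainly a duty", "mainly a choice"]
    rng.shuffle(first_two)  # half of respondents see each order, on average
    return first_two + ["neither a duty nor a choice"]

print(present_options())
```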

I'll report more of these as I come across them because, dammit, they interest me if no one else.