Friday, October 31, 2014

No Pictures Please

It's the law. Don't take photos in the voting booth.

Simple enough, eh? Yet here we are, in the era of the selfie, so ya just know some self-absorbed schmuck is gonna pull out that smartphone and snap away -- and then share it on social media. Odds are no one will notice. Odds are nothing will happen, no one will care.

Unless, of course, you happen to be running for district attorney.

The story is behind a paywall, but don't worry. I was quoted in it and the reporter sent me a copy. I won't post it all here, but I'll hit the high points. Here's the lede by Amanda Thomas of the Douglas County (Ga.) Sentinel:
A photo of a ballot posted on the Facebook page of Douglas County District Attorney candidate Dalia Racine is reportedly being investigated by the Secretary of State’s office.


Racine put a photo of her electronic ballot on Facebook, along with a pic of her holding an "I voted" sticker. When the reporter called to ask about it, the photo, coincidentally I'm sure, disappeared (the reporter grabbed a screen shot first). But the campaign never returned the reporter's calls or commented. Dumb. She should just make a joke of it ("I just wanted everyone to know I voted for myself"), apologize, and move on. Clearly she has no campaign adviser, or at least no adviser with a clue. And, as I was quoted in the story:
“There is a great irony in the fact that someone running for the chief law enforcement officer of the county would violate the law in the voting booth,” Hollander said.
Now she's likely to be investigated for this, though apparently not by the incumbent DA due to a conflict of interest. Now that's a smart candidate.

If you're dying to see the story, I can email it to you; otherwise, I try to respect a paywall.

Oh, the law itself, from the story:
O.C.G.A. Section 21-2-413(e) states, “No person shall use photographic or other electronic monitoring or recording devices, cameras, or cellular telephones while such person is in a polling place while voting is taking place.”

Yet Another Georgia Poll

A new Georgia poll is out, this one by Landmark conducted for WSB-TV. It shows:
  • Michelle Nunn tied with David Perdue, 47-47.
  • Nathan Deal ahead of Jason Carter, 48-46.
You can get more specifics about what I assume is a robo-poll, which is about the only way you can survey 1,500 "likely voters" in a single day. As such, use some skepticism when viewing the results. Landmark received a C+ in Nate Silver's pollster rankings, in part because it fails to call cell phones and doesn't take part in the polling transparency programs. According to Silver's analysis, it also leans slightly Republican.

African Americans made up 29.3 percent of the sample, and women 55.1 percent. Democrats were 39.7 percent of the sample, Republicans 46.3 percent. It's unclear, given how little methodology is presented, whether these numbers reflect raw results or were statistically weighted after the poll was conducted. Yes, that can make a difference.
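For the curious, here's a rough sketch of how post-stratification weighting works: compare the sample's demographic shares to your targets for the likely electorate, then weight each group up or down to match. Everything below except the 29.3 percent figure is made up, since Landmark doesn't publish its weighting targets or full crosstabs.

# A minimal post-stratification sketch in Python. The target shares and the
# candidate splits are hypothetical; only the 29.3 percent sample figure is real.
sample_share = {"black": 0.293, "other": 0.707}   # raw sample composition
target_share = {"black": 0.30,  "other": 0.70}    # assumed likely electorate

weights = {g: target_share[g] / sample_share[g] for g in sample_share}

# Hypothetical candidate support within each group.
raw_support = {"black": {"Nunn": 0.90, "Perdue": 0.10},
               "other": {"Nunn": 0.30, "Perdue": 0.70}}

weighted = {c: sum(sample_share[g] * weights[g] * raw_support[g][c]
                   for g in sample_share)
            for c in ("Nunn", "Perdue")}

print(weights)   # how much each group gets scaled up or down
print(weighted)  # the topline after weighting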

Thursday, October 30, 2014

Making Sense of the Senate Races

If you look at this compilation of comparisons about next week's U.S. Senate races, you'll see a consensus that it's going to be a good night to be a Republican. So, how do they stack up for Georgia's race between Nunn and Perdue?
  • Tossup -- three of the predictors rate it a tossup (Cook, Roth, and Sabato).
  • GOP Advantage -- All of the other seven rate it to some degree a Republican win, with the strongest being The Daily Kos at 85 percent and the weakest being the Princeton Election Consortium at 55 percent.
  • Democratic Advantage -- Um, no one in this group suggests such a thing will happen, though there are hints in the data that polls may be underestimating Democratic turnout.
Like many others, I see the Nunn-Perdue race as likely headed to a runoff, with neither candidate (i.e., Perdue) getting over 50 percent. The Upshot model also rates a runoff as likely. As it reports:
Georgia has had five previous statewide runoff elections. There were two in both 1992 and 2008 — each time for senator and for public service commissioner — and one in 2006 for public service commissioner. In all five of those elections, the Democrat lost.
This, of course, does not bode well for Nunn even if she forces Perdue into a runoff thanks to a Libertarian candidate nibbling away with a percentage point here, a percentage point there.  Democrats simply don't turn out as well in Georgia runoff elections as do Republicans. Assuming Perdue doesn't outsource his campaign strategy, he should -- based on previous turnouts -- win a runoff. All bets are off should he insert his foot squarely into his mouth on some issue, or if real-world events bend the electorate in a Democratic direction.

It'll be really interesting if we end up with Georgia's runoff deciding the fate of the Senate. Local television and radio stations will get rich on the advertising.


The Douchiest Schools

UGA is the only SEC representative on the latest GQ list of "douchiest schools," coming in at #13. A plus? A minus? Hard to say, just as it's hard to say anything at all about how they came up with the list. It is categorized as entertainment and humor, after all, so ya can't take it too seriously.

There's a Red & Black story, where I first learned of this year's list, or you can go through the list here and wait forever, click after click, to see the rankings and soak up their advertising. Or you can just read them below and, at the bottom, see the entry for UGA.

The Douchiest List
  1. Brown
  2. Duke (always should be #1)
  3. Princeton
  4. Harvard (that's three Ivies in the top four)
  5. Deep Springs (huh?)
  6. Bob Jones (well, yeah)
  7. Amherst
  8. Rollins
  9. Charter College Wasilla (again, huh?)
  10. Colorado
  11. NYU
  12. University of Phoenix (online douches)
  13. University of Georgia (go Douche Dawgs?)
  14. Arizona State
  15. Notre Dame
  16. USC (bust those Trojans)
  17.  University of Chicago
  18. Boston University
  19. Ohio State (should be higher)
  20. Morehouse
  21. Trinity (seems sacrilegious)
  22. Vassar
  23. Randolph-Macon
  24. Texas
  25. Virginia
As for UGA, you can read below their brief explanation. It makes absolutely no sense to me. I've been here 23 years and either I'm missing something or this list deserves #1 on the Douchiest Lists List. You tell me.




Guys & Dolls -- in UGA Majors

While doing something else with the data, I decided to look at UGA majors and their gender breakdown. In other words, which majors are mostly female and which are mostly male? My analysis is based on Fall 2013 data, the latest I have at my fingertips. Setting aside the majors with fewer than 10 people, the winners are ... well, a lot of 'em. These include all levels, from undergrad to grad.

Majors 100 Percent Female

Foods and Nutrition (MS)
Furnishings and Interiors
Reading Education
Special Education
Textiles, Merchandising & Interiors

As you can see above, these all-female majors are located in two colleges (Family and Consumer Sciences and Education). I doubt this really surprises anyone. Okay, what about the guys? Where are they?

Majors 100 Percent Male

Turfgrass Management

That's it, just one, at least among majors with 10 or more students. In the also-ran categories are Computer Systems Engineering (6.9 percent female), Physics (8.0 percent female), Electrical and Electronics Engineering (9.1 percent female), and Engineering (9.1 percent female). Wow, see a STEMish trend above? Of course you do, because my tens of readers worldwide are quick to grasp the data.

"Yeah yeah," you say, "Hollander always focuses on the negative. Are there any balanced majors?" Glad you asked.

Majors Split 50-50 by Gender

Animal and Dairy Science
Applied Biotechnology
Educational Psychology
General Business
Natural Resources Recreation & Tourism
Interdisciplinary Studies
Marine Sciences
Plant Pathology
Science Education
Social Studies Education
Workforce Education

There's no real trend above that I can see. It's an interesting mix of science and education, with even some business tossed in.
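For what it's worth, here's roughly how you'd pull these breakdowns if you had the enrollment data in a flat file, one row per student. The file name and column names are made up; the actual UGA data surely labels things differently.

import pandas as pd

# Hypothetical file and column names -- the real UGA enrollment file differs.
df = pd.read_csv("fall2013_enrollment.csv")   # columns: major, gender

grp = df.groupby("major")["gender"]
summary = pd.DataFrame({
    "n": grp.size(),
    "pct_female": grp.apply(lambda g: (g == "F").mean() * 100),
})
summary = summary[summary["n"] >= 10]          # set aside majors with < 10 students

print(summary[summary["pct_female"] == 100])         # all-female majors
print(summary[summary["pct_female"] == 0])           # all-male majors
print(summary[summary["pct_female"].round() == 50])  # the 50-50 splits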

If I had time, I'd look at trends over time. But as you may know, the Georgia-Florida game is this Saturday, and down here we start drinking early to properly prepare ourselves. It's our burden.



Wednesday, October 29, 2014

Latest Georgia Poll

Good story via the AJC on the latest Georgia polling. I'm going to focus just on the U.S. Senate race between Michelle Nunn and David Perdue. According to the poll (more details on it here):
  •  Perdue 49 percent
  • Nunn 41 percent
If you're a Perdue fan you're loving it. If you're a Nunn fan, not so much. But this is an interesting poll. It uses humans to call landlines and cell phones, the "gold standard" of polling. That's a plus. It only called 436 likely voters. That's a small N, so small that the margin of error is 4.7 percent. In other words, based on the margin of error:
  • Nunn's real number could be from 36.3 to 45.7 percent.
  • Perdue's real number could be from 44.3 to 53.7 percent.
So the numbers overlap, meaning in a strictly statistical sense it's a tie. Still, you also look at trends, and the more recent polls have all shown Perdue ahead, so the real question to me isn't so much who will get the most votes next week as whether Nunn can keep it to a runoff in which, historically, she's likely to lose anyway. But that's another post for another day, necessary only if there is a runoff.
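If you want to check the arithmetic, the back-of-the-envelope margin of error for a simple random sample is 1.96 * sqrt(p * (1 - p) / n). Here's a quick sketch in Python; it ignores design effects and weighting, which real pollsters fold in, so treat it as a rough check rather than the pollster's actual math.

import math

def moe(p, n, z=1.96):
    # Approximate margin of error for a proportion from a simple random sample.
    return z * math.sqrt(p * (1 - p) / n)

n = 436
for name, p in (("Nunn", 0.41), ("Perdue", 0.49)):
    m = moe(p, n)
    print(f"{name}: {p:.0%} +/- {m:.1%} -> {p - m:.1%} to {p + m:.1%}")

# The reported 4.7-point margin is the worst case, p = 0.5:
print(f"Max MOE at p = 0.5: {moe(0.5, n):.1%}")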

Interestingly, this poll is 55 percent female and, even so, Nunn does poorly. I'm not sure exactly what that means, if anything. There's no racial breakdown provided, but the poll does a nice job of identifying likely voters. See below:
Georgia voters drawn from a list of registered voters who voted in at least one of the last four general or primary elections and indicate they are likely to vote in the upcoming election.



Slouching Toward a Curriculum

The UGA journalism department, now made up of the old j-dept. and broadcast news, voted earlier today on its new curriculum. I wrote about it some time ago, and the latest version looks close enough that the earlier post can be your guide. Basically it's heavy on skills classes, heavy on newsroom work, heavy on students getting exposure to lots of different skills, such as work in video. The old 3410 lecture-lab class, for example, will go away. Students will do capstone classes tied either to the NewsSource broadcast or to putting stuff on its web site (plus, of course, still doing stuff for R&B, etc.). Specialties include investigative reporting, and there are classes in data and coding.

There were two votes against it, perhaps due to the curriculum's heavily undergrad focus and its lack of flexibility, both reasonable arguments. One of the negative voters asked three or four questions but never offered explicit criticisms. The other "no" vote didn't say a thing. In full disclosure, I'm not as happy about the new curriculum as others are, but I think it's an interesting approach and worth a try. I worry we may be creating generalists, not specialists. But I may be wrong (words, by the way, you'd never hear uttered by our two "no" vote faculty who, best I can tell after all these years, are never wrong about anything. Nor do they ever laugh at funny stuff in meetings. Sheesh.).

Okay, so what's next? The package of classes is sent later today to the Grady College curriculum committee. Assuming no problems there, it'll go to the Dec. 10 meeting of the entire Grady faculty. Assuming no problems there (yeah, yeah, lots of assuming), it'll head "up the hill" to the university curriculum process.

We hope to have a new and improved curriculum in place by Spring 2015 ... make that Spring 2016. Nothing happens fast at a big university, unless of course the president or provost or athletic department really wants it, then stuff happens really really fast. Funny how that works.






Tuesday, October 28, 2014

Topsy Turvy Georgia Senate Race

A fresh new SurveyUSA poll is just out and it flips the Nunn-Perdue U.S. Senate race. As they report:
One week ago, Nunn led Republican David Perdue by 2 points, 46% to 44%. Today, in a dramatic reversal, Perdue is on top, 48% to 45%, a 5-point right turn in one of the nation’s most high-visibility contests.
So, huh? Is this real movement? A statistical blip? They make a lot of that "5-point right turn" when, honestly, all of this is within the margin of error. There's some interesting stuff, though. For example:
Among women, where Perdue had trailed by 13 points and now trails by just 2. And among core Republicans, where Perdue’s 84-point advantage is the largest it has been in 7 WXIA-TV tracking polls going back to 08/18/14. There is movement to Perdue among seniors, where he now leads by 25 points. Worse for Nunn: among voters who tell SurveyUSA they have already returned a ballot, Perdue leads by 10 points.
That doesn't bode well for you Nunn fans. Yes, this is a robo-poll, but so was the previous one.  

If I have time, I'll dig deeper into the weighting and such to try and understand what, if anything, is happening.

When Research Goes ... Wrong

The problem with real-world experiments is, sometimes, they involve the real world. And the real world doesn't often fully, um, appreciate your efforts. Case in point, this political research study (via colleague Karen Russell). I may add my own comments later.

Update 1

I posted this to one of my listservs (AAPORNet), which is made up of folks who do political polling and public opinion research, and asked if anyone had any thoughts about it or knew much about it. Instantly had 93 hits on the link, but not a single response on the listserv. So far.

Update 2

Now up to 270 clicks, though some came from another listserv, one of data journalists. Also, links are emerging to responses that ably criticize the criticism of the field experiment.

Monday, October 27, 2014

Georgia Senate Race

How close is the U.S. Senate race between Democrat Michelle Nunn and Republican David Perdue? Check out HuffPo's summary of the last few days of polling on just this race.



I've analyzed these recent polls already, but what sticks out for me is how well Swafford does in the AJC poll (6 percent) versus the CBS/NYT/YouGov poll (1 percent). Those two polls agree on a slight (within the margin of error) lead for Perdue. The CNN poll has Swafford with 5 percent of the vote and a Nunn lead (again, within the margin of error), so it's hard to say what's going on here other than it's going to be a close race and will probably result in a runoff -- in which Perdue (I'm thinking) will have an advantage.

The highest Perdue has ever been in all Georgia polls is 50 percent (Insider/Advantage on 9/10-11) and the highest Nunn has ever been is 47 percent (multiple times).

Luckily the election will be soon. The only reason we have elections, after all, is to find out which poll was right.

Friday, October 24, 2014

Misinformation

It's a nice Friday, so briefly with little discussion I point to this study about how voters become misinformed. As the abstract notes:
Our analysis reveals that voters' values and partisanship had the strongest associations with distorted beliefs, which then influenced voting choices. Self-reported levels of exposure to media and campaign messages played a surprisingly limited role.
This is not unlike what I found in beliefs about Obama. Really, it has more to do with what people choose to believe. Even exposure to corrective information appears to have little influence except, some studies suggest, to push people toward believing the conspiracy theory or misinformation even more strongly.

AAU & UGA

I've written before, first here, and then here, about UGA likely salivating over the idea of being invited to the prom -- better known as the Association of American Universities. The AAU is the snobby elite group of research universities in North America. I hadn't given it any more thought after my two September posts until today when, in a fit of boredom, I glanced at the University Council executive committee's proposed agenda and saw this routine item about an update to the university's "parental policies." It opens with this line:
As part of improving the national standing of UGA and implementing policies that are more in line with AAU universities ...
I find that fascinating. And telling. Is UGA trolling for an invite? You can't ask to join; you have to be invited (see my earlier posts for how the process works). Just to check, I did a quick search of the UGA site for any mention of AAU or Association of American Universities. I didn't find any smoking guns, no other memos, no other instances of a vast AAU conspiracy afoot.

I should point out The Red & Black did a story on this in 2012. It's worth the read. And, perhaps, an update.

Nunn, Perdue, and a Plethora of Polls

A bunch of Georgia polls came out today. If you're confused, don't be. Yes, most of the recent U.S. Senate polls had the Democrat, Michelle Nunn, slightly ahead of David Perdue, the Republican. And yes, a "gold standard" AJC poll just released has Perdue slightly ahead of Nunn. But again, don't be confused. Basically these leads are all within the respective margins of error of the surveys, so call this race a statistical tie.

Before the AJC poll, the Huffpollster wrote:
IN GEORGIA: CNN MAKES FIVE POLLS FAVORING NUNN - A new CNN/ORC poll gives Democrat Michelle Nunn a 47 to 44 percent edge against Republican Rep. David Perdue. This survey is the fifth in the past week to give Nunn a slight advantage, with earlier polling ranging between 1 and 3 percentages points. Twelve of 14 previous polls conducted since early September had given Perdue advantages ranging from 2 to 10 percentage points.
Also, written before the AJC stuff, this good NYT piece.

So, has anything really changed?

I said "gold standard" before concerning the AJC poll. That means humans telephoned respondents on both landlines and cell phones, something many polls -- especially robo-polls -- fail to do. There are also subtle differences in the secret sauce of who gets identified as a "likely voter," as well as in statistical weighting. In other words, even good polls are expected to differ some. My own read is the Nunn-Perdue race is simply too close to call and it comes down to the "ground game" on election day, barring some October surprise in the next coupla weeks.

Let's take a closer look at the AJC poll. A few key points you may not have seen:
  • Nunn leads 49-36 over Perdue among women.
  • For Perdue, the lead among men is 52-35.
  • Nunn leads 44-34 among young voters.
  • Perdue owns the white Protestant evangelicals, 72-16.
  • Perdue also does well among self-described "independents," 43-31.
  • Nunn dominates in metro Atlanta, Perdue wins the burbs.
  • Perdue leads among the wealthiest likely voters, 55-30. Indeed, it's the only income level he does lead in. Nunn leads the rest.




Wednesday, October 22, 2014

Gender at UGA

I was skimming data on gender breakdowns of faculty at UGA, doing college by college, and then for the hell of it ran the analysis for the UGA president.  As there's only one prez, the following funny graphic appears on your screen.


Joking about Ebola

Is it okay to joke about Ebola?

I'm not setting up a joke, I'm asking (seriously, really) whether it's okay, or when it's okay, to joke about something so serious. And, from a scholarly standpoint, why we make such jokes.

First, it's probably never okay to joke about Ebola on an airplane. Then again, it's never okay to joke about anything on an airplane. After 9/11, the federal government ordered the surgical removal of a sense of humor from all airline pilots. Just don't go there.

I skimmed the social science literature, looking for guidance. Best I can tell, extending the research I glanced through to Ebola, we may joke about this terrible disease to:
  • Cope. We do this with other diseases, and there's research to suggest joking about it helps, especially patients but also caregivers.
  • Reduce Fear. Like coping above, we joke about it like we whistle past a graveyard at night.
  • Bad Taste. Some people do it because it raises the ire of others. Call this the asshat effect. Or late-night host effect.
  • Bravery. If you can joke about it, it's because you're not scared of it and you want everyone else to know.
  • Healing. Kinda like others above, the notion that joking about something is the first step to recovery.
I'm probably missing some, but I was skimming the literature and didn't dig deep. Plus some of those above could probably be collapsed into a single category.

Georgia Races

A new poll out this morning has the two big Georgia races (Governor and U.S. Senate) each in a statistical tie. Deal leads Carter, 45-43, and Nunn leads Perdue 46-44. The margin of error is 4.1 percent.

The SurveyUSA poll is a combo of robo-poll calling (landline) and online surveys of 606 "likely" voters. As it reports in the fine print:
This research was conducted using blended sample, mixed mode. Respondents reachable on a home telephone (70% of likely voters) were interviewed on their home telephone in the recorded voice of a professional announcer. Respondents not reachable on a home telephone (30% of likely voters) were shown a questionnaire on their smartphone, tablet or other electronic device.
A "gold standard" poll, in which people are called by live human beings at both their landline and cell phones, will be released later this week by the AJC. Pay attention to that one. 

Tuesday, October 21, 2014

SEC East vs SEC West

Everyone knows the SEC West kicks football ass this year and, until the Georgia-Arkansas game, it owned a clean slate over the SEC East. (Go Dawgs). Yeah yeah, so much for football. But what about SAT scores?

I used the data from this site to do a comparison. I dumped it into an Excel file and sorted the schools into East and West. These are the lower numbers: the 25th percentile, the score that the bottom quarter of admitted students scored at or below. As you can see from the site, just eyeballing the data, the East schools seem to do better. If you sort the teams by region and average the scores, you get:

West: 500 Reading and 519 Math
East: 557 Reading and 569 Math

An advantage, for you math non-majors out there, of 57 points for the East in Reading, and 50 points in math. The differences are a bit less stark if we look at the 75th percentile, but they still favor the East by 43 points in Reading and 40 points in Math.

"But wait," you might say. "That's not fair. Vandy is in the East. They can actually read and write there."

Good point. So I excluded Vanderbilt and there's still an East advantage. Without Vandy, the East outscores the West on the 25th percentile Reading by 35 points and the 25th percentile Math by 26 points. At the 75th percentile level, the advantage to the East is 25 points for Reading, 21 points for Math.

So maybe the West is kicking ass in football this year, but it only takes a friggin 470 in Reading to be at the 25th percentile at Mississippi State, those other Bulldogs. The Reading table, 25th and 75th percentiles, is below.



Team                 Reading 25th   Reading 75th
Alabama                   500            620
Arkansas                  500            610
Auburn                    530            630
LSU                       500            610
Ole Miss                  480            600
Texas A&M                 520            640
Mississippi State         470            610
Average West              500            617

Florida                   580            670
Georgia                   560            650
Kentucky                  490            610
Missouri                  510            640
South Carolina            540            640
Tennessee                 530            640
Vanderbilt                690            770
Average East              557            660

East vs. West Diff         57             43

East Minus Vandy          535            642
Minus Vandy Diff           35             25
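If you'd rather not trust my Excel fingers, here's the same averaging done in a few lines of Python, using the Reading numbers from the table above:

# Reading scores (25th percentile, 75th percentile) from the table above.
west = {"Alabama": (500, 620), "Arkansas": (500, 610), "Auburn": (530, 630),
        "LSU": (500, 610), "Ole Miss": (480, 600), "Texas A&M": (520, 640),
        "Mississippi State": (470, 610)}
east = {"Florida": (580, 670), "Georgia": (560, 650), "Kentucky": (490, 610),
        "Missouri": (510, 640), "South Carolina": (540, 640),
        "Tennessee": (530, 640), "Vanderbilt": (690, 770)}

def avg(scores, idx):
    return sum(s[idx] for s in scores.values()) / len(scores)

for idx, label in ((0, "25th"), (1, "75th")):
    w, e = avg(west, idx), avg(east, idx)
    no_vandy = {k: v for k, v in east.items() if k != "Vanderbilt"}
    ev = avg(no_vandy, idx)
    print(f"{label}: West {w:.0f}, East {e:.0f} (diff {e - w:.0f}), "
          f"East minus Vandy {ev:.0f} (diff {ev - w:.0f})")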


Monday, October 20, 2014

Methodology Matters

It's all in how you measure stuff. Yes, friends, methodology matters.

Take this list for example. In it, I'm happy to report, UGA's graduate journalism program is ranked #5 in the country. That's very cool. No doubt we'll plaster it on the web site, toss it out on Twitter, and buttonhole random strangers in the parking lot to tell 'em the news.

Okay, but what about our PR graduate program? Our PR program (don't tell them I said this) is very likely the best, or among the three best, in the country. By any measure. So how'd it do? Check out the PR list here, or just allow me to tell you it's not ranked. At all. It has:
  1. Georgetown
  2. Rowan College
  3. Mississippi College
  4. Florida A&M
  5. Miami (Fla.)
And so on through the top 15, a list that, if you know anything about the best PR programs, leaves you no choice but to say, WTF?

So we return to the question, how did they measure this? What's their methodology? Let's look at the fine print.
Graduateprograms.com reaches current and recent graduate students through scholarship entries as well as social media platforms. These program rankings cover a period from September 1, 2012 to September 30, 2014. Graduateprograms.com assigns 15 ranking categories to each graduate program at each graduate school. Rankings cover a variety of student topics, such as academic competitiveness, career support, financial aid, and quality of network.
Okay, we're clearly not talking a random, or even anywhere near random, sample. Scholarship entries? Social media?

What's interesting about the journalism list is there are no real surprises in it. It looks okay. Sure, it's missing Mizzou and Berkeley, but no odd programs pop up.

So what's happening here? My hunch is a lot of students conflate "journalism" and "public relations" and UGA's journalism program ranked higher than I might have expected, at least at the graduate level. My other hunch is the reliance on scholarship entries may bias the sample toward smaller, hungrier programs. It's impossible to say, but as always take these rankings for what they're worth -- fun, interesting, and good if you happen to come out on top.



Friday, October 17, 2014

I Take Credit

I take credit for apparently having killed the bad polling practices of our student newscast. The last god-awful pseudo-poll the j-students posted on the Grady Newsource site was the revised one from Oct. 8. They seemed to do one of these things every week. Until now.

I wrote at length about their reporting of bad, self-selected polls. You'll find my rather colorful language here, and then here. There are others, but you get the idea, and from them you can work your way back to a more technical explanation of why such SLOPs suck and represent bad, misleading journalism. What's odd and a bit troubling, though, is not a single j-prof who oversees the newscast, nor a student who actually puts it out, came to talk to me. I did get a weird phone call, mentioned in one of the posts linked to earlier in this graph, but that's it. I am possibly the most up-to-date faculty member in the building when it comes to polling. Hell, I teach our graduate-level public opinion class. Plus I've taught classes in public opinion reporting.

If nothing else, perhaps I've killed this practice. I try to watch the newscast every day, plus I always check the site and, especially, follow Newsource's excellent Twitter feed. We'll see. Yes, Newsource, I've got my eye on you.

Thursday, October 16, 2014

What People Know ... about Ebola

Kaiser has a new poll out that includes asking what people know about Ebola. The graphic sums it up, and the results? Not comforting.


Also see the report's Table 1, which looks at a set of questions and the education level of respondents. As you'd expect, the greater the education the more accurate the responses to health questions about Ebola.

Wanna Be a Department Head?

The University of Florida is seeking a chair of its journalism department. I'd usually not bother writing about this, but UF is my alma mater (master's and PhD, finishing in 1991), and thus it's a program I always watch. They ran this search a few years ago and Wayne Wanta, who has worked nearly everywhere, got the gig. He's stepping down, so a new search is under way. Here's an interesting bit of the job description:
A master’s degree is required for this 12-month position. 
Not a doctorate, just a master's, and "the successful applicant will (1) hold the rank of professor or meet the University of Florida's criteria for full professor upon hire and (2) be eligible for tenure upon hire." In other words, either be a full professor or have a significant enough background to justify that rank. A bit unusual for a Research 1 university like UF, but not so unusual in a journalism department.

I could not for the life of me find the job description from a few years ago, so I can't say for certain whether it required a doctorate for the job back then. If it previously did require a doctorate (and all of the finalists for the job back then held one), then I wonder whether this job description is written with someone in mind, perhaps someone already in the department. I have no idea, and it's the dean* who ultimately decides these things. Coincidentally, she has a bachelor's degree -- from UF -- so it's hardly a surprise that she'd open up the search for a department head to include someone without a doctorate. The professional and academic fields are changing.



* in full disclosure, the present UF j-school dean,
Diane McFarlin, was my boss while I was a reporter
at a Florida newspaper.

Wednesday, October 15, 2014

Movement in Georgia U.S. Senate Race?

A post today at the AJC, based on a new poll, suggests there may be some movement toward Democrat Michelle Nunn against Republican David Perdue.

As they write (bold face by me):
Some caveats: The poll is a mixture of auto-dialing and online responses that showed Jack Kingston with a huge lead in the U.S. Senate primary runoff, but what matters here is the movement. A week ago the same poll had Perdue ahead by one percentage point.

I beg to, slightly, differ. It's dangerous to over-analyze "change" between two polls using the same error-prone methodology. Essentially, you have two robo-type polls with "change" in both instances lying within the margin of error. In other words, I'd argue there's not much change at all. They remain in a statistical tie.
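A quick way to see why I'm shrugging: when you compare two independent polls, the sampling noise on the change is bigger than the noise on either poll alone, because the two variances add. Here's a rough sketch in Python. The sample sizes and percentages are illustrative only (I'm not copying the exact numbers from the two releases), and it treats both polls as simple random samples, which robo-polls are not, so consider it a floor on the noise.

import math

def moe_change(p1, n1, p2, n2, z=1.96):
    # Margin of error on the week-to-week change between two independent polls:
    # the two sampling variances add before you take the square root.
    return z * math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# Illustrative numbers only, not the actual results from the two polls.
p_last_week, p_this_week = 0.45, 0.47
n_last_week = n_this_week = 550

print(f"Observed change: {p_this_week - p_last_week:+.0%}")
print(f"MOE on the change: +/- "
      f"{moe_change(p_last_week, n_last_week, p_this_week, n_this_week):.1%}")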

But -- this is important -- while there isn't much change, that's no fun to write about. That doesn't sell papers, or get clicks, or draw viewers. So of course this gets more attention than it probably, mathematically, statistically, deserves.

You can see more of the poll, with crosstabs, here. You can see the earlier poll here. A few interesting differences emerge in the makeup of the two samples. For example, the previous poll had 30 percent black. The more recent poll has 27 percent black "likely voters" in its sample.

These polls call landlines or reach smartphones to provide a questionnaire. No live interviews, no humans talking to humans (the gold standard of polling). That said, at least they're trying to reach people other than via landlines, though it's preferable to call cells (using humans, as it's illegal to robo-call cells).

Are there fundamentals that favor a Nunn upset? Not really. Still, it's possible, for very technical reasons, that many of the polling models are underestimating black turnout by a percentage point or three. The reasons are nerdy and PhDweeby, and I don't want to spend pixels explaining, but they have to do with Census weighting and the use of 2012 data to estimate 2014 turnout. Those teeny tiny percentage points, however, can make all the difference in the world in a close race. Would I bet on Nunn? Nope, not straight up, but if you give me 7 points, I'll take some of that action.





Tuesday, October 14, 2014

Who Will Win the Election?

I'm working on a longish paper, possibly to be presented in a few months, on my new favorite topic -- surprised losers in elections. These are folks who expected their candidate to win an election that the candidate actually lost, and I'm interested in the consequences of being surprised by the outcome. In preparation, I'm looking at presidential elections from 1952 to 2012 and who people predicted would win each election.

Below, a sneak peek. The blue bars represent respondents who favored the Democratic candidate and predicted that candidate would win. The red bars do the same for those who preferred the Republican candidate. As you'd expect, in runaway elections the gap is wide (look at 1972, for example). In closer elections, both sides expect their preferred candidate to be victorious (2000 being a good case study). I've got a lot more to do with this rather large data set, but this gives you a hint of where I'm going. Plus I have to go back and validate the data some more to ensure nothing weird is going on, but just eyeballing it -- all looks okay.




Friday, October 10, 2014

Class Size at UGA


How have class sizes changed at UGA over the years? Check out the table below. Look especially at the Lower Division classes with 11-20 students enrolled. Why the jump from 2009 to 2013? My theory is this reflects all those Freshman Odyssey classes, at 15 students per class, that UGA put into motion a few years ago.

The concern, though, is usually those massive, mob-scene classes. As you can see below, the number of classes with 200 or more students went down among Lower Division courses (I'm guessing 1000-3000 level classes, such as jour3410) and rose only slightly among Upper Division courses. So we're seeing no huge bump in those gigantor classes that fill a 300-student room in the MLC or, if desperate, Sanford Stadium. It looks to me like the growth is in mid-range classes, such as the 31-40 student size among Lower Division courses.


Course & Class Size 2009 2010 2011 2012 2013 Change
Lower Division 1 - 5 61 51 51 50 45 -16
Lower Division 6 - 10 95 76 67 80 91 -4
Lower Division 11 - 20 595 641 829 839 813 218
Lower Division 21 - 30 660 665 720 668 641 -19
Lower Division 31 - 40 237 245 274 269 334 97
Lower Division 41 - 45 52 47 50 59 45 -7
Lower Division 46 - 60 38 58 43 49 40 2
Lower Division 61 - 99 64 71 65 54 56 -8
Lower Division 100-199 55 56 60 58 60 5
Lower Division 200 up 75 62 65 71 67 -8
Upper Division 1 - 5 199 214 248 239 245 46
Upper Division 6 - 10 99 92 104 130 141 42
Upper Division 11 - 20 362 396 393 411 409 47
Upper Division 21 - 30 233 241 264 237 257 24
Upper Division 31 - 40 161 149 169 157 161 0
Upper Division 41 - 45 55 75 63 65 63 8
Upper Division 46 - 60 97 84 71 76 68 -29
Upper Division 61 - 99 67 63 64 58 69 2
Upper Division 100-199 31 30 39 43 47 16
Upper Division 200 up 17 20 21 19 20 3


Data Source: UGA Office of Instructional Research
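If you want to poke at the table yourself, here's a minimal sketch of the sort of thing I do with it. It assumes you've saved the table above as a CSV with the column names shown in the comment; those names are mine, not the Office of Instructional Research's.

import pandas as pd

# Assumes the table above was saved as class_sizes.csv with columns:
# division, size_range, y2009, y2010, y2011, y2012, y2013
df = pd.read_csv("class_sizes.csv")
df["change"] = df["y2013"] - df["y2009"]

# Biggest gainers and biggest losers in number of course sections.
print(df.sort_values("change", ascending=False).head(3))
print(df.sort_values("change").head(3))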

The Rudest Drivers

I love funky polls. Here's one that attempts to list the rudest drivers by state. Because the web page is a bit flaky, I've recreated the table below. Idaho of all places is considered the rudest and is most hated by people from Arizona. The fact the two states don't border baffles the hell out of me, but let's move on.

Georgia (where I live) ranks 25th in rudeness, and Florida folks find Georgians the most rude. Not coincidentally, Florida ranks 23rd in rudeness and Georgians find Floridians the most rude. The two states also disagree on important matters like football and water, so driving is no surprise.

Finally -- Hawaii drivers are hated most by people in Kansas? WTF?


The states with the rudest drivers -- and the states that hate them most

Rude rank State Most hated by drivers from:
1 Idaho Arizona
2 Washington, D.C. Maryland
3 New York California
4 Wyoming Montana
5 Massachusetts New Hampshire
6 Delaware Georgia
6 Vermont California
8 New Jersey New York
9 Nevada California
10 Utah California
11 Alaska Arizona
12 Louisiana Texas
13 Connecticut New Jersey
14 Rhode Island Massachusetts
15 Iowa Illinois
16 Oklahoma Texas
17 California Texas
18 Alabama Georgia
19 Arkansas California
20 Mississippi Tennessee
21 Colorado California
21 New Mexico Texas
23 Florida Georgia
24 Ohio Kentucky
25 Georgia Florida
25 Illinois Wisconsin
27 Texas California
28 Hawaii Kansas
28 Kansas Missouri
28 Virginia North Carolina
28 West Virginia Pennsylvania
32 Kentucky Ohio
32 Maryland Pennsylvania
34 Arizona California
35 Michigan Ohio
36 Indiana Illinois
37 Pennsylvania New Jersey
38 Tennessee Alabama
39 Missouri Kansas
39 South Carolina North Carolina
41 South Dakota Texas
42 North Carolina South Carolina
43 Washington Oregon
44 Nebraska Pennsylvania
45 Wisconsin Illinois
46 Oregon California
47 Minnesota Wisconsin
48 Montana District of Columbia
48 New Hampshire Minnesota
50 Maine Maryland
51 North Dakota Michigan
Source: Insure.com, based on a survey of 2,000 drivers in July 2014.


#FreeGurley

#FreeGurley, and let us unite against our common foe -- the NCAA

Thursday, October 9, 2014

At Least It's No Longer Scientific

Finally, a small victory for the corrupter. As you may or may not know, I've been bitching about the misuse of polls by GradyNewsource, the most recent post yesterday (read it, especially the part about me being a corrupter).

At least now the (oxymoron alert) "anecdotal" survey is flagged as not scientific, at least on air.

"Again, this is not a scientific poll ..."

Here's a journalism rule: if it's not, don't report it. Anyway, follow the link below to hear the brief story from yesterday's broadcast.
 
 Listen here.






Wednesday, October 8, 2014

Corruption is Me

As my tens of loyal readers know, I've been on a mission to pummel the NewsSource students into submission when it comes to running their online poll, better known in the industry as a SLOP. You can read my most recent post here, including a response to their tweet this morning (since, I think, deleted without explanation).

I replied to this morning's tweet and pointed to my blog, and had an odd phone call a little while ago asking if I was the tweeter who blogged the tweet, or some such thing. I said yeah. Now I know why.

Below is what you see for the "revised" survey.


Journalism is about, in part, transparency. To say "our previous anecdotal survey that we released was corrupted and recorded multiple responses from a single respondent" is a questionable, if not ethically challenged, way to respond. Let me help you.
  1. Your system allowed multiple responses.
  2. After voting, your system encouraged multiple responses by giving the option to "Submit Another Response." Think about that.
  3. You have now added "anecdotal" to the survey. Dudes, it was always at best anecdotal, and at worst misleading. Please, look up SLOP.
  4. You did not fully explain why there is a change. That's not treating your audience fairly. Ya know, transparency.
  5. And you suggest I corrupted a survey that you set up for multiple responses in the first place. Yeah, I'm a bit pissed. But more, this is about as ethically challenged a way to handle this as is humanly possible. I mean, WTF?
  6. But I am happy that now people cannot easily respond with multiple answers via Google. Good. But it's still a SLOP, so DO NOT REPORT SUCH RESULTS IN A NEWSCAST OR ON THE SITE.
I can't wait to hear about my corrupting influence in the 5 p.m. newscast. Maybe later I'll put on my trench coat and go hang around a playground.







The Telling Detail

I preach, until my students are sick of hearing it, about how important it is to note the telling details when reporting a story. So let me set the scene. It's my intro to reporting lab and I'm giving a 15-minute speech so they can practice their speech-story writing skills before going out to do live stories. I'm talking about some research I did on presidential elections.

While I'm talking, I'm holding this mug. It's covered with presidential election slogans. Only one student noted in her speech story that I was drinking from a presidential slogan mug while talking about presidential politics. Lesson learned -- at least by one student. Sigh.


Sigh. More Bad Poll Stories

- UPDATE BELOW -

I've bitched a lot, most recently here, about the bad use of poll data in news stories. Sorry, but I have to pick yet again on the students putting out the Grady NewSource broadcast/site. I'll keep pummeling them until they stop, much like I did The Red & Black many years ago when they fell into the SLOP trap.

If you visit the NewSource (buy an "s" dammit) site and click on this story, you get the following screen:


Click on the link to express your opinion and this comes up:



Allow me a few words from a j-prof who teaches writing with numbers, data journalism, and a graduate class in public opinion. Plus it's my blog, so I can say what I damn well please. Where do these issues come from? Why these five? What's missing?
 
Okay, so you choose the issue that concerns you the most and when ya finish, there's this page (see below). Notice the "Submit another response?" Yes, that's right. They're encouraging people to vote again and again and again, bringing fresh meaning to "vote early and often."


By the way, I've voted about 25 times so far. Just to make a point.

It's okay to run these SLOPs for fun, but it's never okay to run their results in a news story. But if you do (and you shouldn't. ever), you should preface them by noting the results are non-scientific and complete bullshit, but we're going to take up precious broadcast time anyway reporting them.

Will I ever stop complaining about this? Sure. When they stop running these polls and reporting the results as if they're news. They're not.

UPDATE

A tweet went out moments ago giving the link to the poll results so far. See the image below.


Here's the bad news. At least 25 of those 38 responses for "Government Reform" are from me. Over and over and over again. I kinda lost count. They may all be mine.

Why would I do this? To make a point, that you're over-reporting complete bullshit as news. Do. Not. Do. This.


Tuesday, October 7, 2014

Who's Ahead in Georgia?

A new poll is out on the Georgia gubernatorial and U.S. Senate races. According to the PPP robo-poll of "likely" voters:
  • Republican David Perdue leads Democrat Michelle Nunn, 45 percent to 43 percent (potential runoff as the Libertarian candidate is pulling in 5 percent, according to the poll)
  •  Republican incumbent Gov. Nathan Deal leads Democratic challenger Jason Carter, 46 percent to 41 percent.
Again, this is a robo-poll, so be skeptical, especially as PPP got only a B- for its efforts from statistical guru Nate Silver. According to the methodology (all AP Style errors are theirs, not mine):
PPP surveyed 895 likely voters from October 2nd to 5th. The margin of error for the survey is +/- 3.3%. 80% of interviews for the poll were conducted over the phone with 20% interviewed over the internet to reach respondents who don’t have landline telephones. 
Silver's lousy grades for PPP come in part from its numbers being off, but also from a lack of transparency in its methodology and, especially, its not calling cell phones. It's hard to say whether the data were weighted in any way.

No matter what you think of the PPP poll, there's one that's much much much worse.

Monday, October 6, 2014

Changes in Polling

If you're into polling, or understand its role in democracy, read this. It won't take long.

Vote Early and Often

I won't belabor the point I made several times, most recently here, about how bad an idea it is to report on such polls:

Sunday, October 5, 2014

Ebola

There's no reason to freak out over Ebola. It's hard to catch, hard to spread. A good list of 15 things to know about it is here. Or you can take the test below:


Let's look at how attitudes about Ebola in the U.S. have changed. Warning -- some of these are different questions, so compare at your own risk.

August 2014: A Harvard School of Public Health poll had 59 percent "not concerned" about an outbreak in the U.S. One-third said Ebola was spread "very easily" and another third said "somewhat easily."

September 2014: A CNN poll  had 34 percent "not too worried" and 39 percent "not worried at all" about contracting the disease.

So not a lot of change. Yet.



First Fall Freeze


It was near 40 degrees when I got up this morning. That's chilly for early October, but it's not even close to the coldest on that date. But let's talk earliest freezes. According to the data here, the historical earliest Fall freeze in Athens is Oct. 9, 2000. That's my news hook, as Oct. 9 will be this Thursday. That said, if you skim that web page, you'll see most of the earliest freezes occur later in October, or in November, or even as late as December 6 in one year. The average date for Athens' first freeze is Nov 7, so don't let this bit of cool air cause any panic. All is well.




Thursday, October 2, 2014

News Quiz

Pew's latest News Quiz is out. Test your current events knowledge. Be humbled. I got 10 out of 12 right, better than 92 percent of Americans.

Wednesday, October 1, 2014

No. No. No.

Maybe I'm being persnickety. An annoying methodologist. An asshat. But as someone who teaches graduate level public opinion and who is a poll nerd, I hate it when professional journalists screw up a story based on survey results. I hate it less when students screw up. They're learning, after all. And the students did a helluva job on deadline following up on some live, breaking stuff involving a local ATF shooting. Kudos for that. Nicely done.

So today, GradyNewsource reported the results of a completely bogus and bullshit online poll about UGA's tobacco ban. Here's a brief online version of the results, but I can't point to the video package. The students had ample warning. I first blogged about it here on Tuesday, and then warned them again today here.

From Newsource:
ATHENS, Ga- We asked. You answered.  The tobacco ban goes into effect on UGA’s campus today. What do people really think about this ban? More than 50 percent of people think the ban will not be effective. More than 75 percent think health will improve because of the ban. 65 percent of those polled believe the tobacco ban will not be sucessfully (sic) enforced on campus.
Yes, they successfully misspelled successfully.

On air, the reporter stood in front of a set of bar charts outlining the results. But he offered not a single explanation of how the poll was conducted, when it was conducted, the margin of error, or how many people even participated. This misleads the audience into thinking it was a legitimate, scientific poll.

I don't want to regurgitate my previous posts. Go read them to understand SLOPs and why they suck, and why journalists should never ever rely on them for a news story.

Even a real poll should have its methodological details revealed in the story. These are the basics. Hell, they're the bare minimum. But for a SLOP they're meaningless because a self-selected opinion poll, quite simply, has no meaning. There's no randomness to the sample. People can vote multiple times. I know. I filled it out four times, and someone else told me they did it 20 times.

Again, in fairness, these are students learning the craft. And these students never had me for jour3410, in which I carefully explain polls, how to report them, how to tell a good one from a bad one, and why you should never trust a SLOP.



Message Received?

I complained, at length, earlier this week about a Grady Newsource online poll that asked students for their thoughts about the new UGA campus tobacco ban. Read my bitching here. In a nutshell, such online polls are less than useless and should never be reported as news. I even noted how, when you finish the poll, it included a link so you could do it again. I'm up to four responses myself.

I never actually heard from any of the student journalists or the profs who advise them, but my message may have been received. Or it may be pure coincidence. How so?

Below is the original end-of-poll screen after you completed the questionnaire, complete with a "Submit another response" link just in case you wanted your opinion to really really matter. Below that is the screen you see today if you complete the poll. See the difference?

Again, it could be coincidental, but I'm not big on coincidences. I suspect someone at least made it so it's not so easy to bias the results and easily vote again. And again. And yet again. Of course, if you want, the poll is still online and you can bias away if you so choose.

Today at 5 p.m. I'll try to be near a TV (Channel 181 on Charter cable) or online at the site linked to above to see if they use the survey results in a story. I try to watch every day. Of course they shouldn't use the results in a newscast, but if they do I hope it'll be prefaced by saying:
"In a completely meaningless and scientifically bogus online poll, survey respondents said ..."
But I wouldn't count on it.

And please, don't say it's student opinions. You have no way of identifying the respondents. My cat could be filling it out, again and again. Or the protesters at Tate Plaza who dislike the ban could go on and really bias the results in their direction. If I were them, I'd do it.