Looking at Georgia, and only Georgia, let's peek under the hood. A few key points:
- Turnout sucked. At last count, Georgia had the 13th-lowest turnout of registered voters, with about 34 percent participating. Democrats typically need high-turnout elections to succeed. The polls likely overestimated how many Democrats would actually show up on election day, or underestimated how many Republicans would vote.
- But there's no clear relationship between turnout and poll bias across the states. In states where fewer than one-third of registered voters bothered to vote, the pro-Democratic poll bias was 4.5 percent; in the remaining states, it was 4.1 percent. Not much there, at least so far.
- Okay, what about the exit polls? Not being a member of the sponsoring news organizations, I don't (yet) have access to the raw data. Best I can tell, Nunn received only 23 percent of the white vote; most observers figured she needed more like 30 percent to pull off a win.
As to why the polls were wrong ... lots of smarter people than I will weigh in on this. As Nate Silver wrote:
> This evidence suggests that polling bias has been largely unpredictable from election to election. Beyond the shadow of a doubt, the polling was biased against Democrats in 1998, 2006 and 2012. However, just as certainly, it was biased against Republicans in 1994, 2002 and now 2014. It can be dangerous to apply the “lessons” from one election cycle to the next one.

So, give it time before drawing conclusions about why the polls anticipated some races to be closer than they actually were. When I have more time, I plan to break down all the Georgia polls to see which had it close, which had it wrong, and by how much. Or, more likely, someone else will beat me to it.
One good starting point, though, is this NYTimes report of Georgia exit poll data. Georgia's electorate was slightly older and slightly whiter than in 2010, the last midterm election. More women voted than in 2010, but that did little for Nunn.