It's an accepted truism in the political science literature that survey respondents overestimate how often they vote. Past studies have attempted to gauge this by checking the voting records of survey respondents, with some success.
(As a matter of information for you budding public records access scholars: you can't tell how someone voted, but you can tell whether they voted, at least in most states.)
There's a new report out at the ANES web site in which the authors examined the voting records of their 2008 national survey sample (report pdf here). In surveys, the proportion of folks who say they voted is notoriously higher than the actual proportion of people who cast a ballot -- hence we know some folks are, to put it nicely, fudging. Or so we thought. See below.
In examining public records, the authors reported four findings:
- Official government records "contain numerous errors" that make voter validation difficult.
- These errors differ widely from state to state.
- How states report their data makes it tough to compare results across states.
- And here's the biggie -- "We found that for respondents whose government records can be identified, the records and self-reports show very high levels of agreement. This finding implies that overestimation of turnout rates by surveys is attributable to factors and processes other than respondent lying."
This is comforting for scholars who use survey data to study voting turnout, though it kind of makes my dad's advice from above a bit out of date.