Wednesday, July 19, 2017

An Odd Survey

So, killing time on the UGA campus, I saw this flier.

So I checked it out. You can too by going here. There's no mention on the flier or on the survey itself of who is sponsoring this thing, what it's going to be used for, or anything at all, at least not at the beginning. So I did the survey despite not being a student.

First off, you get a set of "I believe ..." statements and you see a Likert-type response ranging from strongly disagree to strongly agree. Here are some examples of statements:

  • I believe that it is important to talk to others about societal systems of power, privilege, and oppression
  • I believe that it is important to help individuals and groups to pursue their chosen goals in life

That's just from the first page, but you get the idea. Page two gets a little stranger, asking how relevant each statement is to your thinking. By page three we return to statements for agreement or disagreement, such as this one:

It is better to do good than to do bad.

And after that we get into classroom stuff, which is fascinating, especially as it asks how you'd take a professor criticizing your opinions about gender, sexuality, etc. It's a rather long page of statements. Near the end we get the standard demographic questions, such as age or major.

Given the UGA logo, I can only assume this survey was sponsored or conducted by a UGA office, but it's hard to tell. The sponsor should be listed. It's not even clear if this survey went through human subjects approval, as there's no information about that, nor an informed consent statement.

Very odd.

Monday, July 17, 2017

The Expectations Game

I'm always fascinated not so much by who people say they're going to vote for as by who they think is going to win. Generally, people say their preferred candidate will win, and you can see that below. I'm playing with some 2016 election data. Of those who said in the pre-election survey they preferred Clinton, 96.2 percent predicted she would win. Of those in the pre-election survey who preferred Trump, 75.6 percent predicted he would win. For you statistical nerds out there, that's a chi-square of 1399.7, p < .001. In other words, a huge association. See the table below for a summary.

Who Will Win, by Who Voting For

                     Predicted Clinton win   Predicted Trump win
Preferred Clinton    96.2% (1,333)           3.8% (52)
Preferred Trump      24.4%                   75.6%
I excluded the handful of people who preferred or said they would vote for Gary Johnson and Jill Stein. Their numbers are too small to matter, at least when comparing the preference-expectation link.
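For the statistical nerds, the chi-square above can be reproduced from the crosstab counts. A minimal sketch: the Clinton-side counts (1,333 and 52) appear in this post, but the Trump-side counts below are invented placeholders consistent with the 75.6 percent figure, so the resulting statistic won't match the reported 1399.7.

```python
# Pearson chi-square for a 2x2 preference-by-prediction table, no libraries.
# Clinton column counts are from the post; Trump column counts are hypothetical.
observed = {
    "preferred_clinton": {"predicted_clinton": 1333, "predicted_trump": 52},
    "preferred_trump":   {"predicted_clinton": 244,  "predicted_trump": 756},  # made up
}

rows = list(observed)
cols = ["predicted_clinton", "predicted_trump"]
row_totals = {r: sum(observed[r][c] for c in cols) for r in rows}
col_totals = {c: sum(observed[r][c] for r in rows) for c in cols}
n = sum(row_totals.values())

# chi-square = sum over cells of (observed - expected)^2 / expected,
# where expected = (row total * column total) / grand total
chi2 = sum(
    (observed[r][c] - row_totals[r] * col_totals[c] / n) ** 2
    / (row_totals[r] * col_totals[c] / n)
    for r in rows
    for c in cols
)
print(round(chi2, 1))
```

With any strongly lopsided table like this one, the statistic lands far beyond the p < .001 cutoff of about 10.8 for one degree of freedom.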

The findings above reflect wishful thinking, a body of research that consistently demonstrates that people tend to believe their sports team or candidate will win, even when that candidate or team is behind.

When time allows I'll look at the predictors of wishful thinking in the 2016 election. Who were the 52 Clinton supporters who predicted Trump would win? How do they differ, if at all, from the 1,333 who said Clinton would win? And vice versa for Trump. We do know that affect, as in emotion, plays a huge role. The more you care about an outcome, the more likely you are to engage in wishful thinking. Education and knowledge tend to, at least somewhat, make people more accurate. The role of the media is kinda mixed. In my analysis of 2012 data, I found that watching partisan news, such as Fox News, made you more likely to inaccurately predict Mitt Romney would win, even after controlling for lots of other factors such as caring about the outcome. It'll be fun to see if watching MSNBC, for example, has the same effect on Clinton supporters in 2016 that Fox had on Romney supporters in 2012.

Wait. I do have one quick analysis to share. For the only MSNBC program in the data, Chris Matthews' Hardball, there seems to be an effect. Among those who watched Hardball and supported Clinton, not a single one predicted Trump would win. For those who didn't watch Hardball and supported Clinton, 4.1 percent predicted Trump would win. That's not a powerful effect, but it is suggestive. There's a similar result for watching CNN's Anderson Cooper, but not quite as strong.

Again, when I have time I'll build a multivariate model and see what separates the accurate from the inaccurate.
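When I get to that multivariate model, it'll be something along the lines of a logistic regression predicting who got the winner right. Here's a toy sketch -- the predictors (caring about the outcome, partisan news use, education) mirror the ones discussed above, but the data are fabricated and the variable names are stand-ins, not real ANES columns.

```python
# Toy logistic regression, fit by plain gradient descent (no libraries).
# Outcome: 1 = accurate prediction. All data below are fabricated.
import math
import random

random.seed(0)

def make_row():
    # Predictors: caring about the outcome, partisan news use, education
    cares, partisan, edu = random.random(), random.random(), random.random()
    # Toy rule: caring and partisan news hurt accuracy, education helps
    p = 1 / (1 + math.exp(-(1.0 - 2.0 * cares - 1.5 * partisan + 2.0 * edu)))
    return [cares, partisan, edu], 1 if random.random() < p else 0

data = [make_row() for _ in range(500)]

w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        pred = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        err = pred - y
        for i in range(3):
            gw[i] += err * x[i]
        gb += err
    for i in range(3):
        w[i] -= lr * gw[i] / len(data)
    b -= lr * gb / len(data)

# Fitted signs should mirror the toy rule: negative, negative, positive
print([round(wi, 2) for wi in w])
```

The real model would swap the fabricated rows for actual survey variables, but the fitting mechanics are the same.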

A final word. The overall expectation in the numbers above was that Clinton would win -- and she did, at least in terms of the raw popular vote, which is what a poll like this reflects. Most respondents are not doing a state-by-state Electoral College analysis in their heads when answering a survey question like this. The same thing happened in 2000, when George W. Bush lost the popular vote to Al Gore but won the Electoral College. The ANES data that year had more folks expecting a Gore win, and he did win, in terms of the popular vote. A more nuanced analysis here would include the state the respondents live in and whether they were more accurate predicting their state outcome versus the national one. Yes, I have those data.

Data source: ANES 2016 survey

Thursday, July 13, 2017

Online Polls

I've been slow to get to this, other than a Twitterspat on the topic of polls. The Red & Black ran an online poll about campus carry. The results are below.

I'm a well-established critic of online polls and I don't want to repeat my disgust with SLOPs any more. I'm tired of doing it, tired of no one listening. Let's face it, I'm pooped. So it's a crap sample of a few hundred people who happened to stumble across the web page and felt the urge to click a button and vote. By the way, the results are slightly different now as the poll is still up, so feel free to vote yourself. I voted indifferent because, dammit, I always pull for the underdog.

Other than a lousy sample, I'm also not too sure about the metric here, what we in the public opinion biz call the response alternatives. In other words, the answers allowed. There are four: two of them negative, one positive, and one, well, indifferent. Does this make sense? Not as presented. Angry could mean anger about passage, or anger about the restrictions on carrying a concealed weapon (no faculty offices, no dorms, etc.). So we can't say it's a continuum from Pleased through Concerned to Angry, not in the way the results are presented. I might have labeled two answers each positive or negative, or some such. One way to interpret these results is that 57.2 percent don't like it (angry or concerned) and just over a third (37.9 percent) are okay with it. Another way is that more people chose pleased than chose either angry or concerned individually. Feel free to spin it any way you want, because the sample itself makes the results unimportant.

Given this is kinda old now, I'm not even gonna tweet a link here. I pick on the R&B enough, as is.

I Promise, Nearly Done

And now the fourth post on UGA parking ticket data. Yes, you're sick of reading about it, but I'm not sick (yet) of writing about it. In previous posts I've looked at how many tickets were written in the 2016-2017 academic year, which lots were most likely to have tickets written, and even what color car is most likely to get a ticket.

Today, what charge is most likely to occur.

On my list there are 25 possible reasons a ticket may be written, plus a 26th that is blank, with three tickets written for it -- probably a data mispunch, or perhaps some special case, or, most likely, the officer forgot to enter the charge in the database while writing the ticket. Anyway, below you'll find a rank order of the various charges. Dominating the list is "no parking permit," with 15,184 of the total 27,439 tickets written that year, or 55.3 percent if you like it broken down that way. In other words, over half of all tickets are given to cars that didn't bother to get a parking permit in the first place. Second on the list is "out of zone/region," which means, best I can tell, folks who had a permit for one lot but decided they could park in a different lot just because they're special. Some of the charges I'm unsure about, like "theft of parking services." Also, who the hell parks in a fire lane? Turns out, 37 such tickets were written. And who the hell, especially, parks in a handicapped zone? Scum, I'd say, but 127 tickets were written for "disability space" and 53 for "disability access zone."

Below the table, a few more words about some of the charges.

Ticket Charge (in rank order)
  1. No Parking Permit
  2. Out of Zone/Region
  3. Courtesy Note - Vehicle Linked
  4. Unauthorized Area
  5. Patient Parking
  6. Failure to Display
  7. Yellow Zone
  8. Football Parking
  9. No Permit Displayed
  10. Disability Space
  11. Improper Parking
  12. Theft of Parking Services
  13. DP with No Valid UGA Permit
  14. Expired Permit
  15. Beyond Time Limit
  16. Disability Access Zone
  17. No Meter Receipt Displayed
  18. Fire Lane
  19. No Overnight Parking
  20. Improperly Displayed Receipt
  21. Multiple Vehicles Parked
  22. Alter/Falsify Permit
  23. No Charge Entered
  24. Obstructing Traffic
  25. Expired Meter
Grand Total: 27,439 tickets


There were 133 "football parking" tickets written. Most of these were on the various campus decks and, no surprise, on Football Saturdays in Athens, when campus gets crowded and the port-a-lets come out to play. More seem to come from the South Campus Deck than any other lot.
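For anyone curious how a rank-order like the one above gets built, here's a minimal sketch. The ticket records below are a tiny made-up stand-in for the real database; only the counting and ranking logic is the point.

```python
# Count ticket charges and print them in rank order with a percentage share.
from collections import Counter

# Stand-in for one charge string per ticket, as pulled from the parking database
tickets = (
    ["No Parking Permit"] * 5
    + ["Out of Zone/Region"] * 3
    + ["Fire Lane"] * 1
)

counts = Counter(tickets)
total = sum(counts.values())
for rank, (charge, n) in enumerate(counts.most_common(), start=1):
    print(f"{rank}. {charge}: {n} ({n / total:.1%})")
```

Counter.most_common() returns the charges sorted from most to least frequent, which is exactly the rank order in the table.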


UGA Parking Tix -- It's all Black and White

In playing with the 2016-2017 academic year parking ticket data on the University of Georgia campus, I decided to look at the car color and number of tickets given. Because, of course, those red cars get more tickets, right? Nope. It's all black and white. Below is the ranking by color of cars given tickets last academic year.
  1. Black (6,118 tix)
  2. White (5,217 tix)
  3. Silver (4,594 tix)
  4. Blue (2,472 tix)
  5. Gold (628 tix)
I could go on and on with the rankings, but why bother. This analysis is a perfect case study in how not to interpret data. Sure, black cars get more tickets at UGA, about 23 percent of all written that year, but -- and this is important -- we do not know the proportion of black cars out there compared to other colors, so we cannot say whether black cars are treated fairly or unfairly, or whether it's just plain coincidence. White, black, grey, and silver cars make up an estimated 70 percent of all autos, according to one site I looked at, so making a big deal out of car color here would be a big mistake.
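To make the base-rate point concrete, here's a sketch that adjusts each color's share of tickets by its share of cars on the road. The ticket counts come from this post; the fleet shares are rough illustrative guesses, not real data, so don't read anything into the ratios.

```python
# Ticket share vs. fleet share by car color. A ratio near 1.0 would mean a
# color gets tickets in proportion to how common it is; without real fleet
# shares, the ratios here are illustration only.
tickets = {"black": 6118, "white": 5217, "silver": 4594, "blue": 2472, "gold": 628}
fleet_share = {"black": 0.20, "white": 0.24, "silver": 0.15,
               "blue": 0.10, "gold": 0.02}  # hypothetical guesses

total = sum(tickets.values())
for color in tickets:
    ticket_share = tickets[color] / total
    ratio = ticket_share / fleet_share[color]
    print(f"{color}: {ticket_share:.1%} of tickets, ratio vs. fleet share = {ratio:.2f}")
```

The raw ranking answers "which color got the most tickets"; the ratio answers the more interesting question, "which color got more tickets than its numbers would predict."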

In much the same way, it's silly to note that Toyota brand cars get the most tickets, followed by Honda, Ford, and Nissan. There are simply more of those on the road or on campus, so you'd expect more tickets for those vehicles. If you're curious, silver Toyotas got the most tickets of any brand/color combination (901 tickets).

Oh, and a Ferrari got one ticket last year. Good. Thirty-five tickets were written for Hummers. Even better. BMWs got 813 tickets. Best of all, because no one likes people who drive BMWs. 

In the odd category, there was a car designated as "American Motors" that got one ticket. No idea what kind of car it was, a Pacer or what. You don't see many of those around any more.

On another day I'll look at how often people park in handicapped spaces, which is one of my pet peeves. Look at my previous posts for other breakdowns, first post here, second post here.

Tuesday, July 11, 2017

University Athletic Numbers

Thanks to this AJC story for pointing me to USA Today data on the revenues and expenses of university athletic departments. I dumped the data and created my own database to play around some. My first question: how many schools have greater expenses than revenues? Let's take a look.

Proving it helps to spend money to make money, I suppose, there's a high correlation between expense and revenue. In fact, an almost perfect correlation (r = .99 for you statistical nerds out there). That's outrageously high. Anyway, below is the list of top ten in terms of revenue. In parentheses is their expenses ranking.

  1. Texas A&M (5)
  2. Texas (1)
  3. Ohio State (2)
  4. Alabama (4)
  5. Michigan (3)
  6. Oklahoma (9)
  7. LSU (12)
  8. Florida (14)
  9. Tennessee (8)
  10. Auburn (11)
Oh, by the way, Georgia is ranked 15th in revenue and 16th in expenses. 
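For the record, here's how a Pearson correlation like that r = .99 is computed. The revenue and expense figures below are invented pairs that move in near-lockstep; the real correlation comes from the USA Today data, not from these numbers.

```python
# Pearson correlation coefficient from scratch (no libraries).
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

revenue = [194.4, 188.0, 170.9, 164.0, 163.8]   # $M, hypothetical
expense = [189.0, 191.0, 167.0, 160.0, 158.0]   # $M, hypothetical

r = pearson_r(revenue, expense)
print(round(r, 2))
```

Any two series that rise and fall together this tightly will push r toward 1.0, which is exactly what the spend-money-to-make-money pattern looks like.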

As you can see above, the big spenders are also the big moneymakers, which is hardly surprising. Wisconsin and Penn State probably underperform, ranking 7th and 8th in terms of expenses but only 11th and 12th, respectively, in terms of revenues. But still, a million dollars here, a million dollars there, what's the difference?

Texas A&M ranks #1 too on making more than it spends (revenue - expense), followed by Oklahoma, Florida, Arkansas, and West Virginia. Schools doing it backwards, more expenses than revenues? California had $21.7 million more in expenses than revenue, far more than #2 Washington State at $12.9 million (both PAC-12 schools, kinda interesting).

If you look at the original data on USA Today you'll see a "total allocated" column, which includes student fees and other monies transferred into the athletics program. You tend to see the highest numbers among mid-majors. James Madison has the most ($38.1 million), followed by Connecticut ($35.3 million). 

Finally, how do the conferences stack up? About as you'd expect. 

Revenue rank (expense rank in parentheses)
  1. SEC (1)
  2. Big Ten (2)
  3. Pac-12 (3)
  4. Big 12 (4)
  5. ACC (5)
In other words, spend money, make money. Now the results above are based on the sum of all teams in a conference. If we change that to a conference's average, the results are similar: the SEC and Big Ten lead, with some shuffling below but the same conferences. In other words, how we do the math doesn't matter all that much.
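The sum-versus-average comparison can be sketched like this. All figures and conference rosters below are invented stand-ins; note how unequal conference sizes can make the two rankings disagree, even though in the real data they came out about the same.

```python
# Rank conferences by total revenue and by per-school average revenue.
revenues = {  # $M per school, all hypothetical
    "SEC":     [194, 164, 150, 141],
    "Big Ten": [188, 171, 160],
    "Big 12":  [170, 135, 110],
}

by_total = sorted(revenues, key=lambda c: sum(revenues[c]), reverse=True)
by_mean  = sorted(revenues, key=lambda c: sum(revenues[c]) / len(revenues[c]),
                  reverse=True)

print(by_total)  # summing favors the conference with more schools
print(by_mean)   # averaging favors the richer-per-school conference
```

With these made-up numbers the SEC wins on total while the Big Ten wins on average, which is why it's worth checking both before declaring the math doesn't matter.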

Comparing Years for UGA Parking Tix

I wrote yesterday about parking ticket data. Let's compare the 2015-16 academic year with the 2016-17 academic year on the lots most likely to see parking tickets. Why? Because it's my blog. I can do what I want.

Rank   2015-16                2016-17
1      Legion Pool            Legion Pool
2      Ramsey Center          Ramsey Center
3      E. River Road          Carlton St.
4      Ramsey Lower           West Campus Deck
5      North River Road       E. River Road
6      Driftmier Eng          Railroad Lot
7      Academic Achievement   Health Sciences
8      Carlton St.            W. Coliseum
9      Life Sciences Upper    Kappa Alpha
10     Health Sciences        North Hull

As you can see, the top two lots remain the same, but after that we get into something of a random shuffle that's probably not random at all but more due to construction near those lots or other factors I'm not aware of. Take West Campus Deck, for example. In the 2015-16 data it's way down the list, ranked 100th. Why, I don't know, but probably some student can tell me what caused this. The Railroad Lot is also high in 16-17 but it's 11th in 15-16, just off our Top Ten list, so no biggie. West Coliseum is high in 16-17 but was 24th in the previous year, perhaps due to construction around the nearby new science learning center.
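One quick way to quantify that shuffle is to compare the two top-ten lists as sets: which lots stayed, entered, or dropped out. The lot names below are transcribed from the lists above, assuming the paired entries alternate 2015-16 then 2016-17.

```python
# Set comparison of the two years' top-ten ticket lots.
top_2015_16 = ["Legion Pool", "Ramsey Center", "E. River Road", "Ramsey Lower",
               "North River Road", "Driftmier Eng", "Academic Achievement",
               "Carlton St.", "Life Sciences Upper", "Health Sciences"]
top_2016_17 = ["Legion Pool", "Ramsey Center", "Carlton St.", "West Campus Deck",
               "E. River Road", "Railroad Lot", "Health Sciences",
               "W. Coliseum", "Kappa Alpha", "North Hull"]

stayed  = set(top_2015_16) & set(top_2016_17)
entered = set(top_2016_17) - set(top_2015_16)
left    = set(top_2015_16) - set(top_2016_17)

print(sorted(stayed))   # lots in the top ten both years
print(sorted(entered))  # new to the top ten in 2016-17
print(sorted(left))     # dropped out after 2015-16
```

Half the top ten carrying over, half churning, matches the "consistent but shuffled" read above.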

So all in all, the lot lists are relatively consistent. They're the most popular and larger lots, hence they attract more illegal parking, and thus more tickets. For fun I dug back into my 2010-11 academic year data and, yes, the lots look more or less the same. Keep in mind some lots didn't exist back then or their names have changed or they're completely gone due to construction. Here's the Top Ten from 2010-2011:
  1. Ramsey Center
  2. Ramsey Lower
  3. Chi Psi House
  4. Carlton St.
  5. North River Road
  6. Rutherford Hall
  7. Railroad/Training
  8. Psychology Clinic
  9. Baxter Lumpkin
  10. E. River Road
Ramsey is always popular, and therefore draws students (and sometimes faculty and staff) with creative parking skills. The Psychology Clinic gets a lot of tickets for relatively few parking spaces, most of which are never used by patients but operate as a nice mousetrap for student parkers.