Tuesday, December 29, 2015

Starting from Scratch

UGA's journalism department begins its new curriculum this spring semester, and with it I have to create, from scratch, a large-lecture class we call Information Gathering. Basically it's a class in fact finding, not unlike the one taught forever at UF, but updated now that our former broadcast and journalism students are all in one major with common classes.

The mission of the class is to encourage in students, to steal a phrase, a "documents state of mind."

Enough of the curriculum change. About this class -- which was my idea, so I shouldn't bitch -- starting from an outline and a rough idea in my head is tougher than I thought. Not the week-to-week topics. That's easy. I'm in the office today pulling together stuff at the daily level, creating slides and links and the like for some of the first few weeks. That's more difficult, in part because there is so much stuff out there. So very much.

For example, I'm frontloading the class with ethics and law. I firmly believe that should come first, not last, to journalism students. Making them take mass comm law at the end of their course of study treats it as an afterthought. I focus on the tension between law and ethics (what we legally can do versus what we ethically should do). In the first couple of weeks, for example, they'll get intimate with the Georgia open records/meetings law and they'll hook up with two or three major codes of ethics. Most of the law stuff is access. They'll take a mass comm law class in their second semester, so I'll go easy on libel, etc.

Here are some of my topics:

  • Ethical decision making
  • Four methods of gathering news: observation, interview, documents and data
  • Accessing public meetings, records, and events (a lot on this)
  • Verification and fact checking
  • Search techniques for online sources and search engines
  • Primary vs. secondary sources
  • Analyzing reports, such as budgets and audits
  • Interpreting police incident and arrest reports
  • Making sense of courts
  • Live news events
  • Social media
  • Understanding math and statistics in a news setting
  • Accessing and analyzing data for stories
  • Strengths and weaknesses of eyewitness accounts
  • Local government
  • Non-profits
  • and so on, and so on, and so on.

I have about 120 students in the class, so real-world exercises will be a challenge, though I have some ideas on that.

And this is an abbreviated version of my list. Sheesh.

It'd also help if UGA had the air running in my building this week, but we're closed. Fricking hot in here.

Monday, December 28, 2015

End of 2015

As 2015 ends, it's time for the blog's year in review -- traffic, most popular posts, and so on. Below is the all-time traffic, really since my initial post of May 2, 2007. So far, there have been 1,656 posts.

As you can see above, the most popular posts are from a few years ago, and 2015 was a humdrum year as traffic goes. Then again, blogs are soooo 2005, so that's hardly surprising. The old, wildly popular posts come from the infamous Red & Black walkout; three of my top five posts are about it, as I supported the students against the board.

Some of my most popular 2015 posts included a judge's idiotic gag order in a local murder trial, an early peek at UGA pay raises, me bitching about UGA's new grading system, and of course the death of beloved JOUR3410.

Turning back to all-time stats, Google continues to dominate in terms of traffic sources, followed by Twitter and Facebook. The top search term that lands people here is "cognitive mobilization." Why? Because I did a couple of posts about it, published one study in the paradigm a million years ago, and it remains a popular research topic in Europe. Which of course raises the question of countries. Most traffic, unsurprisingly, comes from the U.S., followed by Russia, Germany, and Ukraine.

Tuesday, December 22, 2015

How Ya Ask It

It's the holidays. We all have better things to do. Still, I point to this very interesting piece that demonstrates it's all about what questions you ask when tapping political knowledge of whites versus ethnic/racial groups. Take a few minutes, check it out.

And Merry Christmas.

Thursday, December 17, 2015

Best College Town?

Everyone loves a list, and this one caught my eye today -- yet another list about the best colleges and, in this case, the best "college town."

At first glance the list seems routine. Here are the first five best college towns according to something called WalletHub:
  1. Ann Arbor, MI
  2. College Station, TX
  3. Iowa City, IA
  4. Provo, UT
  5. Gainesville, FL
Now we can all quibble with this top five. College Station, TX? Really? And Provo? What if you need coffee for studying?

But here's the kicker, at least for me. Atlanta, Georgia, is #7.

Atlanta is a lot of things, but one of them is not a college town. Just above Atlanta is Pittsburgh, a good enough city but not a college town, despite having some damn good colleges there.

To be a college town, the university must dominate the town. It must be the reason for the town's existence, or at least its main business. When you think company town and that company is a college, then you have a college town.

Atlanta? Not even close.

Athens, by the way, the college town, finishes #16 (#9 among "small" towns). It's a top 10 college town, maybe top 5. Maybe #1 (though Madison and Gainesville and Chapel Hill and a few places all have strong arguments in their favor).

So how the hell does something like WalletHub come up with its list? Good question. It's time, children, to speak of methodology. Here's its methodology page, if you're so inclined and dweebish enough to dip into their, um, method. They use 23 metrics. Some are interesting, such as the cost of pizza and burgers, but a lot of them have little validity for measuring what they think they're measuring: the "best college town." Crime rate, for instance, may speak to "best," but it says nothing about whether a place is a college town.

Thursday, December 10, 2015

Thinking Out Loud (or in pixels)

Between grading for this semester and planning a new class for the next, I'm also mulling over my next research project. Here's me thinking out loud, or at least via blog pixels.

Here's what I'm fiddling with at the moment. I have national survey data on viewing of various television programs, 48 of them actually. So far I've categorized them as news (nine are on Fox News alone, but also major network programs) and entertainment.

Here's my hypothesis: we know the news audience has fragmented along partisan and ideological lines. Is this reflected in entertainment programming?

So what entertainment programs do I have? Stuff like Big Bang Theory, Insider, NCIS, American Idol (it's 2012 data), Dancing with the Stars, etc. Also a bunch of talk programs like The Talk and The View. Also late-night stuff, but that's a special case. As I said, I have 48 specific television programs in all. Before actually analyzing the data, I wonder if it's even an interesting hypothesis that there may, or may not be, partisan migration. I don't think there is. I doubt there's a study here, at least not in these specific programs.
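If you're curious what that analysis might look like, here's a bare-bones sketch of the kind of test I'd run on each program: a chi-square test of whether party ID predicts viewing. All the counts below are invented for illustration; the real data are the 2012 national survey mentioned above.

```python
# Pure-Python chi-square test of independence between party ID and
# viewing of one (hypothetical) entertainment program. Counts invented.

def chi_square(table):
    """Chi-square statistic for a 2-D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: Democrat, Independent, Republican; columns: watches, doesn't watch
big_bang = [[120, 380], [100, 400], [115, 385]]
print(round(chi_square(big_bang), 2))
# With 2 degrees of freedom, anything under the 5.99 critical value
# (p = .05) would support my hunch that entertainment viewing hasn't
# fragmented along partisan lines.
```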

(as an aside, I'm tempted to analyze the nine Fox News programs to tease out differences in the audience)

Sigh. I'd prefer to stick with my main research topic, the surprised electoral loser, but I've pretty much exhausted all my data on that question.

Friday, December 4, 2015

That Magical 30 Percent

There's been a long-running survey on the UGA campus. You can read the most recent story here. Here's my interest, in the next graf pulled from the story:
The survey ran from Oct. 20 to Nov. 20 with hopes of collecting anonymous response data from 30 percent of the university population, including students, faculty and staff. And while just 48 hours from the survey’s conclusion the survey was 1,000 respondents short of its goal, Michelle Cook, associate provost for Institutional Diversity, said she feels that the goal was met.

OK, let's unpack that graf a bit. Why 30 percent as a goal? For what reason? I know a thing or two about survey research and public opinion and for the life of me I can't find anything significant about 30 percent versus 25 percent, or 50 percent. Now 100 percent, that's a census.
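For the record, here's why 30 percent is so arbitrary. For a true random sample, precision depends mainly on the number of respondents, not the share of the population you reach. A quick sketch (the campus population figure is my rough guess, and none of this applies to a self-selected sample anyway):

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """95% margin of error for a simple random sample,
    with the finite population correction applied."""
    se = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return z * se * fpc

uga = 40_000  # ballpark guess at UGA students + faculty + staff
for rate in (0.05, 0.30, 0.50):
    n = int(uga * rate)
    print(f"{rate:.0%} of campus: ±{margin_of_error(n, uga):.2%}")
```

Even a 5 percent random sample gets you within a couple of points. Pushing to 30 percent buys almost nothing statistically, which is the point: the threshold is a feel-good number, not a methodological one.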

Then again, it's a non-random sample, so the results are questionable anyway. Plus, if you know how, you can fill it out again and again. I know. I did. Three times. Data duh.

Finally, I love how we have this goal, and we're going to fall short, but someone "feels that the goal was met." To be fair, that's not just fluffy PRspeak. She suspects there are outstanding surveys still to come in that will reach the magical, inexplicable goal of 30 percent.

WARNING UGA -- A 30 percent SLOP (self-selected poll) is meaningless. You cannot argue the sample is representative and therefore generalizable to the campus population you're trying to describe. Which is the purpose of the survey. You might try statistical weighting of the results, but I'm not sure you want to go there. Doable, but tricky. You don't know how many repeat surveys you have, but given the nature of the questions I'm fairly certain you have a number of them as folks insert their favorite social and political agendas into the data.
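For the curious, the weighting I mention is simple enough in principle: post-stratification, where each group's responses are scaled by how over- or underrepresented the group is. A toy sketch with invented shares -- and remember, no amount of weighting fixes self-selection:

```python
# Post-stratification weighting sketch. All shares are invented;
# real weighting would use UGA's actual population breakdown.

population_share = {"student": 0.85, "faculty": 0.07, "staff": 0.08}
sample_share     = {"student": 0.55, "faculty": 0.20, "staff": 0.25}

weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
# Overrepresented groups get weights below 1, underrepresented above 1.
```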

When the survey is "released" I may have to request the raw data just to check on their "interpretations" of the analyses. No doubt that'll piss off some suit somewhere up the hill.

That's my job.

Does this make me the anti-diversity asshat? No, and I'm not. What I dislike, though, are data-based decisions based on bad, skewed, questionable data just to fit someone's preconceived notions of policy changes they want to implement. It's called political cover. And it's bullshit. Just woman up and say these are the changes we want to make, we feel are in the university's best interest. Hell, it's how they do everything else at UGA, a place where they've even centralized when classes can be taught -- day and time -- and in what room.

I'm not the asshat here.

Wednesday, December 2, 2015

R.I.P. JOUR3410
Since 1991 I've taught some version of UGA's introductory newswriting class (once called jour341, then jour3410 as we decided an extra digit was cool). That's 24 or so years teaching the class, especially the mass lecture and, usually, at least one of the writing labs.

No more.

R.I.P. Jour3410.

Moments ago I gave the last exam of the last class of 3410. Why? Because we've rewritten our curriculum -- in a good way -- and we even killed all the old numbers so everything starts fresh in Spring 2016. We have a few students in the old curriculum and we'll finish them up in the old classes, but for the most part a new cohort starts in January.

But (you sputter) what about an intro to newswriting class? It still exists, but it's called Writing Across Platforms (TV, online, print, social media, etc.). Instead of lecture-lab format, students will work with a faculty member in smaller groups of 20 -- what we in the biz sometimes call "intact groups." Much better, in my mind, and the entire curriculum follows this "across platforms" mentality.

There's still a big lecture but that's a completely different class called Information Gathering, a fact-finding lecture that will focus on public meetings, public records, finding stuff out, verifying info such as photos and social media rumors, and a whole lot more. And yeah, I'm the first person teaching it this Spring. I think I've got it worked out. I'll write more about that class over break.

Tuesday, December 1, 2015

Science vs The Press

Science and journalism share certain traits. They are both disciplines of verification, for example, and they both often tell us uncomfortable truths about ourselves. They also often find themselves criticized by partisans, especially (but not exclusively) from the political right.

While messing with other General Social Survey cumulative data, I decided to briefly see how the press and science have fared over time in terms of, for lack of a better term, "consumer confidence." My hunch, based on criticism, was that both the press and science, broadly defined, would suffer more or less the same in terms of confidence. I was wrong.

The question is simple:
I am going to name some institutions in this country. As far as the people running these institutions are concerned, would you say you have a great deal of confidence, only some confidence, or hardly any confidence at all in them?
My graphic is the percentage of folks from 1973 to 2014 who answered "hardly any" confidence. As you can see, the press suffers serious erosion over time, while science suffers little erosion and the percentage of those with "hardly any" confidence remains in single digits. For the press it nears the halfway mark (44.8 percent in 2014). Scary.
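For anyone who wants to replicate this, the calculation itself is trivial. A toy version using invented responses (in the real GSS cumulative file the variables are, if memory serves, CONPRESS and CONSCI, coded 1 = a great deal, 2 = only some, 3 = hardly any):

```python
# Sketch of the trend calculation, with a toy stand-in for the GSS
# cumulative file. Responses below are invented.

def pct_hardly_any(responses):
    """Percent of valid responses coded 3 ('hardly any')."""
    valid = [r for r in responses if r in (1, 2, 3)]
    return 100 * valid.count(3) / len(valid)

# Invented mini-samples for two years of the press item
press_1973 = [1, 2, 2, 1, 3, 2, 1, 2, 2, 1]
press_2014 = [3, 2, 3, 2, 3, 3, 2, 1, 3, 2]

print(f"1973: {pct_hardly_any(press_1973):.1f}% hardly any")
print(f"2014: {pct_hardly_any(press_2014):.1f}% hardly any")
```

Run that by year for each variable and you get the two trend lines in the graphic: flat and in single digits for science, climbing toward half for the press.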