Monday, March 31, 2014

Research at Grady?

With my usual journalistic curiosity, I was going through the UGA press releases and decided to break them down by college. What does UGA send out in terms of research productivity?

To do this, I went here and, in the right column, clicked on various colleges and worked my way back, seeing which releases appeared to be about research. I basically went back about three pages for each college, to about the beginning of the Fall 2013 semester. Here's how many releases were about research, as in studies released or published (not speakers, etc.):
  • Grady -- 1
  • Terry College of Business -- 5
  • Education -- 1
  • Family and Consumer Sciences -- 4
  • Ecology -- 9
and so on. I'm gonna stop here. What's this mean? Keep in mind the above is the first two or three pages of the most recent announcements, so it isn't affected by a college or school merely pumping out more stuff. In about three pages, regardless of school, you hit the beginning of Fall.

I admit this is hardly a systematic analysis, but it suggests that here at Grady, where we do a lot of research, it's not getting much attention. Or at least UGA doesn't see it as sexy enough to earn press attention. The folks at the PR mothership are no doubt right that research shouldn't be based on what will get you time on CNN or in The New York Times, though let's face it, that never hurts.

All in all, I'm not sure what it means, other than some schools perhaps push out the research more, promote it more, or research the kinds of things that UGA feels it's better able to push onto the public and the press. Not laying blame here, other than perhaps we at Grady need to do a better job at doing quality research that does, to some degree, resonate with the public. Or at least with the flacks up the hill.

(yes, I said flack, and I'm happily a hack)

Sunday, March 30, 2014

Of Polls and Punditry

So there's this column in my local daily paper by Matt Towery, a Georgia pundit and pollster, a good enough guy -- so I'm told -- who writes the following in his warning to "most people" who may not fully understand what politicos and especially pollsters are up to.
And consider the fact that most of these D.C. bandits continue to carry out polls with 50 or 60 questions attached. In this day and time, almost no one would take the time to answer a 60-question survey. And these pollsters now claim that a good percentage of their surveys are answered by folks with cellphones. That’s a farce. When have you noticed a cellphone user intently listening to their phone and carefully answering 60 or so questions? It just doesn’t happen.
I added the bold face above because, frankly, it's bullshit. It's not unlike the argument when you hear a poll result that disagrees with your own position and you mutter: "Well, they never called me." How the hell does he know what people are talking about on their phones -- unless he's NSA? And why is he staring at people on their phones? And more to the point, how the hell does he know "it just doesn't happen"? Sorry dude, it happens all the time in serious, professional shops.

Towery's shop tends to use robo-polls, those computer-dialed, automatic phone calls that are quite popular because they're fast, they're cheap, and most importantly they're fast and they're cheap. That's okay as long as you label the method and note its weaknesses, in this case, skewing older in part because robo-calls cannot, usually, call cell phones. Talk about leaving out a chunk of humanity, especially young voters. I don't hate robo-polls, and with some sophisticated weighting of the data you can offset some of their drawbacks. That's if you know math and you know stats and even then, statistical weighting is part art and part science.
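To give a sense of how that statistical weighting works, here's a toy post-stratification sketch. Every number in it is invented for illustration; real weighting schemes use many more variables than age.

```python
# Toy post-stratification: reweight a robo-poll sample that skews old.
# All shares below are made up for illustration.

# Share of each age group in the sample vs. the target population
sample_share = {"18-34": 0.10, "35-54": 0.30, "55+": 0.60}
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Each respondent's weight = population share / sample share for their group
weights = {g: population_share[g] / sample_share[g] for g in sample_share}

# Suppose candidate support by age group in the raw sample looks like this:
support = {"18-34": 0.40, "35-54": 0.50, "55+": 0.65}

raw_estimate = sum(sample_share[g] * support[g] for g in support)
weighted_estimate = sum(population_share[g] * support[g] for g in support)

print(round(raw_estimate, 3))       # tilted toward the 55+ answer
print(round(weighted_estimate, 3))  # closer to the population's actual mix
```

The catch, as noted above, is that weighting a group you barely sampled (say, a handful of young respondents standing in for a third of the electorate) inflates the error on exactly the people you know least about.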

But look again at the boldfaced bullshit above. Given the pro shops that do cellphone surveys, such as the Pew Research Center, and given such polls tend to be more accurate than traditional landline live interviews and especially those done by robo-polling, I can only suspect there's just a little bit of professional jealousy going on here. My local paper should know better.

Again, robo-polls are fine. In many ways they're more nimble because you can put them in the field faster, get answers faster. But they are seriously flawed in terms of their sample, enough so that they traditionally underperform compared to talking to live people and, especially, including cell phones in the sample.





Friday, March 28, 2014

What People Know ... in Japan

Political knowledge is apparently rarely studied in Japan, or so says this abstract of a new paper you may or may not be able to access depending on whether you're on a college campus or not. That's okay. I'm here for you.

The authors find that "in line with previous studies in the US context, that knowledge is explained by education, gender, and politically impinged employment as base factors, with interest, efficacy, and civic duty playing a role as second-stage behavioral factors." It's good to see basic results confirmed across nations and cultures. The media stuff, though, is interesting. Soft news depresses knowledge, in this analysis, while print and even traditional television news improve political knowledge. As the authors note:
At the same time, our findings lend credence to previous work that raises concerns about the ‘infotainization’ of Japanese (and US) news programming (e.g., Taniguchi, 2007; Prior, 2005). Rather than demystifying or democratizing Japanese politics, softer programs may simply be perpetuating extant gaps between elites and the public.
In other words, soft news may maintain or even increase the knowledge gap seen in the public -- even in Japan.

Thursday, March 27, 2014

India vs UK (in WWI?)

Remember World War I? Of course you don't, but you've read about it, seen movies about it (though not as many as the more popular World War II). I'm happy to report that Americans aren't the only history-challenged folks out there. According to this story, 1 in 4 Indians believe India was fighting the UK in The Great War (WWI). Or, as the story notes:
In reality, India was actually part of Britain when the war started in 1914 (2014 being the centenary of the great war).
Oops.

Georgia Senate Race

A new poll is out on the GOP primary battle among a gaggle of Republican hopefuls. This new poll, like a previous one, has David Perdue leading the pack.

Okay, that's the news lede. Let's look at the story itself, specifically how it reports the methodological details.

Now there are certain things you should always report in a poll story: the margin of error, how many surveyed, and how they were surveyed. It's that last one that gets messy. Telephone survey? Fine. But what kind of telephone survey? A robo-poll (those annoying push-a-button-to-answer things)? Were cell phones included in the sample? If not, why the hell not? Purists would also want other details included -- and I'm among them -- but let's stick to the basics here and look at the story. Here's the lede:
Voters will soon be heading to the polls to decide some major races, including who will replace retiring U.S. Sen. Saxby Chambliss.
Really, that's your lede?

Really, that isn't your lede, but this online version is written like you'd do it on local TV. The above sentence is fine as a lead-in for TV news (okay, it still kinda sucks, but we're talking local TV here), but it is inappropriate for a print version of the story, which is what we have.

The next two grafs are quotes from voters. Huh? They come out of nowhere. I'd fail this story if a student wrote it in my class. Well, it's clean otherwise. Maybe a "C."

Okay, but enough bitching about the writing. Let's get to the poll details.
  • 600 likely Republican voters
  • margin of error, 4 percent
  • um, and that's about it
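For what it's worth, the 4 percent figure is at least consistent with a simple random sample of 600. Here's the standard worst-case calculation (assuming p = 0.5, which maximizes the margin):

```python
import math

n = 600     # reported sample size
p = 0.5     # worst-case proportion: maximizes the margin of error
z = 1.96    # z-score for a 95 percent confidence level

moe = z * math.sqrt(p * (1 - p) / n)
print(round(moe * 100, 1))  # ≈ 4.0 percentage points
```

So the arithmetic checks out. What the story leaves out is everything else.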
What kind of poll? Robo-calls? If so, those do not include cell phones. That skews your sample older and more conservative and, frankly, makes it less useful. But I can't tell if it's a robo-poll. Those quotes don't help. They could have been at the end of a robo-poll where folks could say what they thought. Or it could be a traditional telephone poll and they just typed what the respondent said.

We don't know. And that's the problem.



Thursday, March 20, 2014

Mixing Modes

So there's this survey of Georgians (where I live) about candidates running for various offices. The lede is fairly straightforward, that David Perdue is ahead in the race for the GOP nomination for U.S. Senate.

I entitled this post Mixing Modes for a reason. At the bottom of the survey is this bit of methodological detail:
Cell-phone respondents and home-phone respondents included in this research: SurveyUSA interviewed 2,300 state of GA adults 03/16/14 through 03/18/14. Of the adults, 1,985 were registered to vote. Of the registered, 508 were determined by SurveyUSA to be likely to vote in the 05/20/14 Republican Primary, 443 were determined by SurveyUSA to be likely to vote in the 05/20/14 Democratic Primary. This research was conducted using blended sample, mixed mode. Respondents reachable on a home telephone (78% of registered voters) were interviewed on their home telephone in the recorded voice of a professional announcer. Respondents not reachable on a home telephone (22% of registered voters) were shown a questionnaire on their smartphone, tablet or other electronic device. 
I know, kinda long. I boldfaced the important part. When we talk about mixed modes we mean exactly that -- some people here got a phone survey, others got a questionnaire delivered to their mobile device. That's an interesting way to (cheaply) include non-landline phones in an automated poll, as you can't use a robo-poll on a cell phone, at least not legally.

But it's also two very different kinds of surveys. In one, a robot voice reads off questions and you push a number to answer. In the other, a text questionnaire appears. People respond in different ways, based on a survey mode. But let's stop with that and get to more interesting stuff for you political junkies.

If you go to the bottom of the linked page you'll see lots of breakdowns of the results by various socio-demographic and political factors. Useful stuff, especially here if you slide across to the column labeled Cell Phone/ Lan (as in landline). We don't see a lot of differences. There's a hint, though, of more likely Republican voters in the landline sample (79 percent) and more likely Democratic voters in the cell phone sample (73 percent). We'd expect that, and to be honest that difference may very well be within the margin of error.

Look at Paul Broun, my kooky congressman and U.S. Senate candidate. He gets 13 percent of the landline vote, 4 percent of the cell phone vote. Whenever you see a poll that does not include cell phones, beware. More conservative, older, more Republican respondents will be found on those robo-poll landline surveys.
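That Broun gap is big enough to matter even with rough numbers. A back-of-the-envelope check, assuming the 508 likely GOP voters split roughly 78/22 between landline and cell the way the registered-voter pool did (an assumption on my part; the subsample sizes aren't reported):

```python
import math

# Assumed subsample sizes: 508 likely GOP voters split ~78/22,
# mirroring the registered-voter landline/cell split. Not reported values.
n_land, n_cell = 396, 112
p_land, p_cell = 0.13, 0.04   # Broun's share in each mode

# Standard error of the difference between two independent proportions
se_diff = math.sqrt(p_land * (1 - p_land) / n_land +
                    p_cell * (1 - p_cell) / n_cell)
moe_diff = 1.96 * se_diff

print(round(moe_diff, 3))           # ~0.049
print(p_land - p_cell > moe_diff)   # the 9-point gap exceeds the margin
```

Under those assumed subsample sizes, the landline-versus-cell difference for Broun is outside the margin of error, which is exactly the point: mode shapes who answers.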

The lesson here? Methodology matters, and the devil is not only always in the details, he often uses them to his advantage. So know them yourself.

Wednesday, March 19, 2014

Nate Silver's 538 Manifesto

Nate Silver's new-and-improved FiveThirtyEight has gotten a lot of attention, and rightfully so, as he takes personal brand journalism to the limit. As Poynter pointed out, his manifesto is longer than some, including that of his ESPN companion Bill Simmons.

Here's a butt-ugly word cloud summary of his manifesto. Click on this for a better look.

Wordle: Nate Silver's Manifesto on 538 
As you can tell, data and journalism get the attention, followed by news. The rest is just noise. Well, he's fascinated with Romney (8 mentions).

Really, read the manifesto yourself. It's worth the time, even at 3,551 words.

And he uses "journalism" 32 times, "data" (in one form or another) 49 times. Just a little context for the graphic above.
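If you want to replicate that kind of tally yourself, a few lines of Python do it. The snippet below runs on a stand-in bit of text, not the actual 3,551-word manifesto:

```python
import re
from collections import Counter

# A stand-in snippet for illustration; swap in the real manifesto text.
text = """FiveThirtyEight is a data journalism organization.
Data journalism means collecting data, analyzing data,
and presenting data as journalism."""

# Lowercase, strip punctuation, count word frequencies
words = re.findall(r"[a-z]+", text.lower())
counts = Counter(words)

print(counts["data"])         # 5
print(counts["journalism"])   # 3
```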

Friday, March 14, 2014

Academic Journal "Impact Factor"

The latest newsletter of AEJMC -- the mothership of academic organizations for those in journalism and masscomm -- includes an article (page 8) entitled:
Journal Impact Factors and Communication Studies: A Report from the National Communication Association
You know it's important -- there's a colon in the title. So what's it mean?

Hard to say. This fuzzy article suggests, stunningly, that we should take care in the growing use of impact factors in evaluating academic journals. You can follow the link to read more about impact factors, but basically it's how often a journal is cited. That's the gold standard of research, by the way -- how often other scholars cite your work. The impact factor just cranks the data and feeds back to you a number, one ripe with measurement error and of course one that fails to account for the quality of a particular journal or study published in that journal.
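For the curious, the classic two-year impact factor is simple arithmetic: citations this year to the journal's articles from the previous two years, divided by the number of citable articles in those two years. A toy example with invented numbers:

```python
# Toy two-year impact factor. All numbers invented for illustration.
# IF(2013) = citations in 2013 to articles published in 2011-2012,
#            divided by the citable articles published in 2011-2012.
citations_2013_to_2011_2012 = 150
articles_2011_2012 = 60

impact_factor = citations_2013_to_2011_2012 / articles_2011_2012
print(round(impact_factor, 2))  # 2.5
```

Note what the formula can't see: whether those 150 citations are praise, rebuttal, or self-citation, and whether the 60 articles are any good.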

(or, basically, you complain about the number if it doesn't score your work, or your journal, high enough)

The article itself, linked to above, suggests we need "extensive educational and outreach initiatives to educate members, administrators, and other interested parties about the nature and quality of the journal impact factor as a measure of journal quality, research quality, or research influence." Wow. That's a mouthful to say more or less nothing. Another recommendation warns we need to "guard against the misuse of journal impact factors and make public examples of such misuse when it occurs" (emphasis mine). I like this one, especially the part about making public examples of people. Academic stocks, maybe? Stoning? I'm not sure.

Okay, enough snark in what is meant to be a well-meaning, if academically overwritten, point: impact factor scores are a bit misleading, and the danger, especially in promotion and tenure, is that we may lean on them more heavily than they deserve. That's the lede. That said, while I'm a journalism guy, I'm also a numbers guy, and I'm not terribly upset by the way this score is measured or its "impact" on decision making.

(er, unless you're talking about my journals or my work ...)

Now, all this said, I definitely check on how often my work gets cited. For example, a study I published in 1995 has been cited 95 times, according to Google Scholar. That's not half bad and, based on a quick scanning of colleagues in my college, better than most.

Thursday, March 13, 2014

Googling the SEC

Just for fun, I started doing a Google search on SEC schools to see what word pops up after the school name. So I teach at the University of Georgia. If I Google "University of Georgia" the next thing to pop up is:

football

Sigh. You kinda expected that, right? Or if not, you're hardly surprised. But how about some of the other SEC schools? Let's take a look, in no particular order:
  • University of Florida gators. Only one where nickname appears first.
  • University of Tennessee _____. Oddly Google offers nothing next. Just type Tennessee and it's Titans, the NFL team.
  • University of Alabama football
  • University of South Carolina ___. Nuthin. Interesting. Just South Carolina and it's earthquake, then football.
  • University of Kentucky basketball. Makes sense.
  • University of Arkansas ____. Again, nuthin. Just Arkansas, though, and it's Razorbacks.
  • University of Mississippi medical center. Friggin weird.
  • Auburn University ___. Now, if you just do Auburn, it's football.
  • Mississippi State University ____. Like Auburn above, leave off university and football appears.
  • Louisiana State University gets ___. LSU gets, of course, football.
  • University of Missouri gets, duh, -Columbia. In case you can't find it on a map, I suppose. No GPS in Missouri?
  • Texas A&M gets, yes, football. Surprised it's not Johnny Football.
  • Vanderbilt University gets ___, but just Vanderbilt gets football.
Now Google senses where you are and builds that into its search, so maybe because I'm in the South it grasps the near-religious nature of college football and feeds that back to me. Maybe. It really bases this on what other people have searched for, trying to fill in the blanks for you, so sports, especially football, clearly dominate. Of all the schools, only UF gets its nickname. I'm not sure what's up with the medical center thing in Mississippi unless it's amazing to find a hospital there.

How about outside the football-crazed SEC? Let's try a few random names:
  • Notre Dame football
  • University of Wisconsin Madison (for the geographically challenged). Just Wisconsin gets you basketball.
  • THE Ohio State University gets you, like Mississippi above, a medical center. OSU gets you football.
  • Rutgers gets football. No idea why.
  • University of Texas football. Kinda funny, given their recent season.
and so on. As you can see, people who search for universities tend to focus on the dominant sports teams.

Again, sigh.

Monday, March 10, 2014

UGA Bracketology

It's March, and that means brackets. Here, I've taken all the schools and colleges at UGA and tossed them into a bracket to see who wins. Below are the first round match-ups:
  • Social Work vs Pharmacy: Yes, the pharm folks have the drugs, mostly because they work for big companies while the social work folks work for next to nothing and for the better good. As crazy as the social worker types can be, it's hard to pull against them.
  • Engineering vs Law: A match-up of heavyweights. No one likes lawyers, kinda like no one likes Duke, so I'm betting on the engineers even if you'd get a better program at a school just down the road a bit.
  • Education vs Vet Med. Teachers vs puppies and kittens? No contest.
  •  Ag and Environmental Sciences vs Business. Another school (biz) that (like Duke) no one likes. I'm betting the crowd gets behind the CAES folks and they pull the upset. Plus they do actual research.
  • Environment & Design vs Public Health. It's hard to pick here, the folks who help make us better versus the folks who make nice things to look at.
  • Journalism & Mass Comm vs Law. No one ever pulls for the lawyers, but then again no one pulls for journalists either. Those with the money win, and that's not journalists.
  • Ecology vs Forestry. Talk about dream match-ups, the tree huggers and the folks with saws face off in a grudge match with druids everywhere eager for one outcome versus the other.
  • Arts & Sciences vs SPIA. Franklin A&S is so big and so fat and so comfortable that I smell an upset by the upstart SPIAnistas, who are hungry.
So, how'd it all turn out? Go here and see my final bracket and the ultimate winner.

Saturday, March 8, 2014

Grady at Wall Street

Hijacking my blog for a Grady PSA:
Four journalism students are participating in a fellowship of the Society of American Business Editors and Writers organized and sponsored by our Cox Institute.  Nicholas Fouriezos, Kathleen LaPorte, Cailin O'Brien and Maria Torres will spend their Spring Break on Wall Street visiting and training at The Wall Street Journal, Bloomberg News, Yahoo Finance, the New York Stock Exchange and Oppenheimer.  They were selected based on their performance in our advanced reporting course, which focused on business journalism in Fall and Spring 2013.  Keith Herndon, who taught the courses following his Reynolds Fellowship in Business Journalism, will accompany them on the trip.

Wednesday, March 5, 2014

Teaching Hospital vs. Theme Park Journalism

A popular approach to teaching journalism, one promoted by the Knight Foundation's Eric Newton, among others, is the teaching hospital method. Like medical school students, so goes the idea, students learn by doing. Not just doing it in a classroom, mind you, but doing it in real life. It's hard to argue with that.

Of course lots of j-schools do just this, through their student news operations or, like at UGA, through those independent of the university, in our case The Red & Black. I bring this up because here at Grady we're in the midst of an accelerated curriculum discussion prompted in part by a TV station downstairs that's running deep in the red. OnlineAthens has a good story today on the station (you can read the actual report here), which is likely to be sold and somehow we're going to create a new curriculum that will make use of that suddenly available space. As part of this, journalism will likely combine with broadcast news as a department, and curriculum.

Okay, fine. I have you up to speed as best I can on what's happening at Grady. Lemme push it a bit more.

I'm on two committees looking at this, and someone came up with an interesting alternative to the teaching hospital approach, one I think deserves discussion. A teaching hospital can drive the curriculum, the tail wagging the dog, plus it's hard to staff 24/7, especially between semesters. Instead of a regional newsroom covering the basics, and training kids to work in a declining industry, we're looking at more entrepreneurial and innovative uses.

The one I'm discussing here is the idea of a theme-based newsroom. A journalism theme park, if you will. So this year, we're going to focus on a single theme, say obesity or hunger or whatever. That becomes the "theme" of journalism students, of broadcast/TV students, of online stuff, of advertising students, of PR kids. It would reach across the curriculum, across the college, and across the University, involving students from, say, social work. Plus it would not require 24/7 staffing, and not require students to cover sewer board meetings (they'll do that in basic reporting classes). We'd use this as a capstone experience, or an upper-level set of courses. I dunno.

So, your thoughts?






Tuesday, March 4, 2014

Where UGA freshmen come from

I was messing with some data today. The map below shows you which high schools sent kids to UGA in the 2013 freshman class. You can click on the various schools to get the names, how many kids they sent, and the school's rank.



Oh, a raw Excel file is here that lists the schools by rank. Data from UGA's factbook. I'm trying to do the same for 2003 and make a comparison, but the 2003 data is being difficult for some reason.

Saturday, March 1, 2014

Titular Colonicity -- JMCQ Style

I've written before about titular colonicity -- in part because the name tickles me, in part because it says something about academic research. Quite simply, the hypothesis argues that as an academic field matures, more research article titles include colons. Well, the latest issue of Journalism and Mass Communication Quarterly, the main journal in my field, came out this week and in it you'll find a good example of colonicity being very titular. Of the eight studies published, six include a colon. Of the two that foolishly forgot to use a colon, one has a question mark in the middle of the title, so we'll give it a pass.

In JMCQ's spring issue 10 years earlier, only six of 11 published studies included a colon. Ten years before that, six of 13 studies included the magic colon.

So is mass comm maturing as a field, as measured by colons in titles? I actually did a quick-and-fast-and-dirty and non-systematic study of colons and found that indeed their use has increased in our field. What that means I can't say, but one of these days I'm going to do a real analysis of colons in the major mass comm journals and submit the results to AEJMC, just for the hell of it and just to give some reviewer a reason to not accept a paper.
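In the spirit of quick-and-dirty, here's the colon arithmetic on the three spring issues mentioned above:

```python
# Colon share in JMCQ spring issues, using the counts from this post:
# (titles with a colon, total titles published)
issues = {
    "1994": (6, 13),
    "2004": (6, 11),
    "2014": (6, 8),
}

for year, (colons, total) in sorted(issues.items()):
    print(year, round(colons / total, 2))  # share of titles with a colon
```

From 46 percent to 55 percent to 75 percent over two decades. Three data points prove nothing, but the trend line points the way the hypothesis predicts.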

And just in case you think I'm making this crap up, here's an analysis of the so-called Dillon Hypothesis about colons in titles from studies published in the ecological sciences. As the abstract states:
In general, the results of this study support the Dillon Hypothesis of Titular Colonicity.

So there.