Thursday, February 28, 2013

Is Bob Woodward a Wuss?

It's all over the net, the flap between Bob Woodward and the White House and whether the renowned journalist was threatened in an email.  There's a terrific summation at Politico, with the emails and embedded video.  Basically, here's the "threat" part:
...But I do truly believe you should rethink your comment about saying saying that Potus asking for revenues is moving the goal post. I know you may not believe this, but as a friend, I think you will regret staking out that claim.
That's a threat?  "As a friend, I think you will regret staking out that claim"? Really? Hell, as a reporter I had threats in late night phone calls, had threats in person, had a gun pointed at me.  Those are threats. 

Now let me be clear, I'm a journalism kid of the 1970s.  I'm a Woodward admirer.  "Woodstein" is why I'm here.  I'm not a partisan in this.

But here's my main point.

Seventy journalists were killed in 2012.  That's a record-breaking year.  For Woodward to call this email "a threat" while reporters are risking their lives in troubled spots around the world is, to me, too damned prissy.  

I hate that my hero appears to be a wuss.


Wednesday, February 27, 2013

Recorders and Processors

I was deep into a research article on how people process information and it struck me how much what they were talking about also fits some of the challenges in journalism. I'll avoid getting too PhDweebish, so here's how I'd translate what they were talking about into the roles journalists now face. 

The article drew a distinction between information recorders and information processors from a cognitive science perspective.  From a journalism perspective, think of information recorders as the basic stuff of news reporting: the taking of basic facts, interviews, and observations and then cobbling them together into a coherent but straightforward representation of what happened.  If you need a metaphor, think of it as taking flour, yeast, water, and other ingredients and baking a loaf of bread.  Information processors work at a higher level.  Some might call it a "value added" level, in which you take the work of the recorders and turn it into something more.  We know these folks as aggregators (or if you're cynical, thieves). Or to extend the baking metaphor, they're taking the bread and making toast and jam, or a really tasty sandwich. 

As smarter people than I have long pointed out, it's become economically more challenging to be a recorder these days than a processor. Adding value, via opinion and color and snark and perspective and humor, makes the basic stuff of news much more interesting, more engaging.  Of course the recorders could do this too, and often have, but the constraints of traditional journalism have, until recently, made this difficult.

How does this affect the audience?

You read an awful lot about how journalists should do their jobs, and a lot of speculation about what it means to the audience, but damn little evidence -- hard data -- is presented to support many of the claims.  Why?  Because it's hard work, doing research.  Instead, toss in a few anecdotal interviews, a few gut feelings, maybe swipe a few Pew tables, type it all up and call it a major report on the future of the field.  As if real research were ever that simple.

Recorders out there, your job is to feed the spew.  Processors out there, you help make sense of it all in interesting (or partisan, or humorous) ways. From a theoretical standpoint, and this does get PhDweebish, I can say with some confidence that the end result is a further fragmented audience with less and less "common knowledge."  That's a problem for a democracy, or so the theorists tell us.  But that's a post for another day.

Tuesday, February 26, 2013

Fox News and Death Panels: A Love Story

Remember death panels?  Freshly published research in Journalism and Mass Communication Quarterly examines the predictors of the death panel misperception and I'm gonna point out one main result:

Fox News, you did it again.

The research, by Patrick Meirick at the University of Oklahoma, used national survey data from 2009 to establish that watching Fox News, not other media outlets, led to belief in the myth.  Hardly shocking, but here's the kicker: it did so only among highly educated respondents.

Huh?  Education tends to be associated with greater political knowledge -- not belief in political myths.  Enter into the fray a little thing called motivated reasoning.  I've written in detail about this theory before, and I've used it in my own research (which, oddly, best I can tell is not cited in this study ... tsk tsk). Essentially, the theory argues people believe what they want to believe, and the more they care about an outcome, the more they'll believe stuff that fits their predispositions, even if it's obviously bogus.  Biased processing run amok.

The study includes a big fat multiple regression, which makes my data-crunching heart happy, but if you're not a number nerd let me translate.  The author statistically controls for a number of factors you'd expect to lead to belief in the myth and the combo of watching Fox News and education pops out quite nicely. 

Skip to the next graph if you're not into methodological quibbles: I'd argue he fails to account for political interest, but you could counter-argue that's captured by his "follow health reform news" variable, and then I'd counter-counter-argue that no, it doesn't -- that's a global media variable, not a motivational one, and if you included political interest, education might very well disappear.  Yes, methodologists argue about stuff like this, which explains why we rarely get invited to parties.

If you dig into the table, you'll see other media variables basically play no role in this screwy belief.  Only reading newspapers is statistically significant, and it's in the opposite direction of watching Fox News.  In other words, newspaper reading reduced your likelihood to believe in the death panel myth, and it did so regardless of education level.  Whew.  Newspapers still rule.
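For fellow number nerds, here's a rough sketch of the kind of model being described -- mine, not the study's actual code or data, and with hypothetical variable names -- where the Fox News x education interaction term is what carries the "only among the highly educated" result and the other predictors serve as statistical controls:

# A minimal sketch of a multiple regression with an interaction term,
# in the spirit of the study described above. The variable names and
# the simulated data are hypothetical, purely for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "fox_news": rng.integers(0, 8, n),       # days/week watching Fox News
    "education": rng.integers(1, 6, n),      # 1 = less than HS ... 5 = grad degree
    "newspaper": rng.integers(0, 8, n),      # days/week reading a newspaper
    "party_id": rng.integers(1, 8, n),       # 1 = strong Dem ... 7 = strong Rep
    "follow_health_news": rng.integers(1, 5, n),
})
# Simulate a belief measure so the example runs end to end.
df["death_panel_belief"] = (
    0.2 * df.fox_news * df.education         # built-in interaction effect
    - 0.15 * df.newspaper
    + 0.3 * df.party_id
    + rng.normal(0, 1, n)
)

# "fox_news * education" expands to both main effects plus the
# fox_news:education interaction; the remaining terms are controls.
model = smf.ols(
    "death_panel_belief ~ fox_news * education + newspaper"
    " + party_id + follow_health_news",
    data=df,
).fit()
print(model.summary())

A significant positive coefficient on the fox_news:education term would be the statistical signature of the pattern described above: the more education, the stronger the link between Fox News viewing and belief in the myth.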

Thursday, February 21, 2013

When Amateur (Catholics) Write Poll Stories

Hi.  My name's Barry and I'm a Catholic.

Wanted to get that out of the way before I criticize how this poll story is written.  I'm not even a recovering Catholic, but a go-to-Mass-every-Sunday Catholic, yet I can't let this poll story posted today on a Catholic site go untouched.

Okay, my religious butt covered, it's time for a little beat down of a fairly simple story from both a journalism and a public opinion perspective.

* The lede sucks.  It's too long and it starts with the wrong info (According to ...).  As I tell my reporting students, start with the WHAT.  If it's a poll, lead with the results.  And talk about vague.  This is your lede, that "many Americans have very strong feelings about illegal immigrants and immigration reform."  Zzzzzzzzz.

* Learn the language of polling if you're gonna write a poll story. The third graph is terrible. "The precision of the Reuters/Ipsos online poll is measured using a credibility interval. In this survey, the poll has a credibility interval of plus or minus 2.9 percentage points."  You mean margin of error.  People get that.  And you're not measuring credibility with this, but precision.  Yeah, it matters.


* The last two graphs, what the hell are they doing in the story?  Even if I sympathize with the point of view here, they don't belong.  To quote Steve Smith, an unknown and unimportant presidential candidate from four years ago, is ridiculous.

Tuesday, February 19, 2013

When Corrections Fail


An emerging group of studies examines why people cling to misconceptions and myths and whether fact-checking can correct these beliefs.

In the latest study I came across, in Medical Care, the authors ran an experiment to see whether correcting Sarah Palin's infamous "death panel" myth would work.  The answer?  Not for everyone.  Here's the key graph:
The correction reduced belief in death panels and strong opposition to the reform bill among those who view Palin unfavorably and those who view her favorably but have low political knowledge. However, it backfired among politically knowledgeable Palin supporters, who were more likely to believe in death panels and to strongly oppose reform if they received the correction.
Why would more knowledgeable Palin supporters ignore the correction and cling to the myth, especially as less knowledgeable supporters became more accurate?  I can't read the entire study, just the abstract, but I suspect the more knowledgeable supporters were also the more partisan ones.  Simply put, it's harder to change those minds and, indeed, it's likely the correction attempt (as other research has ironically found) actually pushed people to believe Palin even more than they originally did.  And this is scary.


Of Food Banks and Surveys

I often point out surveys sponsored by special interest groups that happen to find people don't know enough about, coincidentally, that special interest's, um, interest.  Beware such self-serving polls, even if they're for a good cause, like this one out of Texas by food banks that finds 1-out-of-2 North Texans know someone who has used a food bank.

Again, good cause. It even has a PDF with more graphical displays of the data. I'm surprised that 80 percent of respondents say they've contributed in some way to a food bank.  That seems high, but if true -- all the better.

Now, to methodology.  They report:
There were more than 1,600 respondents to the survey, which was conducted in January 2013. The survey was conducted via an opt-in email targeting more than 60,000 North Texans, as well as via Facebook. 
You can't tell a lot from this, other than it's a hefty sample size.  What I'm concerned about, obviously, is how well these 60,000 folks represent the population as a whole.  It's unclear where the 60,000 names came from.  If from the food bank's own list, or lists they have access to, then you have to worry about the generalizability of the results.  There simply isn't enough methodological information to make a judgment -- and when that happens, your spidey sense should go off.


Tuesday, February 12, 2013

Bad Surveys

A brief story gets into some rather inane comments by North Carolina's governor about higher education.  No surprise there.  The crux of the five-graph piece, though, is its reporting of a survey of faculty, who generally disagree with Gov. Pat McCrory.  Yeah, his comments are kinda dumb, but I'm more interested here in the survey.  Below, the methodology:
The online poll was conducted Feb. 6-7 and drew 172 faculty responses. The margin of error was plus or minus 6.8 percentage points.
Releasing a survey of a mere 172 respondents is a bad idea, and it's impossible to say whether these 172 souls were randomly drawn or whether this is a SLOP (a self-selected opinion poll).  If the latter, it's even more useless.
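And about that margin of error: here's a quick back-of-the-envelope sketch (mine, not the story's) of the conventional formula for a simple random sample at 95 percent confidence.  The story's plus or minus 6.8 presumably reflects somewhat different assumptions, perhaps a finite-population correction for a known faculty headcount, and of course no margin of error means much of anything if the respondents were self-selected.

import math

def margin_of_error(n, p=0.5, z=1.96):
    # Half-width of a 95% confidence interval for a proportion,
    # assuming a simple random sample (p = 0.5 is the worst case).
    return z * math.sqrt(p * (1 - p) / n)

for n in (172, 400, 1000):
    print(f"n = {n:4d}: +/- {100 * margin_of_error(n):.1f} points")

# n =  172: +/- 7.5 points
# n =  400: +/- 4.9 points
# n = 1000: +/- 3.1 points

The bigger point: a sample size only buys you precision if the sample was drawn properly in the first place.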

Keep in mind, the results make sense.  I'd expect faculty to disagree with McCrory on the issue.  But there's a difference between getting what you expect and doing it right (and still getting what you expect).

Read more here: http://projects.newsobserver.com/node/26949#storylink=cpy
 


Thursday, February 7, 2013

Does Grady Want the R&B?

Sorry, this blog is usually about how people learn from the media or public opinion research.  I'm hijacking it -- briefly -- to discuss a couple of posts I read about the place where I work, Grady College at UGA, and the independent student newspaper, The Red & Black.

I don't want to rehash the 2012 spat between the R&B's board and student editors.  That's done, settled, and remarkable progress has been made since.  But I did read a couple of posts recently that caught my eye, especially one that suggests, well, read the January 6 Facebook post below:
Of course the J-school would LOVE to take the Red and Black--they have built quite a nice little nest egg to ensure their independence. But if anyone thinks that bunch of empty suits in the J-school can do a better job, they should look over their shoulders first. 
What a complete load of bullshit.  

Okay, maybe not the "empty suits" thing (though I never wear a suit, so "empty sweatshirt" instead?), but we'd LOVE to take the paper?  Sorry, the faculty agreed some time ago that's the last thing we want, the last thing we'd approve, and frankly we've avoided formal relationships with the paper through our classes.  Informal relationships have worked fine for decades.  It's all about collaboration now, but we do a pretty good job of collaborating without owning the thing.

Certain Grady faculty were quietly approached in summer 2012 about a more formal relationship.  I even heard a rumor that the University was also approached.  Pass, and pass.

The R&B has made exciting changes to its board of directors, it's produced some solid news stories and packages this year, and I'm hopeful for the future despite all the challenges in circulation and advertising.  I'm just an "empty suit" watching from the sidelines.  I chat often with student reporters and editors, but I've never once -- not in 21 friggin years -- been asked by the publisher or board about my thoughts on their strategy or approach.  And that's fine by me.  I'm not a management guy, I'm a reporting guy.  Yeah, I've said for years there are certain things the R&B is doing wrong strategically, but so has every other paper in America.

Simply put, short of some financial disaster at the paper, I don't see us ever "taking" the R&B.

Finally, as some of you know we're in the middle of a dean search at Grady.  Four candidates, all of them good ones, will visit this month.  We have a new department head coming in the Department of Journalism this Fall.  And there's been talk for quite some time now, very serious talk, that the broadcast news folks will merge with Journalism as a single department (as well as other changes, not important to get into here).  That's a cool idea, full of potential.  One aspect of this could very well affect the R&B.  Stick with me.

Newsource already exists in broadcast news, the TV program put out by students with a solid online and social media presence.  I can see a combined department taking and building on that, creating a true multimedia platform so journalism students can create all kinds of interesting news stories.  How does this affect the R&B?  It's a potential competitor, not for advertising dollars mind you, but for something far more important -- attention.  We live in an attention economy.  If you don't get that, you've fallen far behind.  Assuming a bunch of "empty suits" can lead a bunch of much smarter students, I see a multimedia site and a buffet of mobile apps that may very well threaten the R&B.  I don't like that.  I don't want to see that.  But it may also be possible that through collaboration, the paper becomes part of this approach.  Again, all of this is still up in the air.

Just throwing this out there to anyone who may actually give a damn -- and who can have an informed discussion on the major changes and how to turn them in the right direction.

Next post -- some PhDweebish research thing.  And maybe I'll go buy a suit.


Tuesday, February 5, 2013

Public Opinion about Public Opinion

That great philosopher, Lucifer, once said:

   Public opinion is nothing more than this,
   what people think that other people think.

Okay, Victorian playwright Alfred Austin actually put these words in the mouth of his title character, Prince Lucifer, in his 1891 play.

(yes I've mentioned this before, back in 2008)

What I'm discussing here is part of a larger work I'm fiddling with, the idea that we don't spend enough time studying people's opinions about public opinion -- or, for lack of a better word, meta-opinion.  There are theoretical traditions where this is more common (Spiral of Silence comes immediately to mind).  These traditions tend to view public opinion from a social control perspective.  Two factors come into play here: (1) we're always sensing the climate of opinion around us, and (2) a fear of isolation leads us to go with the majority, or at least not express our minority viewpoints as readily as we might.

Psychology has examined this in detail at the micro, individual level, or small-group level (my favorite, false consensus, but see also pluralistic ignorance for a more macro and contradictory approach). 

My point is related to, but separate from, what people think about public opinion polls.  In a sense, people view such polls negatively when the results disagree with their own positions, an example of the hostile media effect.  It's vital we tease out the difference between how people respond to a poll -- and who published it -- and how they think about public opinion in a broader sense.  In the latter, I'm talking about their acceptance (or rejection) of the idea of public opinion, their acceptance that other opinions have merit, their acceptance of whether such opinions should guide policy.  How do people define public opinion?  And has this definition changed in a world that has moved from mainstream few-to-many media to social media in which we have almost instantaneous access to, as our friend Lucifer said above, "what other people think"?  I'm guessing social media, from Facebook to Twitter, have changed our ideas of what truly is public opinion.  And I'm guessing it's become something more narrow, more in tune with our "friends" and "followers."

And it's entirely possible I'm completely wrong.

This is me thinking aloud as I work on a longer, more thorough examination of the topic, hopefully with actual data gathered (or found elsewhere) to fully explore whether people think of public opinion today in a very different way than they used to -- and what that means for public policy, for elections, and ultimately for a healthy democracy.