How do people integrate or ignore new information based on their existing beliefs? Bayes' Rule. Or at least that's according to a study I'm reading that offers a mathematical formula to sum it all up:
P(A|B) = P(B|A) P(A) / P(B)
That settles everything. Right? Wait, don't click that mouse!
Let's break it down, at least according to the article (abstract here). A = your existing belief, B = new information. The new belief, the stuff to the left of the equals sign above, is equal to the product of our prior belief and "the likelihood that B would occur if A were true," divided by the overall probability of B.
Confused? Me too. Forget the math. Thankfully a graph or two later the authors tell us Bayes' Rule "does not provide a complete normative standard." Translation: it doesn't work all that well in the real world.
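(If you'd rather not forget the math entirely, here's the update with some made-up numbers plugged in -- none of these values come from the study, they're just for illustration:)

```python
# Bayes' Rule with invented numbers (not from the study).
# A = "my existing belief is true", B = "I encounter this new information".

p_a = 0.7          # prior: how strongly I held belief A before
p_b_given_a = 0.8  # likelihood: chance of seeing B if A were true
p_b = 0.6          # overall chance of seeing B at all

# posterior: my updated belief in A after seeing B
p_a_given_b = p_b_given_a * p_a / p_b

print(round(p_a_given_b, 3))  # prints 0.933
```

So with those numbers, seeing information B should nudge belief in A up from 0.7 to about 0.93 -- the "normative" answer the study compares people against.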
The study as a whole has to do with disconfirmation biases -- how we selectively deal with information -- which is obviously of interest to anyone studying political communication and how people make sense of their political world. Basically, they find people cannot ignore their prior beliefs when dealing with new information. People see congruent arguments (ones that fit their own beliefs, hence the name congruence bias) as stronger -- even when they're not.
Some other interesting results: sophistication (often measured as political knowledge) can moderate disconfirmation bias (but not congruence bias). In other words, politically knowledgeable folks are a little less likely to dismiss a message that goes against their beliefs than those who are less knowledgeable, but knowledge plays no role in whether people judge messages that agree with their point of view as stronger than they actually are.
I hope that made sense.