Thursday, September 20, 2012

Why balanced discussions fail

By Cass R. Sunstein

IT IS well known that when like-minded people get together, they tend to end up thinking a more extreme version of what they thought before they started to talk. The same kind of echo-chamber effect can happen as people get news from various media. Liberals reading left-of-centre blogs may well end up embracing liberal talking points even more firmly; conservative fans may well react in similar fashion on the right.

The result can be a situation in which beliefs do not merely harden but migrate towards the extreme ends of the political spectrum. As current events in the Middle East demonstrate, discussions among like-minded people can ultimately produce violence.

The remedy for such polarisation, here and abroad, may seem straightforward: provide balanced information to people on all sides. Surely, we might speculate, such information will correct falsehoods and promote mutual understanding. This, of course, has been a hope of countless dedicated journalists and public officials.

Unfortunately, evidence suggests that balanced presentations in which competing arguments or positions are laid out side by side may not help. At least when people begin with firmly held convictions, such an approach is likely to increase polarisation rather than reduce it.

Indeed, that is what a number of academic studies done over the last three decades have found. Such studies typically proceed in three stages. First, the experimenters assemble a group of people who have clear views on some controversial issue (such as capital punishment). Second, the study subjects are provided with plausible arguments on both sides of the issue. And finally, the researchers test how attitudes have shifted as a result of exposure to balanced presentations.

You might expect that people's views would soften and that divisions between groups would get smaller. That is not what usually happens. On the contrary, people's original beliefs tend to harden and the original divisions typically get bigger. Balanced presentations can fuel unbalanced views.

What explains this? The answer is called "biased assimilation", which means that people assimilate new information in a selective fashion. When people get information that supports what they initially thought, they give it considerable weight. When they get information that undermines their initial beliefs, they tend to dismiss it.

In this light, it is understandable that when people begin with opposing initial beliefs on, say, the death penalty, balanced information can heighten their initial disagreement. Those who tend to favour capital punishment credit the information that supports their original view and dismiss the opposing information. The same happens on the other side. As a result, divisions widen.

This natural human tendency explains why it is so hard to dislodge false rumours and factual errors. Corrections can even be self-defeating, leading people to an even stronger commitment to their erroneous beliefs.

The news here is not encouraging. In the face of entrenched social divisions, there is a risk that presentations that carefully explore both sides will be counterproductive. And when a group, responding to false information, becomes more strident, efforts to correct the record may make things worse.

Can anything be done? There is no simple term for the answer, so let us make one up: surprising validators. People tend to dismiss information that would falsify their convictions. But they may reconsider if the information comes from a source they cannot dismiss. People are most likely to find a source credible if they closely identify with it or begin in essential agreement with it.

In such cases, their reaction is not, "how predictable and uninformative that someone like that would think something so evil and foolish" but instead, "if someone like that disagrees with me, maybe I had better rethink".

Our initial convictions are more apt to be shaken if it is not easy to dismiss the source as biased, confused, self-interested or simply mistaken. This is one reason that seemingly irrelevant characteristics like appearance, or taste in food and drink, can have a big impact on credibility. Such characteristics can suggest that the validators are in fact surprising - that they are "like" the people to whom they are speaking.

It follows that turncoats, real or apparent, can be immensely persuasive. If civil rights leaders oppose affirmative action, people are more likely to change their views. Here, then, is a lesson for all those who provide information. What matters most may be not what is said, but who, exactly, is saying it.

The writer is a Harvard law professor.

THE NEW YORK TIMES

[This has bearing on the Singapore Conversation. And the prognosis is not good. People are likely to come to the conversation with their views and leave with a stronger conviction of the correctness of their position.]
