Wednesday, November 27, 2013

ST: The art of disagreeing – it can yield some good

Most people can recall times when individuals maintained the same position despite being given new information that seemed to contradict it. Sometimes, people may even claim that the contrary information presented supports their position and so strengthens their belief.

The more important the topic, the more it affects interpersonal relationships negatively when there is strong disagreement. It gets worse when the issues seem clear and the individuals concerned are intelligent. In such cases, a lack of understanding cannot be the problem. This resistance to change is part of human psychology. It applies to everyone regardless of educational background, socio-economic status, political belief and moral position.

But if the underlying psychology is understood, this knowledge can facilitate personal and workplace relationships. It can also help address disagreements between policymakers and citizens or advocacy groups.

Confirmation bias
Everyone tends to seek out, interpret and remember information that confirms existing beliefs, positions or actions. Psychologists call it confirmation bias, a term coined by Peter Wason. In the 1970s, Professor Wason published a series of seminal studies which showed that when asked to test a simple rule, people consistently sought information that would confirm the rule and ignored information that would disprove it.

In 1979, a research team from Stanford University published a study involving participants who were either for or against capital punishment. All participants were asked to read and evaluate two studies, one supporting capital punishment and the other undermining it. The results showed that participants rated the study that supported their pre-existing position as superior to the one that contradicted it. In reality, however, both studies employed the same methodology and were fictitious.

Subsequent research has shown that people express more interest and spend more time on information that is consistent with their pre-existing positions or actions. They also give more weight to and recall better such consistent information.

Confirmation bias also occurs in real life. People go about their respective routines paying more attention to news outlets and commentary articles, in mainstream or social media, which reflect their own political views. They also spend more time discussing and analysing issues with those who share their views than with those who don't.

Does confirmation bias occur when there is strong disagreement over an issue between policymakers and citizens or advocacy groups?
There are several clues to look for.

Look at the type of information that each side seeks out as relevant evidence for the debate. Look at which aspects of the issue they attend to. Look at how they differ when interpreting and making sense of what the same data means. And look at what each side recalls when citing previous cases they consider similar or relevant to the current issue.

Policymakers tend to believe that policy successes are largely due to their acumen, and that policy failures are largely due to changes in external conditions beyond their control.

Citizens tend to believe that policy successes are largely due to luck, public cooperation or resources available to policymakers. They see policy failures as being largely due to incompetence or an inability to plan ahead.

An example is the way in which strains on Singapore's infrastructure and overcrowding in public transport in recent years have yielded different reactions.

Citizens tend to attribute such overcrowding to a lack of planning on the part of policymakers and their overemphasis on population growth. Policymakers, however, attribute the problem to unexpected changes in economic conditions. They also cite the growth in the number of foreigners, and the many years needed to build infrastructure projects.

Reacting constructively
Is there a way to react constructively to information that is contrary to firmly held opinions? There are no magic bullets, but here are some possible approaches to prevent negative effects and promote positive ones.
•In all major decisions, make a serious effort to question or at least revisit one's assumptions.
•Seek information that undermines and not just confirms pre-existing beliefs and positions.
•In group discussions, do not express a position before hearing from the other members. This allows alternative, and possibly better, ideas to surface more easily. This is especially important when leading a group, or if group members have a similar profile.
•Spend less time soliciting the views of, and listening to the justifications provided by, like-minded individuals. Create more opportunities to listen to the views of those who may not agree. Allow them to elaborate and make their case.
•Try to understand the position and frame of reference of those holding contrary views. Consider how they feel, their concerns and aspirations. Never trivialize their emotions.
•Focus on the substance of the contrary information and the situations leading to the disagreement. Avoid focusing on the motivations of those involved.
•Consider in what ways a strongly held belief might be wrong. Consider the consequences that might occur if it is and if the contrary view is right.
•When there is disagreement, consider whether it is a trade-off situation, a balancing act or a case of different but complementary approaches that can be integrated to achieve common goals.

How views get polarized
Confirmation bias becomes troubling when policymakers and citizens accumulate facts selectively, making them highly resistant to alternative views.
This often happens to like-minded and close-knit members of a group. The group may be an online community commenting on social and political issues or an advocacy group pursuing a common cause.
