In this reflection, I would like to introduce you to something called the ‘Dunning-Kruger effect’, in the hope that it might help reduce the hostility I have noticed in various debates.
Many of the debates that have occurred across the media – on radio and TV, in print, and of course on social media – and subsequently in our own everyday conversations, could be categorised as less than friendly. You might think of the way climate change denialists have been lampooned on social media, as have those campaigning to raise awareness of our impact on the earth. Or you might think of the debate leading up to the same-sex marriage postal vote, and how many people on both sides behaved poorly. Within the religious domain, I have heard people talk about how ‘others’ want to take away our freedoms, while those same ‘others’ are often vilified in turn.
I suspect the tendency to retreat very quickly into our bunkers, and to protect ourselves by attacking ‘the opposition’, is driven, at least in part, by how certain people feel about their positions – so certain that anyone who disagrees must be fundamentally wrong.
Enter the work of Dunning and Kruger, who observed that, on a given task, people with less skill or knowledge tend to overestimate their competence, whereas those with greater skill assess their own knowledge more accurately. The ‘Dunning-Kruger effect’ is thus a type of cognitive bias in which people believe they are smarter and more capable than they really are in a given area, and are therefore unable to recognise their own incompetence.
In the climate change debate, which is driven by numerous factors, some scientific and some ideological, we can see this cognitive bias at work. Many climate scientists, particularly in the early years of awareness raising, acknowledged the limits of their knowledge. Their appropriately careful language inadvertently led those with less knowledge to identify what they saw as ‘gaps’ in the scientists’ research; this, together with an inability to recognise their own lack of expertise, armed them with an unwarranted certainty.
A similar cognitive bias has been identified as one of the factors behind the second invasion of Iraq, in which careful intelligence gathering led to wrong conclusions, in part because leaders lacked the capacity to unpack and understand the nuances.
I think this kind of cognitive bias is relevant to our own mindsets and conversations. Before we dehumanise or dismiss those who disagree with us, perhaps we would be well advised to ask how much we truly know about the topic at hand – to consider the facts and their implications, and to see the other’s point of view, before we launch ourselves into a war.