Readers today live in a world where they can search online and find whatever confirms their views. Have you done it? Are you even aware that you have done it?
There are three types of bias that influence the social media ecosystem and make it vulnerable to both intentional and accidental misinformation – cognitive biases (bias in the brain), echo chambers (bias in society) and algorithmic bias (bias in the machine).
Algorithmic bias has been getting the most attention lately, so I want to talk about it first. Algorithmic bias, or algorithmic manipulation, relies on personalisation technologies that rank content and tailor its presentation based on the links a user has clicked in the past. The result is often that users' cognitive and social biases are reinforced, making them even more vulnerable to manipulation – an effect known as a filter bubble.
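The feedback loop behind a filter bubble can be sketched in a few lines. This is a deliberately simplified illustration, not any real platform's ranking system: the item IDs, topics and `rank_items` function are all invented for the example. It shows how ranking by past clicks, then feeding the top result back into the click history, narrows what the user sees.

```python
from collections import Counter

def rank_items(items, click_history):
    """Rank items so topics the user clicked before come first.

    items: list of (item_id, topic) pairs.
    click_history: list of topics the user clicked in the past.
    All names here are illustrative, not a real platform's API.
    """
    topic_scores = Counter(click_history)
    # Items whose topic has more past clicks are ranked higher.
    return sorted(items, key=lambda item: topic_scores[item[1]], reverse=True)

# One early click on a politics story feeds back into every later ranking:
history = ["politics"]
feed = [("a1", "science"), ("a2", "politics"), ("a3", "sport")]
for _ in range(3):
    ranked = rank_items(feed, history)
    history.append(ranked[0][1])  # the user clicks the top item again

print(ranked[0])  # ("a2", "politics") keeps winning
print(history)    # politics clicks accumulate: the filter bubble
```

After a single initial click, the politics item dominates every subsequent ranking, and each simulated click strengthens that dominance. Real recommender systems are far more sophisticated, but the reinforcement dynamic is the same.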
The effect of algorithmic bias on search results can be demonstrated by the story of Dylann Roof, who killed nine African-Americans while they were praying in church in 2015. When Roof searched Google for the terms ‘black on white crime’, it led him to misleading statistics and white supremacist propaganda, fuelling his anger and leading him down a dangerous path. Although you might expect trustworthy, reputable or authoritative websites to be positioned higher in the search results, this is not necessarily the case. Furthermore, because of the way Google’s algorithm works, Roof’s motivated confirmation bias (also known as case building) led him to find more and more misleading and racist information about a so-called epidemic of black murders of whites (selective attention, or selective exposure), with little to counter the racist propaganda that dominated his search results. The outcome was deadly.
Examples like this have repeatedly led Google to tweak its algorithm. Although Dylann Roof is an extreme case, it is not isolated in terms of broader recognition of the potential harm and pitfalls of algorithmic manipulation. It is easy to see how some young people (and adults!) could be led in unfortunate directions without some sense of the bias inherent not just in the machine (Google, YouTube, Twitter, the internet), but also in our own cognitive and confirmation biases (bias in the brain). Studies of networking platforms show that content that rouses emotion is commented on and shared most often, and can continue to shape people’s attitudes even after it has been discredited, because it produces a vivid emotional reaction and builds on existing narratives.
Confirmation bias is the tendency to notice or seek out information that confirms what one already believes, or would like to believe, and to avoid or discount information that runs contrary to one’s beliefs or preferences. The Brexit vote and the Trump victory are just two examples of how confirmation bias and incestuous amplification through online media have led to unexpected outcomes. Facebook and Twitter (as well as other social media platforms) are taking steps toward limiting fake news, tagging content, suggesting content that helps users find new perspectives and tweaking their algorithms, but people continue to share fake stories.
“Knowledge and critical thinking skills possessed by parents and teachers are of significant relevance in the growth of such skills in young people.”
Greater digital and media literacy is needed for adults, not just children. Although the concept of echo chambers has been debunked by some studies (and also discussed here), the least skilled internet users remain the most susceptible to fake news, filter bubbles and echo chambers online. Furthermore, a 2017 survey of 8- to 16-year-old Australians found that most 8- to 12-year-olds preferred their news to come from their family, television or a teacher/adult, while 13- to 16-year-olds preferred television, family or social media. For this reason, the knowledge and critical thinking skills possessed by parents and teachers are of significant relevance to the growth of such skills in young people.
With growing awareness of the dangers of digital wildfires in a hyperconnected world, recognising bias in your own brain, bias in society and bias in the machine (and the interplay between them) may play a powerful role in mitigating at least some of the power of misinformation in the modern world.