Uncovering Bias

Readers today live in a world where they can search online and find whatever confirms their views.  Have you done it?  Are you even aware that you have done it?

There are three types of bias that influence the social media ecosystem and make it vulnerable to both intentional and accidental misinformation – cognitive biases (bias in the brain), echo chambers (bias in society) and algorithmic bias (bias in the machine).

Algorithmic bias has been getting the most attention lately, so I want to talk about it first.  Algorithmic bias, or algorithmic manipulation, relies on personalisation technologies that rank content and tailor its presentation based on the links a user has clicked in the past.  The result is often that users' cognitive and social biases are reinforced, making them even more vulnerable to manipulation, a phenomenon known as a filter bubble.
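
To make the mechanism concrete, here is a minimal sketch of click-weighted ranking and the feedback loop it can create.  Everything in it is invented for illustration (the articles, the scoring rule, the simulated click behaviour); real platforms combine far more signals than a simple topic count.

```python
# Toy filter-bubble simulation: rank articles by how often the user has
# clicked that topic before. All data and the scoring rule are invented
# for illustration; real ranking systems use many more signals.
from collections import Counter

articles = [
    {"id": 1, "topic": "politics-left"},
    {"id": 2, "topic": "politics-right"},
    {"id": 3, "topic": "sport"},
    {"id": 4, "topic": "politics-right"},
    {"id": 5, "topic": "science"},
]

def rank(articles, click_history):
    """Score each article by how often the user clicked its topic."""
    topic_counts = Counter(a["topic"] for a in click_history)
    return sorted(articles,
                  key=lambda a: topic_counts[a["topic"]],
                  reverse=True)

# Simulate a user who always clicks the top result in their feed.
clicks = [{"id": 2, "topic": "politics-right"}]  # one initial click
for round_no in range(3):
    feed = rank(articles, clicks)
    clicks.append(feed[0])  # the feedback loop: click -> rank -> click
    print(round_no, [a["topic"] for a in feed[:3]])

# After a few rounds, 'politics-right' dominates the top of the feed,
# while topics the user never clicked sink out of sight.
```

Even in this toy version, a single early click is enough to tip the whole feed toward one topic, which is the essence of the filter bubble described above.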

The effect of algorithmic bias and its influence on search results can be seen in the story of Dylann Roof, who killed nine African-Americans while they were praying in church in 2015.  When Roof searched Google for the terms ‘black on white crime’, it led him to misleading statistics and white supremacist propaganda, fuelling anger in Roof and leading him down a dangerous path.  Although you might expect trustworthy, reputable or authoritative websites to be positioned higher on the search results list, this is not necessarily the case.  Furthermore, because of the way Google’s algorithm works, Roof’s motivated confirmation bias (also known as case building) led him to find more and more misleading and racist information about the so-called epidemic of black murders of whites (selective attention, or selective exposure), with little to counter the racist propaganda that dominated his search results.  The outcome was deadly.

Examples like this have often led Google to tweak its algorithm.  Although Dylann Roof is an extreme case, he is not an isolated one, and there is growing recognition of the potential harm and pitfalls of algorithmic manipulation.  It is easy to see how some young people (and adults!) could be led in unfortunate directions without some sense of the bias that is inherent not just in the machine (Google, YouTube, Twitter, the internet), but also in our own cognitive and confirmation biases (bias in the brain).  Studies of networking platforms show that content that rouses emotion is commented on and shared most often, and it can continue to shape people’s attitudes even after it has been discredited, because it produces a vivid emotional reaction and builds on existing narratives.

Confirmation bias is the tendency to notice or seek out information that confirms what one already believes, or would like to believe, and to avoid or discount information that runs contrary to one’s beliefs or preferences.  The Brexit result and the Trump victory are just two examples of how confirmation bias and incestuous amplification, fuelled by online media, have led to unexpected outcomes.  Facebook and Twitter (as well as other social media platforms) are taking steps toward limiting fake news, tagging content, suggesting content that helps users find new perspectives and tweaking their algorithms, but people continue to share fake stories.

Greater digital and media literacy is needed for adults, not just children.  Despite the concept of echo chambers being debunked by some studies (and also discussed here), the least skilled internet users are still the most susceptible to fake news, filter bubbles and echo chambers online.  Furthermore, a 2017 survey of Australians aged 8 to 16 found that most 8 to 12 year olds preferred their news to come from family, television or a teacher/adult, while 13 to 16 year olds preferred television, family or social media.  For this reason, the knowledge and critical thinking skills possessed by parents and teachers are of significant relevance in the growth of such skills in young people.

With growing awareness of the dangers of digital wildfires in a hyperconnected world, recognising bias in your own brain, bias in society and bias in the machine (and the interplay between them) may play a powerful role in mitigating at least some of the influence of misinformation in the modern world.

2 Replies to “Uncovering Bias”

  1. How interesting it is to be reading this post in the light of today’s events in the Australian media landscape! Even though I am currently overseas, I understand that every newspaper in Australia ran a redacted cover today in protest of the Australian government’s attempts to curb media scrutiny. To me this is an astonishing and rather sad state of affairs. I have been a very vocal critic of the Australian print media for some time, but I honestly don’t know if many of my friends know this. After recent changes to Facebook’s guidelines, I have been very conscious of shrinking returns on my political posts. When I post travel photos, for instance, there are double-digit likes within a few hours, but when I critique a news article or repost a political cartoon or meme, I’m lucky to get four or five likes. When I first noticed this happening, during the Australian election, I was worried my political posts were alienating my friends. But I’ve since learnt that with Facebook’s filtering many of my friends probably never even see these posts.

    Of course I understand that not everyone shares my interest in politics, but commenting on and sharing news articles is a way for me to bring media bias to the attention of friends who are less likely to follow politics on a daily basis. That’s why I find Facebook’s rather blunt response to the Cambridge Analytica scandal so infuriating. Recently, presidential hopeful Elizabeth Warren paid to run a fake news story about Facebook CEO Mark Zuckerberg on his own platform, to make the political point that Facebook were prepared to monetise content, but not to police it. The stunt worked.

    So, now, we have a rather perplexing scenario where ordinary people are unable to share political opinions online to their own networks, because of Facebook’s algorithmic filtering, but paid advertisers can, and (as Warren’s stunt proved) without any real oversight on Facebook’s part. It is worth noting that Warren believes Facebook should be broken up to improve competition in the tech sector.

    Although I’m often concerned about the digital literacy of older Australians, I completely agree with you that we need to educate our students to better filter online content. All of us, as internet users, need to be more active in our online reading habits, more critical of the news that’s fed to us and more conscious of our personal biases. As a rule, when I engage with political situations, I try to consider the “stakes”. Who has the power? Who has the influence? Who stands to lose and to gain? That helps me personally navigate a lot of political situations. And personally I think it is a better moral compass than a computerised algorithm.

  2. Thank you for your considered feedback, Matthew. It seems that it is a space that you are very familiar with. There’s certainly a lot happening! I love your comments in your final paragraph. Although I wrote in another blog post about the danger of a ‘question everything’ mentality, I think that the three questions you present are a valuable tool for evaluating media content. Changing the algorithm on any platform or website will never make media consumption free from bias. Critical media literacy for all media consumers is an important skill in the modern world.
