How Google’s search engine supports conspiracy theorists and hate figures
Simon Fraser University (SFU) Communication Professor Ahmed Al-Rawi’s research examines the intersections between political extremism, misinformation and social media. He leads The Disinformation Project at SFU, which examines fake news discourses in Canadian news media and social media.
He also collaborates with SFU’s Digital Democracies Institute on the use of abusive language on social media. He is a frequent commentator in the news media, recently discussing how social media fuels support for the war in Ukraine while contributing to the spread of mis/disinformation.
One of Al-Rawi’s recent studies focused on the way artificial intelligence (AI) reproduces and promotes prejudice, hate and conspiracy online. For the resulting article, “How Google Autocomplete Algorithms about Conspiracy Theorists Mislead the Public,” he collaborated with Postdoctoral Fellow Carmen Celestini, Master’s student Nathan Worku, and PhD candidate Nicole Stewart.
They examined the subtitles that Google automatically generated for 37 known conspiracy theorists and found that, in every case, the subtitle failed to reflect the figure’s conspiratorial behaviour.
For example, influential Sandy Hook school shooting denier and conspiracy theorist Alex Jones is listed as “American radio host” and Jerad Miller, a white nationalist responsible for a 2014 Las Vegas shooting, is listed as “American performer.”
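The study’s core comparison — the subtitle Google displays versus a descriptor consistent with the figure’s documented record — can be sketched in a few lines. This is an illustrative assumption, not the researchers’ actual dataset or code; the two names and their labels are the examples quoted above, and the matching rule is a simplification:

```python
# Illustrative sketch of the study's core comparison: the subtitle
# Google auto-generates for a public figure versus the descriptor
# researchers judged accurate. Names and labels are the two examples
# from the article; the data and matching rule are assumptions.

# Subtitle Google displayed for each figure.
OBSERVED = {
    "Alex Jones": "American radio host",
    "Jerad Miller": "American performer",
}

# Descriptor consistent with each figure's documented behaviour.
EXPECTED = {
    "Alex Jones": "conspiracy theorist",
    "Jerad Miller": "white nationalist",
}

def mismatches(observed, expected):
    """Return figures whose Google subtitle omits the expected descriptor."""
    return sorted(
        name
        for name, subtitle in observed.items()
        if expected[name].lower() not in subtitle.lower()
    )

print(mismatches(OBSERVED, EXPECTED))  # → ['Alex Jones', 'Jerad Miller']
```

In the study, this kind of check came back positive for all 37 figures examined: no subtitle acknowledged the figure’s conspiratorial record.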
Al-Rawi stresses that the perceived neutrality of algorithmic search engines like Google is deeply problematic. He argues that giving known conspiracists neutral rather than negative subtitles can mislead the public and amplify extremist views.
We met with Professor Al-Rawi to discuss his work.
Most internet users perceive Google as a neutral search engine. However, your article mentions some of the biases present in these algorithms. Please describe what is happening here.
Yes, this is exactly the point behind writing this paper. When I first looked at these algorithmically produced labels, I felt there was something very wrong with them, so I proposed following a reverse engineering method to explore further. These labels do not receive enough public scrutiny, unlike the case of Facebook and, to a lesser extent, Twitter. I think search engines are exacerbating the problem of disinformation not only because of these labels, but also due to the affordances they offer people in terms of easily searching for and finding disinformation.
If individuals are well known to be conspiracy theorists, why doesn’t Google identify them as such?
I think it is similar to the issue of social media sites that were very reluctant in the beginning to de-platform controversial users because of the fear of alienating audiences and/or losing revenues.
What are your recommendations for Google? Are policy-makers paying attention?
Due to increasing public and official pressure on social media companies, many recent changes have made them more active in moderating their sites. I hope the same will soon happen with Google's search features.
What are your recommendations for internet search engine users? How can we be more attuned to the inner workings of the internet?
I think we all need to be critical of our online surroundings, and I encourage everyone to search for other well-known controversial figures to see how Google has labeled them. We need more insight into what are known as the black boxes of algorithms, and the often-biased rules they follow. The same applies to better understanding social media platforms, by following a similar procedure with regard to which hashtags or keywords are or are not allowed on different sites.
We all need to be diligent with the content we read online because we cannot take what we view for granted. It is useful—and critical—to continue to question and challenge our sources of information about the issues we care about.
Read more about Al-Rawi's research on Google’s search engine algorithms in The Conversation Canada.
The Disinformation Project has been made possible in part by the Canadian Department of Heritage.