Language Matters When Googling Controversial People


One of the useful features of search engines like Google is auto-complete, which lets users find quick answers to their questions or queries. However, auto-complete functions rely on opaque algorithms that have been widely criticized for often producing biased and racist results.

The opacity of these algorithms stems from the fact that most of us know very little about how they work – which has led some to call them “black boxes.” Search engines and social media platforms do not offer meaningful insight into the nature of the algorithms they use. As users, we have the right to know the criteria used to produce search results and how they are personalized for individual users, including how people are tagged by search engine algorithms.

Safiya Noble, author of Algorithms of Oppression, explores biases in algorithms.

To do this, we can use a process of reverse engineering, carrying out several online searches on a specific platform to better understand the rules in place. For example, the hashtag #fentanyl can currently be searched and used on Twitter, but is not permitted on Instagram, indicating the kind of rules each platform has in place.

Automated information

When you search for a celebrity on Google, the search engine often displays a brief caption and thumbnail image associated with that person, generated automatically by Google.

Our recent research has shown how Google’s search engine normalizes conspiracy theorists, hateful figures and other controversial people by offering neutral and sometimes even positive captions. We used Virtual Private Networks (VPNs) to conceal our locations and hide our browsing histories, ensuring that search results were not shaped by our geographic location or past searches.

We found, for example, that Alex Jones, “contemporary America’s most prolific conspiracy theorist,” is defined as an “American radio host,” while David Icke, who is also known for spreading conspiracy theories, is described as a “former footballer.” Google treats these terms as the defining characteristics of these people, and they may mislead the public.

Dynamic descriptors

In the short time since our search was conducted in the fall of 2021, the search results appear to have changed.

I’ve found that some of the captions we originally identified have been changed, removed or replaced. For example, Norwegian terrorist Anders Breivik was captioned “convicted criminal,” but now no label is associated with his name.

Faith Goldy, the Canadian far-right white nationalist who was banned from Facebook for spreading hate speech, had no caption. Now, however, her new Google caption is “Canadian commentator.”

There is no indication of what she is commenting on. The same holds for American white supremacist Richard B. Spencer. Spencer had no label a few months ago, but is now an “American publisher,” which certainly does not characterize his legacy.

Another change concerns Lauren Southern, a member of the Canadian far right, who was labelled a “Canadian activist,” a rather positive term, but is now described as a “Canadian author.”

These seemingly random caption changes show that the black-box algorithms are not static, but change based on indicators still unknown to us.

Search in Arabic or English

A second important finding from our research concerns differences in caption results depending on the selected search language. I speak and read Arabic, so I changed the language setting and searched for the same figures to understand how they are described in Arabic.
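This kind of cross-language comparison can be repeated systematically. As an illustration only (this was not the method used in the study), the short “description” field returned by Google’s public Knowledge Graph Search API resembles these search captions and can be requested per language. The sketch below builds request URLs for the same name in English and Arabic; the API key is a placeholder.

```python
from urllib.parse import urlencode

# Google's public Knowledge Graph Search API endpoint.
KG_ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def build_kg_query(name: str, language: str, api_key: str) -> str:
    """Build a Knowledge Graph Search API request URL for one person
    in one language, so the returned descriptions can be compared."""
    params = {
        "query": name,
        "languages": language,  # e.g. "en" or "ar"
        "limit": 1,
        "key": api_key,         # placeholder; requires a real API key
    }
    return f"{KG_ENDPOINT}?{urlencode(params)}"

# Compare how the same figure is described in English and in Arabic:
for lang in ("en", "ar"):
    url = build_kg_query("Faith Goldy", lang, "YOUR_API_KEY")
    print(url)
    # With the `requests` library and a valid key, one could then do:
    # result = requests.get(url).json()
    # desc = result["itemListElement"][0]["result"].get("description")
```

Running the same query with only the `languages` parameter changed makes any divergence in the short descriptors immediately visible, which is essentially a scripted version of the manual reverse-engineering process described above.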

To my surprise, I found several major differences between English and Arabic. Once again, there was nothing negative in the descriptions of some of the figures I searched for. Alex Jones becomes a “TV talk show host,” and Lauren Southern is mistakenly described as a “politician.”

And there is much more in Arabic-language searches: Faith Goldy becomes an “expert,” David Icke transforms from a “former footballer” into an “author,” and Jake Angeli, the “QAnon Shaman,” becomes an “actor” in Arabic and an “American activist” in English.

When the search parameter language changes from English (left) to Arabic (right), searches for Faith Goldy show different results.
(Ahmed Al-Rawi), Author provided

Richard B. Spencer becomes a “publisher,” and Dan Bongino, a conspiracy theorist permanently banned from YouTube, goes from an “American radio host” in English to a “politician” in Arabic. Interestingly, the far-right figure Tommy Robinson is described as an “Anglo-British political activist” in English but has no Arabic caption at all.

Misleading labels

What can be deduced from these linguistic differences is that these descriptors are insufficient: they condense a person’s description into one or a few words, which can be misleading.

Understanding how algorithms work is important, especially at a time when misinformation and mistrust are on the rise and conspiracy theories are spreading rapidly. We also need more information about how Google and other search engines operate – it is important to hold these companies accountable for their biased and opaque algorithms.
