Google autocomplete helps legitimize conspiracy theorists, study says

BURNABY, British Columbia — Google’s autocomplete feature is meant to make internet browsing and the retrieval of accurate information easier, but is this seemingly convenient feature helping fuel conspiracy theories and mislead the public? Unfortunately, the answer is yes, according to researchers at Simon Fraser University.

Study authors report that Google’s autocomplete algorithms often attach oversimplified, innocuous subtitles to prominent conspiracy theorists. For example, classifying Alex Jones as an “American radio host” may be technically true, but it leaves out a major portion of the story. Researchers argue this trend may seem minor at first, but it could be misleading countless internet users and even helping amplify extremist views.

Another example is Gavin McInnes, the creator of the neo-fascist Proud Boys organization. Officials consider the group a terrorist entity in Canada and a hate group in the United States, yet Google’s algorithm displays a subtitle for Mr. McInnes that reads “Canadian writer.”

Jerad Miller killed multiple people in a 2014 Las Vegas shooting. Google’s algorithm says he was an “American performer.”

In collaboration with The Disinformation Project at the School of Communication at SFU, the research team analyzed the automatic subtitles displayed by Google for 37 alleged conspiracy theorists. They found “in all cases, Google’s subtitle was never consistent with the actor’s conspiratorial behavior.”

No way to change Google’s algorithms?

We’re not just talking about one website or even one social media platform here. Google is synonymous with the internet itself at this point. Considering the sheer volume of daily traffic on Google’s servers, study authors worry the subtitles “can pose a threat by normalizing individuals who spread conspiracy theories, sow dissension and distrust in institutions and cause harm to minority groups and vulnerable individuals,” according to Nicole Stewart, an instructor of communication and PhD student on The Disinformation Project.

For what it’s worth, Google says those subtitles are generated automatically by a series of complex algorithms. In other words, the company does not accept or create custom subtitles manually.
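The study itself did not involve any code, but curious readers can inspect these machine-generated labels directly. The sketch below (not the researchers’ method) queries Google’s public Knowledge Graph Search API and prints the short “description” field returned for a name; that field is generally the same text shown as a subtitle in search results, though the assumption that it mirrors the autocomplete subtitle in every case is ours. The API key is a placeholder, and the example names are simply those cited in this article.

```python
# Minimal sketch: look up the short Knowledge Graph "description" (subtitle)
# that Google returns for a given name. Requires the `requests` package and a
# real API key from the Google Cloud Console; YOUR_API_KEY is a placeholder.
from typing import Optional

import requests

ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"
API_KEY = "YOUR_API_KEY"  # placeholder, not a real credential


def get_subtitle(name: str) -> Optional[str]:
    """Return the Knowledge Graph description for the top match on a name, if any."""
    params = {"query": name, "key": API_KEY, "limit": 1}
    resp = requests.get(ENDPOINT, params=params, timeout=10)
    resp.raise_for_status()
    items = resp.json().get("itemListElement", [])
    if not items:
        return None
    return items[0].get("result", {}).get("description")


if __name__ == "__main__":
    # Names taken from the examples mentioned in this article.
    for person in ["Alex Jones", "Gavin McInnes"]:
        print(f"{person}: {get_subtitle(person)}")
```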

Researchers explain that these subtitles are universally either neutral or positive – but never negative, even when it would be appropriate.

“Users’ preferences and understanding of information can be manipulated upon their trust in Google search results, thus allowing these labels to be widely accepted instead of providing a full picture of the harm their ideologies and belief cause,” says Nathan Worku, a Master’s student on The Disinformation Project, in a university release.

This study focused specifically on conspiracy theorists, but study authors say similar results appear when searching for widely known terrorists or mass murderers.

“This study highlights the urgent need for Google to review the subtitles attributed to conspiracy theorists, terrorists, and mass murderers, to better inform the public about the negative nature of these actors, rather than always labelling them in neutral or positive ways,” researchers conclude.

The study is published in M/C Journal.
