How Google’s AutoComplete Predictions Encouraged Transphobic Searches

Clearly, Google’s automated systems have a habit of failing to detect discriminatory suggestions. Rather than making more systemic changes, the company appears to have made quick fixes to its algorithms, filtering out specific classes of searches that violate its policies.

“While we automatically filter out many violations, we have identified an issue where, unfortunately, our systems were not working as expected,” a Google spokesperson told me in a statement three days after I initially requested comment. “We have removed these predictions and associated predictions under our policies.”

Google’s spokesperson said its system was designed to identify terms associated with “sensitive characteristics like gender,” but that some search terms, like “before” and “surgery,” weren’t “recognized by those classifiers as related to gender.” (As of this writing, terms such as “before surgery,” “sex at birth,” “is [X] on hormones,” and “birth name” still appear in the suggestions for many trans celebrities.)

The Google spokesperson noted that implementing a universal fix for terms like “before” or “birth name” could make search suggestions less helpful when searching for cisgender celebrities.
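To make that trade-off concrete, here is a minimal, hypothetical sketch of term-based prediction filtering. Nothing here reflects Google’s actual implementation; the term lists, function names, and example predictions are invented purely for illustration.

```python
# Hypothetical sketch of denylist-style filtering of autocomplete
# predictions. The terms, names, and examples below are invented;
# this is not Google's actual system.

SENSITIVE_TERMS = {"before surgery", "birth name", "sex at birth"}

def filter_predictions(predictions):
    """Drop any prediction containing a denylisted phrase."""
    return [p for p in predictions
            if not any(term in p.lower() for term in SENSITIVE_TERMS)]

# The over-blocking problem: adding a broad term like "before" to the
# list also suppresses legitimate predictions about cisgender celebrities.
BROAD_TERMS = {"before"}

def overbroad_filter(predictions):
    return [p for p in predictions
            if not any(term in p.lower() for term in BROAD_TERMS)]

print(overbroad_filter([
    "actor before fame",      # legitimate prediction, wrongly suppressed
    "actor before surgery",   # policy-violating prediction, suppressed
]))  # -> []  (both predictions removed)
```

A filter this coarse has exactly the failure mode the spokesperson describes: the only lever is which strings go on the list, so each choice trades under-blocking against over-blocking.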

The official statement from Google’s spokesperson went on to say that the company is “working to implement stronger automated protections” to address specific concerns like these, though it did not provide specifics on how it would implement those protections or say whether it planned to consult with the trans community in creating them.

But such reactive fixes are bound to be fragile, says researcher Os Keyes, a Ph.D. candidate in the Department of Human-Centered Design and Engineering at the University of Washington.

“Precisely because Google relies on after-the-fact reports to reform its autocomplete system, which queries get blocked and when depends in part on the power and visibility of the affected population,” Keyes says. “If you are a small community that is often dismissed or overlooked in decision-making processes, as [trans people] are, you are at the back of the queue.”

Google itself, as its spokesperson would be the first to remind you, did not deliberately code transphobic results into its suggestions. The company’s past statements about problematic search suggestions tend to point out that automated results reflect what people are searching for.

On the one hand, it is true that Google is not responsible for the discriminatory discourse that its search results frequently reflect. “In a white supremacist, cisnormative, heterosexist, ableist, fatphobic, capitalist, and colonial society, people’s searches reflect all of these forms of structural and cultural inequalities,” explains scholar Sasha Costanza-Chock, author of Design Justice and director of research at the Algorithmic Justice League.
