Sometimes even typing an innocuous query into a search engine like Google can yield a huge number of results that are sexually explicit in nature, like pornography. In 2019, the American actress Natalie Morales ran a Google search for “latina teenage girl” and found that virtually all of the results were pornographic, even though it is not a search term that would necessarily be used only to find content of that type.
With all of that out of the way, it is important to note that Google has reduced explicit search results by approximately 30% across various terms thanks to an AI model called BERT. This model attempts to determine whether someone is actually looking for explicit results or just regular ones, so queries like the aforementioned “latina teen”, as well as others like “university dormitory”, will now mostly surface non-explicit results.
One interesting thing to note here is that searching for “teenager” without additional descriptors would likely return the kind of ordinary results you’d expect, such as resources for teens with mental health issues, general articles on teenage life and the like. Overly sexualized results for Latina teens, by contrast, can be harmful, because they risk sexualizing an ethnic minority, including its underage members.
Google has been trying to make its search results less explicit for some time now; for example, in 2013 it changed its systems so that results for “Black girl” would no longer be largely pornographic. The tech giant is also working on other forms of AI, like MUM, which is more focused on areas such as suicide prevention, and steps like these should make Google a considerably better search engine all things considered.
Read next: Google addresses problems with the spread of misinformation by introducing new tools