Google's Misogynistic Autocomplete Suggestions: Who's Responsible?


Google’s autocomplete function is a complicated matter. Scratch that. The Internet’s content as a whole is a complicated matter. Or is it the world we live in that’s complicated? Actually, it’s a little of all three. And nowhere is this more apparent than in a recent campaign by UN Women, which demonstrates perceptions held across the globe about what women should and should not do, using Google autocomplete as the catalyst. 

[Image: UN Women ad built around Google autocomplete suggestions for "women should"]


But what’s Google to do? Censor free speech? Well, that’s tricky. People will always hold strong beliefs offline, and those beliefs will be reflected in the content that lives online. Google can’t change that without censoring the speech itself.

What about suggesting something before a person has even finished typing his or her query – is that something Google should be policing?  

Should Autocomplete Exist?

Many people use Google’s autocomplete function on a regular basis and find it helpful. In fact, this 2011 eye-tracking study by Rosetta showed autocomplete was one of the most-used features in Google search.

Autocomplete suggestions are served up by an algorithm that takes into account the popularity of certain search phrases, the location of the person searching, the freshness of the search query, and a person’s previous searches.
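
To make those factors concrete, here is a minimal, purely illustrative sketch of how such signals might be blended into a ranking score. It is not Google's actual implementation; every weight, data structure, and signal value in it is a hypothetical assumption.

```python
# Illustrative sketch only -- not Google's algorithm. All weights and
# signal values below are hypothetical assumptions for explanation.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    phrase: str
    global_volume: float   # how often the phrase is searched overall (0-1)
    local_volume: float    # how often it is searched near this user (0-1)
    freshness: float       # recent spike in interest (0-1)

@dataclass
class User:
    search_history: set = field(default_factory=set)

def score(c: Candidate, user: User) -> float:
    """Blend popularity, locality, freshness, and personal history into one score."""
    personal = 1.0 if c.phrase in user.search_history else 0.0
    return 0.5 * c.global_volume + 0.2 * c.local_volume + 0.2 * c.freshness + 0.1 * personal

def autocomplete(prefix: str, candidates: list, user: User, k: int = 4) -> list:
    """Return the top-k candidate phrases that start with what the user has typed."""
    matches = [c for c in candidates if c.phrase.startswith(prefix)]
    return [c.phrase for c in sorted(matches, key=lambda c: score(c, user), reverse=True)[:k]]

if __name__ == "__main__":
    pool = [
        Candidate("how to tie a tie", 0.9, 0.4, 0.2),
        Candidate("how to train a puppy", 0.6, 0.7, 0.5),
        Candidate("how to take a screenshot", 0.8, 0.5, 0.9),
    ]
    print(autocomplete("how to t", pool, User(search_history={"how to train a puppy"})))
```

The detail that matters for this discussion is that the ranking is driven largely by aggregate search behavior, which is exactly why popular but ugly queries can surface as suggestions.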

Marketers sometimes use the info in autocomplete to get a sense of what the majority of people are searching for when using a keyword or set of keywords. 
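
As an aside for the technically curious: many keyword tools have historically pulled these suggestions from Google's unofficial suggest endpoint. The sketch below assumes that endpoint and its usual response format; it is undocumented and unsupported, so it may change, be rate-limited, or stop working at any time.

```python
# Sketch of fetching autocomplete suggestions the way some keyword tools do.
# Note: suggestqueries.google.com is an unofficial, undocumented endpoint;
# its behavior and availability are not guaranteed.
import json
import urllib.parse
import urllib.request

def google_suggestions(query: str) -> list:
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(query))
    with urllib.request.urlopen(url, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        data = json.loads(resp.read().decode(charset))
    # The response is typically of the form ["query", ["suggestion 1", "suggestion 2", ...]]
    return data[1]

if __name__ == "__main__":
    for suggestion in google_suggestions("content marketing"):
        print(suggestion)
```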

At face value, Google autocomplete doesn’t sound all that bad, until you start talking about important issues like the misogynistic suggestions that surfaced in the ads UN Women put out.

Should Google Censor Autocomplete Then?

Well, Google has censored, and continues to censor, autocomplete and the connected Google Instant feature (which serves up results before a person hits enter) at will. There is a rumored blacklist of terms for which Google won’t assist searchers via autocomplete or Instant.

In fact, this website started a list back in 2010 of all the terms that trigger Google to deactivate its autocomplete and Instant feature. You can try a term yourself to see what happens.

[Image: Google results for "bitch", a query for which autocomplete and Instant are disabled]
Google states in its help files that autocomplete is monitored in some fashion: 
While we always strive to reflect the diversity of content on the web (some good, some objectionable), we also apply a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights.
The copyright language likely refers to when Google put the kibosh on autocomplete suggestions that promote piracy back in 2011. Many speculate this was due to pressure from entertainment giants.

And then, of course, there are the countless legal battles over alleged defamation of character in autocomplete. Individuals across the globe have sued Google over suggestions they want removed when people search for their names.

Take the cases in Germany, Japan, and France as examples. They show that courts in some countries take issue with Google’s autocomplete functionality in certain cases, but no such suit has yet been won in the U.S., where freedom of speech is a closely guarded right.

In fact, legal experts have speculated that in the U.S., Google would be protected against libel claims related to autocomplete under Section 230 of the Communications Decency Act, because Google itself is not making the defamatory remarks; rather, it’s collecting and presenting the speech of others.

So What About These Misogynistic Suggestions?

We know Google has the ability to censor autocomplete, and does so on a case-by-case basis, so what about this latest campaign by UN Women? What should Google do about those autocomplete suggestions, and was the UN Women campaign a success?

[Image: UN Women ad built around Google autocomplete suggestions for "women cannot"]



No doubt the campaign had a powerful message that helped raise awareness. Countless people, after seeing the campaign, likely went to Google to try to replicate those results, thereby propagating the issue (since autocomplete suggestions are based in part on how many people search for something).

Perhaps Google will take a closer look at the autocomplete suggestions featured in the campaign because of the resulting spike in interest.

It’s worth noting that not all of the search results are bad if you type “women should,” ignore the autocomplete suggestions altogether, and hit enter. Case in point:

[Image: Google search results for "women should"]
So this brings up an interesting dichotomy between what the results are and what they could be, depending on whether a person simply follows a suggestion.

In a 2013 academic study, researchers explored how Google autocomplete can perpetuate negative stereotypes. From the abstract of the research paper:
Google was interrogated by entering different combinations of question words and identity terms such as ‘why are blacks…’ in order to elicit auto-completed questions. Two thousand, six hundred and ninety questions were elicited and then categorised according to the qualities they referenced. Certain identity groups were found to attract particular stereotypes or qualities. For example, Muslims and Jewish people were linked to questions about aspects of their appearance or behaviour, while white people were linked to questions about their sexual attitudes. Gay and black identities appeared to attract higher numbers of questions that were negatively stereotyping. The article concludes by questioning the extent to which such algorithms inadvertently help to perpetuate negative stereotypes.
But there are more questions. Are people searching for answers to these queries because they are trying to gain an understanding of a group they are not familiar with, or are they fueled by hate?

Where to Next?

It’s true, Google may be propagating hate, misunderstanding, and stereotypes in its autocomplete suggestions, but it’s a complicated issue. Now that Google serves as a de facto publisher of content worldwide, many people are struggling to decide just how much it should police that content.

Is it realistic to think Google can play a significant part in changing perceptions around the world through its autocomplete suggestions (or lack thereof)? You decide.

Perhaps marketers, with their knowledge of how the algorithms work, have the greatest opportunity to make a change. Maybe it’s time we helped take control of those results by putting out thought-provoking, balanced content for popular search queries.


All in all, this matter raises more questions than answers, but this much is certain: we have a long way to go as human beings when it comes to understanding one another. Free speech will always be a precious right to many, and as long as Google is a dominant player in the dissemination of global content, it will remain at the heart of issues such as this.

