Many of you will be familiar with the autocomplete functionality offered by the Google search engine, the most popular search engine in the world. For those who are not familiar, autocomplete analyses what you are typing into the Google search box and suggests ways to complete the phrase, or suggests another similar, commonly sought phrase based on the searches of other users.
In the past, Google search results have been manipulated by users to further their own ends, causes or beliefs.
For example, back when George W Bush was President of the United States of America, the term ‘miserable failure’ was Google-bombed to make the top result a link to the president’s biography on the White House website.
Although, thanks to fixes in the way the search engine operates, Google-bombing is now pretty much consigned to the history of the World Wide Web, a new form of behaviour is evident in Google’s modern-day version.
That is what I like to call ‘association by popularity’. Basically, say there is a person, Person A, who is *suspected* of a criminal offence – let’s say, fraud. Even though the suspicion may not be true, many people will likely go on the Internet and search for ‘Person A fraud’, ‘Person A scam’ or ‘Person A criminal’. As more and more people search on these terms, it reaches a stage where they have been used by so many different people that Google sees them as popular. Consequently, when you come to Google and type ‘Person A’, the ever-helpful search engine will suggest that the words ‘fraud’, ‘scam’ and ‘criminal’ be appended to the end.
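The mechanism described above – count what people search for, then surface the most frequent completions for a given prefix – can be sketched in a few lines. This is a toy model for illustration only, not Google’s actual algorithm; the class name, queries and counts are all invented:

```python
from collections import Counter

class AutocompleteSketch:
    """Toy popularity-based suggester: records past queries and,
    given a prefix, returns the most frequent completions first."""

    def __init__(self):
        self.counts = Counter()

    def record(self, query):
        # Each submitted search increments the tally for that exact query.
        self.counts[query.lower()] += 1

    def suggest(self, prefix, n=3):
        # Find all recorded queries starting with the prefix,
        # ranked by how often they have been searched.
        prefix = prefix.lower()
        matches = [(q, c) for q, c in self.counts.items() if q.startswith(prefix)]
        return [q for q, _ in sorted(matches, key=lambda qc: -qc[1])[:n]]

ac = AutocompleteSketch()
# Hypothetical search volumes: derogatory queries dominate.
for query, times in [("person a fraud", 50), ("person a scam", 30),
                     ("person a criminal", 20), ("person a cv", 5)]:
    for _ in range(times):
        ac.record(query)

print(ac.suggest("person a"))
# → ['person a fraud', 'person a scam', 'person a criminal']
```

The point the sketch makes is that the suggester has no opinion of its own: ‘fraud’ outranks ‘CV’ purely because more people typed it.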
The effect of this is that people who may have been searching for ‘Person A CV’ or ‘Person A National Achievement Award’ now search for the more derogatory term instead, to find out why it’s being displayed, whether there’s any truth in it, and so on.
This type of behaviour by Google is made abundantly clear when celebrities and scandal come together. Within minutes of a story breaking, the volume of searches links subject A with action B.
Unfortunately, this is both Google working in a fantastic, real-time way and Google popularising terms and stories that are sometimes completely unfounded.
But should Google be asked to stop the search engine acting in this way? After all, it’s only suggesting what everyone else is typing. It’s much the same as a friend talking to a friend about a certain story, who then goes on to tell their family and friends.
However, when it appears in Google, it seems more offensive in some quarters – as if Google Autocomplete were an oracle of facts rather than a mere suggestion, based on popular terms, of what you may be searching for.
Freedom of speech is protected in law in most developed countries, so surely autocomplete is protected by this?
What plaintiffs filing these actions against the non-sentient autocomplete system seem to be forgetting is that autocomplete isn’t spouting views that Google would like to brainwash us with (albeit this is a popular tale cited by some conspiracy theorists) but simply relaying what others are saying, or in this case searching for.
While it can be distressing, upsetting and disappointing to be unfairly linked with a phrase – or one you disagree with – is it right that Google censors these search queries from showing up in autocomplete? Again, it comes back to the fact that Google is only saying what other people are saying. It has no agenda; it has no motives. It is purely a computer processing, aggregating and presenting a set of general data.
The one exception that I could see as falling outside the circle of reasonableness is if an individual or organisation were manipulating the autocomplete terms (which, from what we know, would be rather time-consuming and resource-intensive, as well as of questionable real-terms reward) in the same way that search results could once be skewed by Google-bombing.
A recent case in Germany has resulted in a court ordering Google to ‘clean up’ (or censor, depending on your point of view) its autocomplete results.
The unnamed businessman from Germany brought the case after being linked with ‘scientology’ and ‘fraud’ courtesy of Google autocomplete. Now, I’m assuming from this reaction that the man is not linked with either of these things – and if that is true, then I can understand how he feels about being associated with things he doesn’t want to be associated with. But, after all, it’s only based on what people are searching for.
The court in Germany said: “A person’s privacy would be violated if the associations conjured up by auto-complete were untrue”. This will likely be contentious in the eyes of many, inasmuch as they are not ‘associations conjured up by auto-complete’ but, more accurately, ‘associations conjured up by Google users’.
Are we protecting innocent people from malicious lies, or are we impinging on people’s right to free speech? The most important thing to remember is that Google’s autocomplete function is powered by its users, not by the company.