On April 25, Google announced a series of updates to its search engine, which it claims will promote more “authoritative content” while reducing the prominence of fake or offensive pages.
In the blog post announcing the changes, the company acknowledged its role in the proliferation of fake news, noting that it has “become very apparent that a small set of queries in our daily traffic (around 0.25 percent) have been returning offensive or clearly misleading content, which is not what people are looking for”.
In December, the company’s image received a serious blow when its search engine’s first response to the query ‘did the Holocaust happen?’ was an article entitled ‘Top 10 Reasons Why the Holocaust Didn’t Happen’.
It has become starkly apparent that the power tech giants like Facebook and Google have to filter the information people consume places a huge responsibility on their shoulders. Concern over the fake news phenomenon came to a head during the US presidential election, when it became clear that politically charged false content was being widely read and that many readers were actively falling for the stories. By some measures, fake news even outperformed real news in the run-up to the election.
In an effort to tackle the issue, Google claims it has adjusted the ‘signals’ its search tool uses. The company believes this tweak, while presented in an extraordinarily vague way, will be effective in reducing the likelihood of unsettling results such as last year’s Holocaust denial scandal.
Google is also looking to rely more heavily on direct feedback tools. According to its blog post announcement: “Starting today, we’re making it much easier for people to directly flag content that appears in both ‘autocomplete’ predictions and ‘featured snippets’.” This change will create more clearly labeled categories so people can directly complain if they come across sensitive or unhelpful content. Such feedback will then be fed back into Google’s algorithms.
A final change the company presented is an improvement to its search quality rater guidelines, which involves a process of experimentation using in-house ‘evaluators’ who assess the quality of search results and provide feedback. This process, according to Google, will help the company weed out misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories.