When it was released back in 2004, autocomplete quickly became one of the most loved features provided by the search giant. It helped streamline search, and it became so good at predicting what users wanted to find that it could sometimes feel invasive.
There’s a fake news problem taking advantage of the suggestion feature behind autocomplete. When enough users search for certain phrases, those phrases feed into Google’s autocomplete suggestions, which can surface problematic searches for everyone else. A new update promises to address this misuse and provide a better-quality search experience.
Google attempts to predict what you want the moment you begin to type into the search bar, or into your Chrome browser. When you search for a celebrity’s name, you might have noticed the word “death” appearing in the autocomplete suggestions. Even if you know that celebrity isn’t dead, the suggestion persists because it’s part of a hoax that enough users keep clicking on.
There’s also the problem of hate speech, where fabricated quotes are attributed to real people or historical events are described inaccurately. Fake facts like these are a serious problem for the general public.
Policing the Community
Project Owl will allow the community to report offensive queries directly to Google, contributing feedback to the massive database of terms that drives Google search. Users will be able to mark suggestions as offensive or fake, giving Google a more accurate, real-time picture of what’s happening on the Web.
The question is whether these tools have a downside we haven’t thought of, such as users reporting terms that aren’t offensive but merely clash with their ideologies. Businesses might also report competitors’ names as offensive in an attempt to diminish a rival’s presence in search.
Bio: Reputation Stars was founded by Pierre Zarokian with the goal of providing better reputation management solutions that help clients recover.