The secret lives of Google raters

(credit: The IT Crowd / BBC)
Something disturbing has been happening to Google’s advertising algorithms.

These are the programs responsible for placing ads in appropriate contexts: serving up travel-related ads to people searching for hotels, or music-related ads to people watching the latest Beyoncé video.

But in the UK, government ads for the Royal Navy, the Home Office, and Transport for London recently ran before YouTube videos featuring Holocaust-denying pastor Steven Anderson, who enthusiastically endorsed the man who killed 49 people in Florida’s gay nightclub Pulse.

According to the UK government, its taxpayer-sponsored ads also ran on videos from “rape apologists” and on white supremacist speeches from David Duke.
Google’s business immediately took a hit: prominent European ad agencies cut ties with the company, while AT&T and Verizon pulled all of their video ad buys.

Acknowledging the gravity of the problem, Google assured advertisers and users that it would make sure no ads ran alongside “upsetting-offensive” content.

The company said it was unleashing its army of over 10,000 raters, people who work around the clock to make sure Google’s algorithms don’t return results that are unhelpful, offensive, or downright horrific.
Who are these raters? They’re carefully trained and tested staff who can spend up to 40 hours per week logged into a system called Raterhub, which is owned and operated by Google.

Every day, the raters complete dozens of short but exacting tasks that produce invaluable data about the usefulness of Google’s ever-changing algorithms.

They contribute significantly to several Google and Android projects, from search and voice recognition to photos and personalization features.