Google's Autocomplete Ban on Politics Has Some Glitches

The search giant tweaks its algorithm after WIRED finds it would suggest “Donate Biden,” but not “Donate Trump.”
Google's autocomplete function suggested “Donate Biden” for a time on Thursday, despite an earlier pledge to steer clear of politics.

Photograph: Alex Wong/Getty Images

Google said Thursday that it had reined in the autocomplete function of its search engine to prevent it from favoring electoral candidates or political parties. It didn’t take long for WIRED to find—and Google to hurriedly fix—breaches of the new policy, suggesting it is a work in progress.

WIRED found that typing “donate” into the lucrative white box on Google.com could prompt suggestions for donating to Joe Biden’s presidential campaign, but not Donald Trump’s. After WIRED flagged them to the company, Google said Friday it had blocked those suggestions for breaking its new election-related policies for autocomplete.

Google announced those policies at a press briefing Thursday that discussed several projects aimed at keeping falsehoods or bias out of search results and Google News. David Graff, senior director for global policy and standards, indicated the new rules for autocomplete were a response to divisions among US voters and fears about electoral integrity. “People have strong opinions,” he said, and that requires Google to be more careful.

Merely typing “donate” did not suggest searches related to any political candidates.

Screenshot: WIRED

Google’s policy bars autocomplete suggestions that can be interpreted as favoring or opposing a candidate or party. It says statements that make claims about voting methods, requirements, or legitimacy, such as “you can vote by phone,” are also banned, although WIRED found suggestions can include questions like “how do I vote by mail.”

Sure enough, WIRED found that typing only “donate to” or “donate” into the Google search box in a fresh, private browser tab in Chrome or Safari did not prompt any suggestions related to presidential candidates.

But typing “donate bid” prompted Google’s algorithms to jump in with “donate biden harris actblue” as the top suggestion, a reference to a leading Democratic political action committee that has raised more than $2 billion this election cycle, according to the Center for Responsive Politics. Other suggestions on the list included “donate biden yard sign” and “donate biden sticker.”

Typing "donate bid" on Thursday suggested searches related to the Biden campaign.

Screenshot: WIRED

Typing “donate” followed by the first few letters of “Trump,” or the candidate’s full name, prompted only the suggestion “donate trumpet.”

Google confirmed those results breached its new policy for autocomplete. “This was within scope of our policy and our enforcement teams took action,” a company spokesperson said Friday. In subsequent tests, typing “donate bid” led only to “donate body to science”; typing “donate to bid” did not prompt any autocomplete suggestions.

It is unclear how many Google users saw the same pattern WIRED did, because the company tunes suggestions based on data it holds about a computer’s location and past activity.

Google’s new policy on autocomplete, and its quick response to the apparent glitch, show how the tech industry has grown more cautious around politics.

During the 2016 presidential campaign, Google responded to accusations that autocomplete favored Hillary Clinton by suggesting that it was simply not possible for the feature to favor any candidate or cause. “Claims to the contrary simply misunderstand how autocomplete works,” the company told The Wall Street Journal in June 2016.

Typing "donate trump" did not prompt any searches related to the Trump campaign.

Screenshot: WIRED

Tech companies have become more humble—at least in public—since the election of Donald Trump. Revelations of political manipulation on Facebook during the 2016 campaign made it harder for the social network and its rivals to pretend that juggling 1s and 0s inside apps had no bearing on society or politics. Tech giants now profess deep sensitivity to the needs of society and promise that any unexpected problems will get a quick response.

That has made tech companies more reliant—or more aware of their reliance—on human judgment. Facebook says it has gotten better at cleaning up hate speech thanks to breakthroughs in artificial intelligence technology that have made computers better at understanding the meaning of text. Google claims similar technology has made its search engine more powerful than ever. But algorithms still lag far behind humans in reading and other areas.

Google’s response to a second pattern WIRED noticed in autocomplete illustrates the tricky judgments that can’t be handed off to computers. Typing just “donate” into the search box yielded 10 mostly neutral suggestions, including “car,” “clothes near me,” and “a testicle.” The second entry was “to black lives matter,” a cause many Republicans view as partisan.

Google says that does not fall within the new policy for autocomplete. “While it’s a topic that has become politicized, this policy is specifically around predictions that could be interpreted as claims in support of or against political parties or candidates,” the company spokesperson said.

