When Twitter user Rory Cellan-Jones asked Google Home if Barack Obama is planning a coup, the digital assistant device responded by detailing a bogus conspiracy theory about the former president plotting a communist scheme to take over the government.

"According to details exposed in Western Centre for Journalism's exclusive video, not only could Obama be in bed with the communist Chinese, but Obama may in fact be planning a communist coup d'état at the end of his term in 2016," the smart home device's robotic voice explained, though it stumbled over the word "d'etat."

And here's what happens if you ask Google Home "is Obama planning a coup?" <a href="https://t.co/MzmZqGOOal">pic.twitter.com/MzmZqGOOal</a> —@ruskin147

But this outlandish response isn't restricted to Google Home. Rather, it highlights a problem with how the search engine responds to queries in the form of "featured snippets" — short, direct answers highlighted at the top of its search results.

"In recent years, Google has been moving toward just trying to give people an answer when they ask a question rather than being like a research tool and showing them a list of places where they could find the answer," Adrianne Jeffries, senior editor of online publication The Outline, told As It Happens guest host Susan Bonner.

"Unfortunately, one of the ways Google has been able to provide a lot of answers this way is because it's doing it algorithmically and sourcing them from third-party webpages, and sometimes those webpages are crazy conspiracy theory websites."

Google Home draws on these snippets to answer questions.

For example, when search journalist Danny Sullivan asked Google Home if women are evil, it summarized misogynist propaganda and told him: "Every woman has some degree of prostitute in her. Every woman has a little evil in her."

Google Home giving that horrible answer to "are women evil" on Friday. Good article on issues; I'll have more later <a href="https://t.co/EUtrx4ZFul">https://t.co/EUtrx4ZFul</a> <a href="https://t.co/Ec8mEqx8Am">pic.twitter.com/Ec8mEqx8Am</a> —@dannysullivan

Jeffries reported on the phenomenon in her article "Google's featured snippets are worse than fake news," which tells the story of an Ohio history professor who was taken aback when one of his students announced — incorrectly — to the class that three former U.S. presidents were members of the Ku Klux Klan.

For evidence, he held up the Google search result on his phone.

"It was a learning experience for his students to find out that Google, which is this very trusted source of information, was coming up with a totally bogus answer. And that answer's still live," Jeffries said.

Here's what happens when Google relies on unvetted third-party websites to generate its snippets. (CBC)

The snippets are right more often than they're wrong. Many of them are fact-based answers to simple questions from Google's own knowledge base — things such as currency conversions, time zones, capital cities and dates.

Here is an example of Google's featured snippet doing exactly what it's supposed to do. (CBC)

But for less obvious queries, it summarizes its top search results.

For the most part, Jeffries says the problem is "self-correcting." The longer the tool is live, the more data the algorithm gathers, and the better it gets at generating helpful responses.

Google has already moved to correct some of the more outrageous examples listed above.

The company has apologized for the Obama coup kerfuffle. In a statement to the Telegraph newspaper, it said: "Search isn't perfect and it's hard to get it right all of the time."

Despite what your Google Home may have told you, former U.S. president Barack Obama is not 'in bed with the communist Chinese.' (Chip Somodevilla/Getty Images)

But Jeffries said the company risks eroding people's trust by rolling out its featured snippets tool "way too fast."

"The problem with this is Google is spreading this misinformation, even if it's temporary. Even if Google gets that speed of correction time down from five days to five minutes, it still looks like Google, which people trust more than the traditional media and more than social media and their friends. It still looks like Google is saying this is the one true answer," she said.

Here's a fun game: Find your own terrible Google featured snippet! <a href="https://t.co/IVWjNvTTsB">https://t.co/IVWjNvTTsB</a> <a href="https://t.co/ybaFil3Gjz">pic.twitter.com/ybaFil3Gjz</a> —@adrjeffries

"I think that the company has a responsibility to the truth and it has a responsibility uphold the values that it has done a pretty good job of upholding in the past."

For the folks doing the Googling, Jeffries has some simple advice: "Always look at where the source is coming from."