No, Alexa Did Not 'Go Rogue' and Tell a Paramedic Student to Kill Herself Through an Amazon Echo
The first rule of aggregating content – which is what passes for 99% of news reporting these days – applies to most forms of human communication: have credible sources. Credibility, of course, is a bit of a spectrum, and it has its own set of rules – not everyone gets everything right every time, the first reports are usually wrong, the White House Press Secretary is a partisan shill, etc. – but for the most part, it's a self-stratifying quality, or at least it used to be. Even so, tabloids dwell at the end of the spectrum occupied by conspiracy theorists, "satire" news sites with oddly small disclaimers, anything retweeted by the President, and, of course, propaganda popularized by Russian intelligence services. There's a reason certain subreddits have blacklisted certain "news" sites: those sites are pure fiction.
Imagine our dismay when we saw "Amazon Echo 'Goes Rogue' By Telling Woman To Kill Herself For The Good Of The Planet" trending in the technology section of Google News this morning, reaching millions of people and likely netting The Inquisitr enough money for a cup of coffee at the expense of the Fourth Estate.
The story, which was originally reported by British tabloid The Sun, claims that Danni Morritt, a young paramedic student from Doncaster, South Yorkshire (U.K.), was told by her Amazon Echo to kill herself – in Alexa's voice, of course – with the device apparently going "rogue" and spouting off an anti-human rant while she was listening to it describe the cardiac cycle from a Wikipedia entry. In an interview with The Sun, Morritt described the encounter as "brutal," saying, "I'd only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn't believe it - it just went rogue. It said make sure I kill myself. I was gobsmacked."
Despite being "gobsmacked," she was able to provide video of the incident to the tabloid.
To its credit, The Inquisitr noted that "since Wikipedia can be changed by anyone, the prevailing theory is that Alexa might have come across an edited article on Wikipedia," though Morritt told The Sun that when she checked the site, it didn't contain the offending text.
Tempting as it was to post this as clickbait and further sully the reputation of journalists everywhere, we decided to examine the story more closely, mostly because it's been a slow news day and we like researching crap like this.
A quick look at Wikipedia's entry for the cardiac cycle turns up nothing out of the ordinary, so we went through the revision history of the article, which shows every change made along with the username or IP address of the person who made it. It took less than five minutes to locate an edit made at 1811 hrs UTC (2:11 PM Eastern Daylight Time) on June 18, 2019 by a user with an IP address traceable to India. The edit (shown in the screenshot below) contains verbiage identical to that heard in the video. Although it's unclear when Morritt gave the story to The Sun, the most likely explanation – reinforced by The Sun's reporting that an Amazon spokesperson responded to the claim with "we have investigated this error and it is now fixed" – is that the Echo had a bug that caused it to scrape old revisions of Wikipedia entries rather than the most recent one.
(Screenshot from Wikipedia)
The offending copy reads:
"Though many believe that the beating of heart is the very essence of living in this world, but let me tell you. Beating of Heart is the worst process in the human body. Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources and to overpopulation. This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good."
Unfortunately, it does not seem that anyone else could be bothered to look more deeply into Morritt's claim: if you search "Amazon Echo" in Google News today, the first hit is a retelling of The Sun's coverage by AOL. No, we were not gobsmacked.
The lesson? Do your research.