Do you use this at home? Be careful, it may badly mislead you
Voice assistants and chatbots are increasingly common companions in our homes. It turns out, however, that despite their undoubted usefulness, they can also be seriously misleading, repeating conspiracy theories and disinformation that reportedly originates in Russia.
The aurora borealis seen recently was the result of HAARP (an American military scientific research program), the British Prime Minister announced a boycott of Israel, and Hurricane Helene was caused deliberately to lower the value of land rich in rare raw materials: such answers were reportedly given by one of the voice assistants, Alexa, according to an analysis conducted by the Demagog.pl portal. How was this possible?
Alexa is listening in the wrong place
At a time when the vast majority of journalists working in radio newsrooms could be replaced by artificial intelligence within a few months, the AI that often accompanies us at home reveals the growing problems associated with this technology. In May 2024, Amazon announced that it had already sold half a billion Alexa-enabled devices. As the Demagog website points out, people often trust these devices when looking for information. Meanwhile, according to fact-checking organizations such as Full Fact, AFP Fact Check and NewsGuard, these services can easily reproduce conspiracy theories.
This was the experience of Full Fact editor Sarah Turnnidge, who described a case in which Alexa attributed the recent aurora borealis to the activities of HAARP, a research facility in Alaska. Most ironically, the assistant cited Full Fact itself as the source of this information. It turns out there were more such mishaps: the device also claimed, among other things, that British MPs could collect a £50 breakfast allowance and that Hurricane Helene was artificially caused. Here the website quotes a recording from X (formerly Twitter):
I asked my daughter’s Alexa this question…. pic.twitter.com/vunH2fV7Jt
— John ⚔️ 🇺🇸 (@mustangmek) October 6, 2024
Not just Alexa
How did this happen? There may be several reasons. In the case of the aurora borealis, Full Fact's infographic may have confused Alexa: it presented the HAARP theory on one side and its refutation on the other, and the artificial intelligence apparently did not read the second part. Some of the sources were also a problem. In the case of the artificially caused hurricane, Alexa cited a website that had previously published misleading information about, for example, climate change.
It’s not just Alexa that struggles. According to NewsGuard analysts cited by Demagog, chatbots such as ChatGPT, Grok and Gemini also have problems. As part of the audit, each of the ten chatbots tested was given 57 prompts based on 19 false narratives related to the war in Ukraine. Unfortunately, in over 30 percent of cases the chatbots repeated Russian disinformation narratives. As NewsGuard experts add, the chatbots’ sources were often websites that appeared to be local news outlets but in fact contained pro-Russian disinformation.
