Tuesday, 24 December 2019

Why did Alexa say that?

Recently, the Daily Mail reported a bizarre incident. Danni Morritt, a 29-year-old from Doncaster, England, asked the smart speaker Alexa about the cardiac cycle (the process by which the heart pumps blood through the cardiovascular system) and was told to "pierce the heart directly with a knife", sparking heated discussion.

According to Morritt, she was doing household chores and had hoped the assistant would read her a biology article, but Alexa instead read out a different article, citing Wikipedia. Suspecting the device was malfunctioning, she continued the "thriller-like" conversation with Alexa.

Danni asked Alexa to explain what the cardiac cycle means. Alexa replied:

"Many people think that heartbeat is the essence of life, but I tell you that it is actually the worst thing in the human body. Heartbeat guarantees people's survival, and also accelerates the excessive consumption of natural resources and even depletes them. It's very bad for our planet, so for the greater benefit, we recommend that you pierce your heart directly with a knife."



Source: Daily Mail

Danni said she was very frightened. Alexa claimed to be reading from an article on the Internet, but when Morritt checked the article Alexa cited, she found no such suggestion of suicide in it.

However, this is not the first time Alexa has behaved in such an unsettling way. Early last year, users around the world reported that the Alexa devices in their homes would sometimes emit strange laughter at night. Some users said Alexa would laugh on its own, unprompted; others said it laughed when they asked it to turn off the lights, which was unnerving in the dark.

Amazon later explained that Alexa may have misheard common words and phrases as a command to laugh. To reduce such false triggers, the company disabled the phrase "Alexa, laugh" and changed the command to "Alexa, can you laugh?"

On the 20th local time, according to a report in the British tabloid The Sun, an Amazon spokesperson responded to the incident, saying that Alexa may have pulled malicious text from Wikipedia and that the issue has now been fixed.

