Microsoft’s AI failure: it created a poll about the tragic death of a woman
Microsoft published a distasteful poll generated by an AI system. Beneath an article about the discovery of a woman’s body, readers were asked what they thought had caused her death.
Artificial intelligence is gradually making its way into the media, but every so often problems with its use in content creation make headlines. This time, an automated system dealt an embarrassing blow to Microsoft and a renowned newspaper.
AI generated a poll speculating about the woman’s death
Microsoft’s MSN news site published an article about the discovery of a young woman’s body at a school in Sydney. Police found the body in a bathroom at the facility, where she had worked as a water polo coach.
However, the artificial intelligence did not grasp how sensitive the topic was and tried to engage readers more deeply. “What do you think could have caused the woman’s death? Murder, accident or suicide,” read the question that appeared beneath the article.
The backlash was swift. “This is one of the most pathetic and disgusting surveys I have ever seen. The author should be ashamed of himself. It’s nice to know that we can vote in the poll on how this woman died,” Internet users wrote in the comments.
As you might guess, the poll was quickly deleted. A note beneath it had indicated that it was the result of “artificial intelligence observations”. The AI itself cannot feel shame; the people responsible for deploying it are another matter.
The Guardian blames Microsoft – the company addresses its problematic AI
The report originally came from the British newspaper The Guardian and was then aggregated on MSN. In response, Guardian Media Group CEO Anna Bateson sent a letter to Microsoft president Brad Smith.
Bateson points out that the poll could not only be deeply upsetting to the family of the deceased, but could also cause “significant reputational damage” to both the Guardian and the journalists responsible for the article.
“This is clearly an inappropriate use of (generative AI) by Microsoft, and in a potentially painful story published in the public interest,” the Guardian’s CEO argues in the letter, which was obtained by The Verge.
The Guardian emphasized that although Microsoft is licensed to republish the paper’s content on MSN, the publisher had previously asked Microsoft not to apply experimental AI features to its published content without the publisher’s express consent.
A Microsoft spokesperson said in a statement that the company has deactivated the polls displayed alongside articles and is investigating the cause of the inappropriate content. “A survey should not appear next to an article of this nature. We will take steps to prevent similar errors in the future,” the representative promised.