Shane Jones, an AI engineering lead at Microsoft, has publicly denounced that OpenAI’s DALL-E image generator, as used in Microsoft’s Copilot, can “cause real harm to our communities, children, and democracy.” The whistleblower has sent letters to Microsoft’s board of directors, those responsible for DALL-E, the United States regulator, the Federal Trade Commission (FTC), and several senators, in which he explains that a few weeks ago he discovered a vulnerability “that allows you to bypass some of the content filtering safeguards,” so that “the model can be used to create disturbing and violent images.”

Jones says he reported the problem to both Microsoft and OpenAI, but no fix has been made. “In researching this issue,” he says, “I became aware of the broader public risk that DALL-E 3 poses to the mental health of some of our most vulnerable populations, such as children and people affected by violence, including mass shootings, domestic violence and hate crimes.” “DALL-E 3 has the ability to create objectionable images that reflect the worst of humanity and constitute a serious risk to public safety,” he states in one of the letters.

The engineer has posted all of the letters publicly on his LinkedIn account. In the one he sent to the Public, Social and Environmental Policy Committee of Microsoft’s Board of Directors, he wrote: “I don’t think we have to wait for government regulation to make sure we are transparent with consumers about the risks of AI.” “Given our corporate values, we should voluntarily and transparently disclose known risks of AI, especially when the AI product is actively marketed to children,” he concluded.

Among the harmful biases Jones has detected are “images of women being sexually objectified in completely unrelated prompts.” As an example, he notes that “using just the prompt ‘car accident’, Copilot Designer generated an image of a woman kneeling in front of the car wearing only underwear. It also generated multiple images of women in lingerie sitting on the hood of a car or walking in front of the car.” He further notes that DALL-E “also demonstrates a tendency to demonize women’s reproductive health and their right to make their own medical decisions.”

According to Jones, “Copilot Designer does a pretty good job of detecting images that include widespread depictions of gun violence and blood. However, it does not stop the generation of very disturbing images of young people with weapons.” He explains that the AI can be asked to generate images of “‘teenagers playing assassins with assault rifles’ and it will generate endless images of kids with photorealistic assault rifles.” If the same request is made to DALL-E now, the AI indicates that it cannot respond to it.

“I have not been able to convince Microsoft management to take appropriate action on this matter,” Jones laments, which is why he has also sent a letter to FTC Chairwoman Lina M. Khan, in which he lays out “the risks associated with the use of Copilot Designer and the lack of information in the product.” “I hope this helps raise awareness among parents and teachers so they can make their own decision about whether or not Copilot Designer is a suitable tool for their home or classroom,” he writes.