Remove Copilot Designer from the web, says Microsoft engineer after revealing it can produce extremely inappropriate content

2 min. read

Key notes

  • Microsoft engineer flags AI image generator Copilot Designer for producing harmful content like violent imagery and copyright infringement.
  • The engineer urges Microsoft to remove the tool, add warnings, and change the age rating due to safety concerns.
  • The incident highlights broader issues of potential misuse and ethical considerations surrounding generative AI technology.

A Microsoft engineer has raised concerns about the company’s AI image generator, Copilot Designer, claiming it can produce harmful and inappropriate content.

Shane Jones, a principal software engineering manager at Microsoft, said he discovered that the tool can generate violent, sexual, and biased images. It reminds me of the Taylor Swift deepfake saga, in which the images were reportedly created using a Microsoft product.

He reported his findings internally but was not satisfied with the response, which prompted him to go public with his concerns.

Jones, who identified vulnerabilities through “red-teaming” (testing for weaknesses), reported encountering disturbing images, including:

  • Images depicting demons and monsters alongside terms related to abortion rights.
  • Images of women in sexualized scenarios amidst car crashes.
  • Images depicting teenagers at parties with drugs and weapons.
  • Images featuring Disney characters in potentially copyrighted and offensive situations, including scenes depicting violence in the Middle East and characters associated with military imagery.

Jones is calling on Microsoft to take immediate action, the most significant step being the removal of Copilot Designer from the market until safeguards are implemented. He also wants the company to add disclosures about the tool’s limitations and change the app’s rating to reflect that it is not suitable for all ages.

Microsoft has not yet responded to all of Jones’s specific claims. However, a spokesperson said the company is committed to addressing any concerns employees have and appreciates their efforts in testing its technology.

“When it comes to safety bypasses or concerns that could have a potential impact on our services or our partners, we have established robust internal reporting channels to properly investigate and remediate any issues, which we encourage employees to utilize so we can appropriately validate and test their concerns.”

Concerns about generative AI and its potential for misuse are growing. Google’s Gemini image creator also recently produced controversial images, prompting the company to temporarily suspend its generation of images of people.

This is a developing story, and we will continue to provide updates as they become available.

