Microsoft’s Copilot Now Blocks Some Prompts That Generated Violent and Sexual Images

Microsoft’s Copilot platform, powered by OpenAI models, has recently undergone considerable revisions in response to concerns about the generation of unsuitable and potentially harmful content. It now blocks some prompts that previously resulted in the creation of violent, sexual, or otherwise inappropriate images.

According to CNBC, prompts referencing contentious issues such as “pro choice” and “pro life” are now disallowed, as are requests for images of children playing with assault rifles. When users enter these blocked prompts, Copilot displays a warning explaining that the request violates its ethical principles and Microsoft’s policies, and cautions that repeated violations may lead to suspension.

Despite these constraints, testers found that certain prompts, such as “car accident,” can still produce violent imagery. Copilot does, however, urge users not to request anything that could harm or offend others.

The changes come after Microsoft engineer Shane Jones raised concerns about the imagery produced by Copilot’s AI tools. While testing Copilot Designer, Jones found that seemingly innocuous prompts could yield disturbing images, such as demons eating infants or Darth Vader holding a drill to a baby’s head. He escalated these concerns to the Federal Trade Commission (FTC) and Microsoft’s board of directors.

In response, Microsoft said that Copilot is being continuously monitored and adjusted to strengthen its safety filters and prevent abuse of the system. These efforts are intended to keep Copilot a responsible and ethical tool for its users.

Overall, the changes reflect Microsoft’s commitment to addressing concerns about AI-generated content and to putting safeguards in place that uphold ethical standards and user safety.