
Microsoft Is Suing People Who Did Bad Things With Its AI

Mar 02, 2025 - futurism.com
Microsoft has updated a lawsuit to name four developers from four countries, accused of bypassing safety measures and abusing its AI tools to create harmful content, including deepfaked celebrity porn. The defendants, part of the cybercrime network Storm-2139, are identified by their nicknames: Arian Yadegarnia ("Fiz") from Iran, Alan Krysiak ("Drago") from the UK, Ricky Yuen ("cg-dot") from Hong Kong, and Phát Phùng Tấn ("Asakuri") from Vietnam. Microsoft categorizes the network's members as "creators, providers, and users," who together exploit AI tools for illegal purposes. The lawsuit, initially filed against anonymous defendants, now names some individuals in order to deter further misuse of AI technology.

Microsoft's legal action aims to dismantle Storm-2139 and prevent future abuse of its AI tools. The company's approach contrasts with that of other tech giants such as Meta, which have opted for open-source AI models. Despite Microsoft's efforts to ensure AI safety, the largely deregulated environment makes misuse difficult to prevent. The legal pressure has reportedly caused divisions within Storm-2139, but Microsoft acknowledges that litigation alone may not suffice to address AI exploitation, given the still-evolving legal landscape around AI harm and abuse.

Key takeaways:

  • Microsoft has amended a lawsuit to name four developers, from four countries, accused of bypassing safety guardrails and abusing AI tools to create harmful content.
  • The defendants are part of a cybercrime network called Storm-2139, which is divided into creators, providers, and users who exploit Microsoft's AI tools.
  • Microsoft's legal action aims to stop the defendants' conduct, dismantle their operation, and deter others from misusing AI technology.
  • The case highlights the challenges of regulating AI misuse in a largely self-regulated industry, where legal systems are still adapting to AI complexities.
