
AI hallucinates software packages and devs download them – even if potentially poisoned with malware

Mar 28, 2024 - theregister.com
Security researcher Bar Lanyado has demonstrated how AI-generated "hallucinations" of software packages can be exploited to distribute malware. After noticing that AI models repeatedly recommended a non-existent package called huggingface-cli, Lanyado published a harmless package under that name. It was subsequently downloaded thousands of times, including by major businesses such as Alibaba. The experiment showed how easily a hallucinated package name could be turned into a delivery channel for malicious code, with potentially disastrous results had the package been laced with malware.

Lanyado's research revealed that a significant percentage of AI-generated package names were persistent, meaning they were repeatedly recommended by the AI models. This persistence is key for potential attackers, as it increases the chances of the fake packages being downloaded. Lanyado's fake package, for example, received over 15,000 authentic downloads in three months. The research highlights the potential security risks posed by AI-generated software advice and the need for increased vigilance in the tech industry.

Key takeaways:

  • Several large businesses, including Alibaba, have unknowingly incorporated a fake software package, hallucinated by generative AI, into their source code.
  • Security researcher Bar Lanyado turned the hallucinated software package into a real one as an experiment, which was then downloaded thousands of times by developers.
  • Lanyado's experiment aimed to explore the persistence of hallucinated software packages and the potential for them to be used to distribute malicious code.
  • The experiment revealed that the names of hallucinated packages are persistent enough to be a functional attack vector, particularly in Python and Node.js languages.
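The attack described above works because developers install AI-recommended packages without checking whether the name corresponds to an established project. A minimal defensive sketch, using PyPI's public JSON metadata endpoint (`https://pypi.org/pypi/<name>/json`): flag any recommended name that does not exist on the index, or that has almost no release history. The `min_releases` threshold is an illustrative heuristic, not an established standard.

```python
import json
import urllib.error
import urllib.request

PYPI_URL = "https://pypi.org/pypi/{name}/json"


def fetch_pypi_metadata(name):
    """Return PyPI's JSON metadata for `name`, or None if the package
    does not exist on the index (HTTP 404)."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name)) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None
        raise


def looks_suspicious(metadata, min_releases=3):
    """Heuristic check: a package that is absent from the index, or has
    almost no release history, deserves manual review before installing.
    `min_releases` is an arbitrary illustrative threshold."""
    if metadata is None:
        return True  # hallucinated name, or freshly squatted
    return len(metadata.get("releases", {})) < min_releases
```

A wrapper like this could vet an AI-suggested dependency before it ever reaches `pip install`; it is only a first filter, since an attacker can pad a squatted package with fake releases.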
