Lanyado's research revealed that a significant percentage of AI-generated package names were persistent, meaning the models recommended the same nonexistent names across repeated queries. This persistence is what makes the attack practical: a name that keeps resurfacing keeps steering developers toward whatever package an attacker registers under it. Lanyado's proof-of-concept package, for example, received more than 15,000 authentic downloads in three months. The research highlights the security risks posed by AI-generated software advice and the need for increased vigilance in the tech industry.
Key takeaways:
- Several large businesses, including Alibaba, have unknowingly incorporated a fake software package, hallucinated by generative AI, into their source code.
- As an experiment, security researcher Bar Lanyado registered the hallucinated package name and published a real, harmless package under it, which developers then downloaded thousands of times.
- Lanyado's experiment aimed to explore the persistence of hallucinated software packages and the potential for them to be used to distribute malicious code.
- The experiment revealed that the names of hallucinated packages are persistent enough to be a functional attack vector, particularly in the Python and Node.js ecosystems (one possible defensive check is sketched below).
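The attack only succeeds when developers install an AI-suggested name without checking it, so one cheap defense is to vet a candidate name against the registry before running `pip install`. Below is a minimal, illustrative Python sketch, not part of Lanyado's research, that queries PyPI's public JSON API; the `vet_package` helper, its 90-day threshold, and its verdict strings are assumptions chosen for illustration.

```python
import sys
from datetime import datetime, timezone

import requests  # third-party: pip install requests

PYPI_JSON_API = "https://pypi.org/pypi/{name}/json"

def vet_package(name: str, max_age_days: int = 90) -> str:
    """Return a coarse verdict for a candidate PyPI package name."""
    resp = requests.get(PYPI_JSON_API.format(name=name), timeout=10)
    if resp.status_code == 404:
        # Unregistered: installing it would fail today, but an
        # attacker could claim the name at any time.
        return "unregistered (possible hallucination)"
    resp.raise_for_status()
    data = resp.json()

    # Earliest upload time across all releases approximates the
    # package's age; brand-new packages deserve extra scrutiny.
    uploads = [
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in data["releases"].values()
        for f in files
    ]
    if not uploads:
        return "registered but has no uploaded files"
    age_days = (datetime.now(timezone.utc) - min(uploads)).days
    if age_days < max_age_days:
        return f"recently created ({age_days} days old) -- inspect before installing"
    return f"established ({age_days} days old)"

if __name__ == "__main__":
    for pkg in sys.argv[1:]:
        print(pkg, "->", vet_package(pkg))
```

A check like this would not catch Lanyado's package once it aged past the threshold, which is the point of his experiment: registry presence alone is weak evidence of legitimacy, so download counts, maintainer history, and source inspection still matter.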