Lizzie Clark

Slopsquatting Supply Chain Threat

In this blog series we spotlight one of the stories from our cybersecurity newsletter, Beacon

Security researchers are raising concerns about a potential supply chain cybercrime tactic involving Generative AI, called “Slopsquatting.” The technique exploits a known flaw in GenAI tools – hallucinations, where the AI generates false or non-existent information. In software development, this can include entirely fabricated open-source package names.

Socket reports that many developers now rely on GenAI tools like ChatGPT and GitHub Copilot to assist with coding. These tools can write code directly or recommend packages to include in a project. The issue arises when the AI suggests packages that don’t actually exist. According to the research, when the same prompt was run ten times, 43 percent of hallucinated packages appeared every time, while 39 percent never showed up again.

“Overall 58 percent of hallucinated packages were repeated more than once across ten runs,” the report notes, “indicating that a majority of hallucinations are not just random noise, but repeatable artifacts of how the models respond to certain prompts.”

At this stage, Slopsquatting is theoretical, with no known attacks. However, the threat is real – cybercriminals could monitor GenAI hallucinations, identify the most commonly suggested fake packages, and register them on open-source repositories. Unsuspecting developers could then be tricked into downloading and using malicious software.
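One simple mitigation is to check AI-suggested dependencies against a vetted allowlist before installing anything. The sketch below is a minimal, hypothetical illustration of that idea – the package names, the `flag_unvetted` helper, and the allowlist are all invented for the example, not part of Socket’s research:

```python
# Hypothetical defensive check: compare AI-suggested package names against a
# team-maintained allowlist of vetted dependencies before running `pip install`.

def flag_unvetted(suggested, vetted):
    """Return the suggested package names that are absent from the vetted set.

    Package-name comparison is case-insensitive, matching how most registries
    treat names. Anything flagged should be verified by a human before install.
    """
    vetted_lower = {name.lower() for name in vetted}
    return [pkg for pkg in suggested if pkg.lower() not in vetted_lower]

# Example: "fastjson-utils" is a made-up name standing in for a hallucinated
# package an AI assistant might suggest.
vetted = {"requests", "numpy", "flask"}
suggested = ["requests", "fastjson-utils", "numpy"]
print(flag_unvetted(suggested, vetted))  # ['fastjson-utils']
```

A check like this catches only names outside the allowlist; it cannot tell a hallucinated package from a legitimate new one, so flagged names still need manual review against the registry itself.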

If you’d like the latest dark web news and insights delivered into your inbox every Thursday at 10am, sign up to the email version of BEACON.