
OpenAI's Bot Crushes Seven-Person Company's Website 'Like a DDoS Attack' - Slashdot

Jan 11, 2025 - tech.slashdot.org
Triplegangers, an e-commerce company specializing in 3D image files of human models, experienced a significant disruption when a bot from OpenAI scraped its website, causing what CEO Oleksandr Tomchuk likened to a distributed denial-of-service (DDoS) attack. The bot made tens of thousands of server requests in an attempt to download the site's extensive content, including hundreds of thousands of photos and their detailed descriptions. The activity overwhelmed the site, which hosts over 65,000 product pages, and drove up operational costs, including a likely higher AWS bill. The incident highlighted the vulnerability of websites without a properly configured robots.txt file, which Triplegangers initially lacked, leaving the bot free to scrape at will.

After Triplegangers updated its robots.txt file and added further defenses such as Cloudflare, the scraping stopped. Still, the episode underscores how difficult it is for small businesses to protect their digital assets, since AI companies' compliance with robots.txt is voluntary. Tomchuk advises other online businesses to proactively monitor their traffic for unauthorized access in order to safeguard their content. The incident serves as a cautionary tale about the risks posed by AI bots and the importance of maintaining robust website defenses.
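
The article does not reproduce the directives Triplegangers added, so the following is only an illustrative sketch: a minimal robots.txt along these lines blocks OpenAI's documented crawler, GPTBot, from an entire site (the CCBot entry for Common Crawl is an extra example, not something the article mentions). Honoring these rules remains voluntary on the crawler's side, which is why the company also put Cloudflare in front of the site.

    # Illustrative sketch only -- not Triplegangers' actual robots.txt.
    # Block OpenAI's documented crawler from the whole site.
    User-agent: GPTBot
    Disallow: /

    # Other AI crawlers can be blocked the same way (example: Common Crawl).
    User-agent: CCBot
    Disallow: /

    # All other crawlers remain allowed by default.
    User-agent: *
    Allow: /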

Key takeaways:

  • Triplegangers' e-commerce site was disrupted by a bot from OpenAI, which generated DDoS-like load while attempting to scrape the entire site.
  • The attack involved "tens of thousands" of server requests to download hundreds of thousands of photos and detailed descriptions, using 600 IP addresses.
  • Triplegangers' website, which hosts a large database of 3D image files, was knocked offline, and the company anticipates increased AWS costs due to the bot's activity.
  • The absence of a properly configured robots.txt file initially allowed the bot to scrape the site freely, highlighting the need for website owners to proactively monitor for and block unauthorized access (a minimal log-audit sketch follows this list).
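
The article does not describe Triplegangers' monitoring setup, so this is a hypothetical sketch of the kind of proactive check the last takeaway suggests: it tallies requests per User-Agent in a standard combined-format web-server access log so that heavy automated crawlers such as GPTBot stand out.

    #!/usr/bin/env python3
    """Hypothetical monitoring sketch (not Triplegangers' actual tooling):
    tally requests per User-Agent in a combined-format access log so that
    heavy automated crawlers stand out."""
    import re
    import sys
    from collections import Counter

    # Combined log format lines end with: "referer" "user-agent"
    UA_PATTERN = re.compile(r'"[^"]*" "(?P<ua>[^"]*)"\s*$')

    def top_user_agents(log_path: str, limit: int = 10) -> list[tuple[str, int]]:
        counts: Counter = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = UA_PATTERN.search(line)
                if match:
                    counts[match.group("ua")] += 1
        return counts.most_common(limit)

    if __name__ == "__main__":
        # "access.log" is a placeholder path, not one named in the article.
        path = sys.argv[1] if len(sys.argv) > 1 else "access.log"
        for user_agent, hits in top_user_agents(path):
            print(f"{hits:8d}  {user_agent}")

Run against a day's log, a report like this would surface a single user agent responsible for tens of thousands of requests, which is the pattern Tomchuk describes.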