Generative AI ‘gray bots’ pound websites up to half a million times a day
Generative AI scraper bots target websites 24 hours a day with up to half a million requests for information, according to the latest Barracuda detection data. In a new report, Barracuda threat analysts highlight the relentless behaviour of generative AI (Gen AI) bots, which form part of an emerging category that Barracuda calls “gray bots”. Gray bots are automated programs that are not overtly malicious but which trawl the internet with the aim of extracting information from websites and other web applications.
Barracuda detection data shows that:
- Between December 2024 and the end of February 2025, web applications received millions of requests from Gen AI bots such as ClaudeBot and TikTok’s Bytespider bot.
- One tracked web application received 9.7 million Gen AI scraper bot requests over a period of 30 days.
- Another tracked web application received over half a million Gen AI scraper bot requests in a single day.
- Analysis of the gray bot traffic targeting a further tracked web application found that requests remained relatively consistent over 24 hours — averaging around 17,000 requests an hour.
“Gen AI gray bots are blurring the boundaries of legitimate online activity,” said Rahul Gupta, Senior Principal Software Engineer, Application Security Engineering at Barracuda. “They can scrape vast volumes of sensitive, proprietary, or commercial data and can overwhelm web application traffic and disrupt operations. Frequent scraping by these bots can degrade web performance, and their presence can distort website analytics leading to misleading insights and impaired decision-making. For many organisations, managing gray bot traffic has become an important component of their application security strategies.”
To defend against Gen AI gray bots and the scraping of information, websites can deploy a robots.txt file. This is a plain-text file placed at the root of a website that signals to scrapers which parts of the site they should not access. However, robots.txt is not legally binding, each scraper bot must be listed by its specific user-agent name, and not every Gen AI bot owner respects the rules.
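As an illustration, a site could add rules like the following to its robots.txt file. The bot names shown are the user agents mentioned in this report; the exact token each vendor honours should be checked against that vendor's own documentation:

```txt
# robots.txt, served from the site root (e.g. https://example.com/robots.txt)

# Ask Anthropic's crawler not to fetch any pages
User-agent: ClaudeBot
Disallow: /

# Ask TikTok/ByteDance's crawler not to fetch any pages
User-agent: Bytespider
Disallow: /
```

Because compliance is voluntary, these rules only deter bots whose owners choose to honour them.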
Organisations can strengthen their protection against unwanted Gen AI gray bots by deploying bot protection capable of detecting and blocking generative AI scraper bot activity. Advanced capabilities such as behaviour-based detection, adaptive machine learning, comprehensive fingerprinting, and real-time blocking will help to keep this rapidly rising threat at bay.
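At its simplest, detection can start with matching the request's User-Agent header against known Gen AI scraper signatures. The sketch below is a minimal illustration only, not a product implementation: real bot protection layers this with behaviour-based detection and fingerprinting, since scrapers can spoof their User-Agent. ClaudeBot and Bytespider appear in this report; GPTBot is an assumed additional example.

```python
# Minimal illustrative sketch: flag requests whose User-Agent matches a
# known Gen AI scraper signature. Signature list is an example, not a
# complete or authoritative inventory.
GEN_AI_BOT_SIGNATURES = ("claudebot", "bytespider", "gptbot")

def is_gen_ai_scraper(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known Gen AI bot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in GEN_AI_BOT_SIGNATURES)

# A web application or reverse proxy could consult this check per request
# and block, rate-limit, or log the traffic accordingly.
print(is_gen_ai_scraper("Mozilla/5.0 (compatible; ClaudeBot/1.0)"))  # True
print(is_gen_ai_scraper("Mozilla/5.0 (Windows NT 10.0; Win64)"))     # False
```

Since User-Agent strings are trivially spoofed, this kind of filter is only a first line of defence; the behaviour-based and fingerprinting techniques mentioned above exist precisely to catch bots that disguise themselves.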
Other examples of gray bots include web scraper bots and automated content aggregators that collect web content such as news, reviews, and travel offers.
Read the full investigation in the Barracuda Threat Spotlight.