AI systems generate a meaningful share of internet activity by turning a single user request into thousands of automated actions. Instead of visiting a few sites, these systems scan and retrieve information across vast numbers of sources in seconds.
Cloudflare CEO Matthew Prince said this pattern could push bot traffic past human traffic by 2027, driven by generative AI systems that continuously gather data at volumes far beyond typical browsing.
This change is about behavior as much as volume.
A person may visit a handful of sites to complete a task. An AI agent can query thousands of sources in seconds to deliver a single answer. That multiplier effect is already placing new demands on infrastructure.
Unlike earlier surges in usage, such as during the pandemic, this pattern shows steady growth with no clear plateau.
Why It Matters: AI-driven traffic puts constant pressure on infrastructure, makes it harder to determine the intent behind requests, and erodes the reliability of traditional traffic metrics as automated systems take a larger role in how information is accessed online.
- AI Agents Are Expanding Web Activity Far Beyond Human Patterns: Tasks such as shopping or research can trigger automated systems to scan thousands of pages. Prince described a case where a human might visit five sites, while an AI agent could visit 5,000. Each request adds load on servers, turning a single action into a large wave of backend activity. This creates a mismatch between visible user demand and actual system strain, forcing operators to plan for traffic that no longer correlates with human behavior.
- Bot Traffic Is Becoming the Majority of Internet Activity: Before generative AI, bots made up about 20% of internet traffic, led by search engine crawlers. Other bots were often tied to spam or fraud. Now, AI agents from major technology companies are driving a surge in automated requests, collecting data and completing tasks on behalf of users. These systems are legitimate and useful, which makes them harder to limit without affecting real user experiences.
- Traffic Growth Is Continuous and Does Not Level Off: During COVID-19, internet usage rose quickly and then stabilized. The current pattern continues to climb over time. Cloudflare, which supports a significant share of websites, sees ongoing increases across its network, adding sustained pressure on servers and data centers. This steady growth requires long-term investment in capacity instead of short-term scaling fixes.
- New Infrastructure Models Are Needed to Support AI Activity: Prince described a system where lightweight computing environments, or sandboxes, can be created and removed instantly. These environments allow AI agents to run task-specific code and then shut down. At full adoption, millions could be created every second. This approach would change how compute resources are allocated, moving toward highly dynamic execution tied to individual tasks.
- Access to Information Is Becoming Mediated by AI Agents: Users may rely on systems that gather and summarize information instead of visiting websites directly, reducing direct interaction between people and content. This changes how traffic is counted and makes it harder to understand what engagement actually represents, while also reshaping how organizations reach and serve users in a system where machines act as intermediaries.
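The sandbox model described above can be sketched in code. This is an illustrative simplification, not Cloudflare's actual system: the names `ephemeral_sandbox` and `run_task` are hypothetical, and a real implementation would isolate code execution, not just files. But the lifecycle is the same: an environment is created on demand for a single task, the task runs inside it, and the environment is destroyed immediately afterward, so nothing persists between requests.

```python
import contextlib
import shutil
import tempfile


@contextlib.contextmanager
def ephemeral_sandbox():
    """Create an isolated scratch directory for one task, then destroy it.

    A minimal stand-in for a lightweight execution environment:
    create on demand, run one task, tear down immediately.
    """
    workdir = tempfile.mkdtemp(prefix="agent-task-")
    try:
        yield workdir
    finally:
        shutil.rmtree(workdir)  # nothing persists after the task completes


def run_task(task_id: int) -> str:
    # Each incoming agent request gets its own short-lived environment.
    with ephemeral_sandbox() as workdir:
        # Task-specific code would execute here, confined to workdir.
        return f"task {task_id} ran in {workdir}"


# Three tasks means three sandboxes, each created and removed in turn.
results = [run_task(i) for i in range(3)]
```

The design point is that compute allocation becomes per-task rather than per-server: at the scale Prince describes, millions of such environments could be created and destroyed every second.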
Go Deeper -> Online bot traffic will exceed human traffic by 2027, Cloudflare CEO says – TechCrunch
Bot Traffic to Overtake Humans Online by 2027, Cloudflare Warns – The Tech Buzz
Trusted insights for technology leaders
Our readers are CIOs, CTOs, and senior IT executives who rely on The National CIO Review for smart, curated takes on the trends shaping the enterprise, from GenAI to cybersecurity and beyond.
Subscribe to our newsletter, published four times a week, to keep up with the insights that matter.