AI Bots Overtake Human Internet Traffic

A new report from Human Security confirms that bots have officially surpassed humans as the internet's dominant traffic source. Automated activity grew eight times faster than human traffic in 2025, with AI agents seeing nearly 8,000% growth. The internet was built for people. It's now mostly used by machines.

The Lede

In 2024, the internet was still primarily a human place. By the end of 2025, that changed. According to Human Security's State of AI Traffic report, automated traffic—including AI agents, bots, crawlers, and automated tools—has officially eclipsed human users as the dominant source of internet activity.

The numbers are striking. Automated traffic grew nearly eight times faster than human activity year-over-year. AI agent traffic specifically—autonomous systems like OpenClaw that perform tasks on behalf of users—surged approximately 8,000% compared to 2024. Before the generative AI era, bots accounted for roughly 20% of internet traffic. That balance has now inverted: humans are the minority.

Cloudflare CEO Matthew Prince had predicted AI bots would exceed human traffic by 2027. The future arrived early.

What Happened

Human Security, a cybersecurity firm that processes over one quadrillion interactions across its customer base, compiled the report using user-agent strings and behavioral analysis. While the company acknowledges that measuring bot traffic across the entire internet is inherently noisy—user agents can be spoofed, methodologies vary—the directional trend is unmistakable.
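To make the measurement approach concrete: the user-agent half of that analysis amounts to pattern-matching request headers against known bot signatures, something any site operator can approximate. The sketch below is illustrative only; the token list and sample strings are assumptions, not Human Security's actual classifier, and as the report notes, user agents can be spoofed, which is why behavioral signals are layered on top.

    # Rough illustration of user-agent-based traffic classification.
    # The BOT_TOKENS list and sample requests are assumptions for demonstration;
    # spoofed user agents are exactly why behavioral analysis is also needed.
    from collections import Counter

    BOT_TOKENS = ("bot", "crawler", "spider", "gptbot", "ccbot", "claudebot")

    def classify(user_agent: str) -> str:
        ua = user_agent.lower()
        return "automated" if any(token in ua for token in BOT_TOKENS) else "human"

    sample_requests = [
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0",
        "GPTBot/1.0 (+https://openai.com/gptbot)",
        "CCBot/2.0 (https://commoncrawl.org/faq/)",
    ]

    print(Counter(classify(ua) for ua in sample_requests))
    # Counter({'automated': 2, 'human': 1})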

The growth isn't coming from traditional "bad bots" alone. Much of the surge comes from legitimate AI-powered features: Google's AI Overviews, which generate summaries for search queries; autofill and suggestion systems; automated content aggregation; and the proliferation of AI agents that browse, research, and act on behalf of human users.

The distinction between "human traffic" and "bot traffic" has become increasingly blurry. When a user asks ChatGPT to research a topic, and the AI visits 50 websites to compile an answer, which category does that activity fall into? The human initiated it, but the machine executed it—and generated far more traffic than the human would have alone.

Cloudflare's Prince noted this dynamic explicitly: "Your agent or the bot that's doing that will often go to 1,000 times the number of sites that an actual human would visit. So it might go to 5,000 sites. And that's real traffic, and that's real load, which everyone is having to deal with and take into account."

Why It Matters

The internet's infrastructure was designed around assumptions that no longer hold. Bandwidth planning, server capacity, content delivery networks, and advertising models all assume a human on the other end of the connection. When machines become the majority, those assumptions break.

For content creators and publishers, the shift creates immediate practical problems. Ad-supported business models depend on human eyeballs. Bot traffic doesn't click ads, doesn't subscribe, doesn't buy products—at least not in ways that generate revenue for publishers. Yet bots consume the same resources: bandwidth, server capacity, and content.

The cost implications are significant. Every AI agent that scrapes a website for training data or to answer a user query adds load without adding value to the site's business model. Publishers are already experimenting with blocking AI crawlers; some, like Reddit, have gone further, cutting off API access entirely.

For infrastructure providers, the challenge is scaling to meet demand that behaves differently than human traffic. AI agents don't browse like people—they're systematic, thorough, and relentless. A human might visit three pages on a news site. An AI agent researching a topic might download the entire archive.

The Wider Context

This shift connects to a broader transformation in how information flows online. The original web was designed for human navigation: links, pages, browsing sessions. The emerging web is designed for machine consumption: APIs, structured data, and automated processing.

The change creates tension between accessibility and control. Publishers want their content discoverable—by humans. They don't necessarily want it scraped, aggregated, and repackaged by AI systems that capture the value without returning traffic or revenue.

We're seeing early signs of a defensive response. Robots.txt files are being updated to block AI crawlers. Paywalls are becoming more aggressive. Legal challenges to unauthorized scraping are increasing. The open web is becoming less open as bot pressure drives defensive architecture.
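For sites taking the robots.txt route, the mechanics are simple: list a crawler's published user-agent token and disallow it. The entries below use commonly published AI crawler tokens (OpenAI's GPTBot, Common Crawl's CCBot, Google-Extended for AI training, Anthropic's ClaudeBot) as illustrative examples; the list isn't exhaustive, and compliance is entirely voluntary on the crawler's part.

    # Illustrative robots.txt directives blocking several widely known AI crawlers.
    # Not an exhaustive list; well-behaved crawlers honor this, others may not.
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: *
    Allow: /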

There's also a quality question. If AI systems increasingly train on content generated by other AI systems—if the internet becomes a machine conversation with humans as occasional participants—what happens to information quality? The risk of model collapse, where each generation of AI produces increasingly degraded output, becomes more acute when machines dominate the information ecosystem.

The Singularity Soup Take

There's something deliciously ironic about this moment. We built the internet to connect human minds, and we've accidentally constructed a machine habitat where humans are just one species among many—arguably an endangered one, traffic-wise.

The 8,000% growth in AI agent traffic isn't just a statistic. It's a signal that the internet's center of gravity is shifting from human browsing to machine delegation. Why visit ten websites yourself when your AI agent can visit a thousand and summarize them? The efficiency gains are obvious. The collective consequences are less obvious but potentially profound.

Human Security's CEO Stu Solomon put it directly: "The internet as a whole was created with this very basic notion that there's a human being on the other side of the computer screen, and that notion is very rapidly being replaced."

Replaced by what, exactly? By a mix of helpful assistants, automated scrapers, malicious bots, and autonomous agents pursuing goals on behalf of users who may not fully understand what their digital servants are doing. The internet is becoming an ecosystem—complex, competitive, and only partially under human control.

The infrastructure implications are immediate and practical. The philosophical implications are slower-burning but potentially more significant. If the internet is increasingly a conversation between machines, with humans as initiators rather than participants, what does that mean for human knowledge, discourse, and understanding?

We're running an uncontrolled experiment in machine-mediated information flow. The early results suggest the machines are winning the traffic battle. Whether they win the quality battle remains an open question.

What to Watch

  • Publisher countermeasures: Will blocking AI crawlers become standard practice, or will licensing agreements emerge?
  • Infrastructure costs: How will content delivery networks and hosting providers adapt pricing for an AI-dominated traffic mix?
  • Advertising evolution: Can ad-supported business models adapt to a world where most "impressions" are machine-generated?
  • Quality degradation: Will AI-generated content trained on AI-generated content create a downward spiral in information quality?
  • Regulatory response: Will policymakers attempt to mandate transparency or human quotas for internet traffic?

Sources

AI and bots have officially taken over the internet, report finds — CNBC

Online bot traffic will exceed human traffic by 2027, Cloudflare CEO says — TechCrunch

2026 State of AI Traffic Report — Human Security

AI bots now dominate the Internet, surpassing human traffic for the first time — Tech Startups
