LLMS Central - The Robots.txt for AI
Web Crawling

When bots look like buyers: agentic traffic causing new publisher headaches

Digiday · 2 min read

Original Article Summary

The real issue is measurement: without a clear way to separate agentic visitors from humans, some buyers are getting jittery — and a few are already pulling ad spend.

Read full article at Digiday

Our Analysis

Digiday's report on agentic traffic highlights the difficulty of measuring human versus bot visits. Without a clear way to separate agentic visitors from humans, buyers are getting jittery, and some are already pulling ad spend. Because agentic traffic can mimic human behavior, publishers struggle to distinguish genuine buyers from automated agents, which inflates traffic numbers, undermines confidence in ad metrics, and ultimately reduces ad revenue.

Website owners can take several steps to mitigate this:

1. Keep llms.txt and robots.txt policies current, so that AI crawlers which honor those files are clearly declared and, where desired, blocked.
2. Use traffic-analysis tooling (user-agent classification, IP-range verification, behavioral signals) to better separate human visitors from agentic ones.
3. Work with ad buyers to agree on metrics and guidelines that explicitly account for the impact of agentic traffic on reported performance.
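As a starting point for the traffic-analysis step, a first-pass classifier can flag requests whose user-agent string matches a self-identifying AI crawler. This is a minimal sketch, not a complete solution: the bot list below is illustrative rather than exhaustive, and agentic browsers that spoof ordinary browser user agents will slip through, which is exactly the measurement gap the article describes.

```python
# Minimal sketch: flag requests from self-identifying AI crawlers by
# user-agent substring matching. The agent list is illustrative and
# incomplete; spoofed or headless-browser agents will not be caught.

KNOWN_AI_AGENTS = [
    "GPTBot",           # OpenAI crawler
    "ClaudeBot",        # Anthropic crawler
    "Google-Extended",  # Google AI training opt-out token
    "PerplexityBot",    # Perplexity crawler
    "CCBot",            # Common Crawl
]


def classify_user_agent(user_agent: str) -> str:
    """Return 'ai-agent' if the UA matches a known AI crawler, else 'unclassified'."""
    ua = user_agent.lower()
    for bot in KNOWN_AI_AGENTS:
        if bot.lower() in ua:
            return "ai-agent"
    return "unclassified"


def traffic_breakdown(user_agents):
    """Count requests per class for an iterable of user-agent strings."""
    counts = {"ai-agent": 0, "unclassified": 0}
    for ua in user_agents:
        counts[classify_user_agent(ua)] += 1
    return counts


if __name__ == "__main__":
    sample_log = [
        "Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/124.0",
        "Mozilla/5.0 (compatible; ClaudeBot/1.0)",
    ]
    print(traffic_breakdown(sample_log))
```

In practice this should be combined with reverse-DNS or published-IP-range verification, since user-agent strings are trivially forged; substring matching only establishes a lower bound on agentic traffic.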

Related Topics

Bots

Track AI Bots on Your Website

See which AI crawlers like ChatGPT, Claude, and Gemini are visiting your site. Get real-time analytics and actionable insights.

Start Tracking Free →