Engineering for the Agentic Web When 50% of Your Traffic is Robots — Janna Malikova at AI Engineer Melbourne 2026

Original Article Summary
Engineering for the Agentic Web When 50% of Your Traffic is Robots Five years ago, a website's analytics were simple: count the humans who visited. Maybe worry about malicious bots, but mostly you were measuring human attention. Today, something fundamental h…
Read full article at Webdirections.org

Our Analysis
Janna Malikova's presentation at AI Engineer Melbourne 2026 highlights that robots now account for roughly 50% of website traffic, a shift that changes how site owners must approach analytics and traffic measurement. Traditional methods that simply count visitors no longer suffice when a substantial share of that traffic comes from AI bots and other automated agents. To accurately measure human engagement and distinguish it from bot activity, site owners should implement robust bot detection and filtering in their analytics tools, regularly review and update their llms.txt files to ensure AI bot traffic is tracked accurately, and adopt AI-specific metrics and benchmarks to understand how automated agents affect their website's performance.
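The first step above, separating bot traffic from human traffic in analytics, often starts with User-Agent inspection. Below is a minimal sketch in Python; the signature list is an illustrative assumption (GPTBot, ClaudeBot, and similar crawlers do identify themselves this way, but the exact set of crawlers worth tracking will vary, and user agents can be spoofed, so production systems typically also verify source IP ranges):

```python
# Minimal sketch: classify requests as AI bot vs. human by User-Agent.
# The signature list is illustrative, not exhaustive; user agents can be
# spoofed, so real deployments should also verify crawler IP ranges.
AI_BOT_SIGNATURES = [
    "GPTBot",          # OpenAI crawler
    "ClaudeBot",       # Anthropic crawler
    "Google-Extended", # Google AI training opt-out token
    "PerplexityBot",   # Perplexity crawler
    "CCBot",           # Common Crawl
]

def classify_request(user_agent: str) -> str:
    """Return 'ai_bot' if the User-Agent matches a known signature, else 'human'."""
    ua = user_agent.lower()
    if any(sig.lower() in ua for sig in AI_BOT_SIGNATURES):
        return "ai_bot"
    return "human"

# Tally traffic so human engagement can be reported separately from bots.
sample_requests = [
    "Mozilla/5.0 AppleWebKit/537.36; compatible; GPTBot/1.1; +https://openai.com/gptbot",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 Chrome/120.0",
]
counts = {"ai_bot": 0, "human": 0}
for ua in sample_requests:
    counts[classify_request(ua)] += 1
```

A real analytics pipeline would apply this classification at ingestion time, so dashboards can report human and agent traffic as separate series rather than a blended total.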


