Designers have to move from the surface to the substrate

Original Article Summary
Designers once controlled 85% of the user experience through interfaces and interactions. Now, with intelligence moving beneath the surface, that control has collapsed to 5%, while the real design decisions happen in training data, system prompts, and model behavior…
Read full article at Suffsyed.com

Our Analysis
The shift in design control from surface-level interactions to underlying substrates like training data and model behavior has significant implications for website owners. As AI bots become more prevalent, understanding how they interact with and interpret website content is crucial. With designers now influencing only a small fraction of the user experience directly, website owners must adapt to keep their sites discoverable and compatible with AI-driven systems. To stay ahead, consider the following strategies:

- Review your website's llms.txt file to ensure it accurately reflects your content and intentions.
- Optimize your site's metadata to improve AI bot comprehension.
- Invest in AI literacy training for your design and development teams to better navigate this substrate-driven landscape.

By taking these proactive steps, website owners can maintain visibility and control in a world where surface-level design is no longer the primary driver of user experience.
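For the llms.txt review mentioned above, a minimal sketch may help. The proposed llms.txt convention is a markdown file served at the site root that gives AI crawlers a concise map of your content; the site name, descriptions, and URLs below are hypothetical placeholders, not taken from the article:

```markdown
# Example Site

> A short, plain-language summary of what this site offers, written
> for machine readers as much as human ones.

## Docs

- [Getting started](https://example.com/docs/start): Setup and first steps
- [API reference](https://example.com/docs/api): Endpoints and parameters

## Optional

- [Blog](https://example.com/blog): Longer-form articles and announcements
```

The file is plain markdown by design, so it can be hand-edited and versioned alongside the rest of the site.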


