Traffic Down 90%: The Impact of Large AI Models

Over the past year, a growing number of websites have experienced a sudden, precipitous decline in organic traffic. Fifty-percent drops have become increasingly prevalent, and in extreme cases, traffic has plummeted by as much as 90%. In most instances, this downturn is not attributable to technical glitches or subpar content quality, but to a more profound structural shift: large AI models have fundamentally reshaped the distribution of search traffic.
Today’s AI systems:
  • Answer user queries directly
  • Generate large-scale content summaries
  • Intercept user intent prior to click-through
The traffic has not vanished; it has been absorbed upstream before users ever click through. For businesses dependent on SEO, the challenge is no longer merely achieving high rankings, but surviving and maintaining visibility within an AI-mediated search ecosystem.
Addressing this challenge demands two parallel strategies: adapting to AI in scenarios where exposure yields benefits, and defending against AI where it threatens business models or data sovereignty.

Part 1: How to Adapt to Large AI Models

Adapting to AI does not mean composing content exclusively for machines. Rather, it entails crafting content that is both user-centric and machine-parsable. In other words, your content must:
  • Remain clear, intuitive, and trustworthy for human readers
  • Be structured to enable AI systems to easily parse meaning, context, and conceptual relationships
In practice, maximizing both human readability and AI recognition means pairing clear prose with explicit structure: descriptive headings, defined terms, and stated relationships between concepts. This lets AI systems map how ideas connect while readers follow the argument without effort; one markup-level way to expose that structure is sketched below.
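A common, widely supported way to expose this structure to machines without changing what readers see is schema.org data embedded as JSON-LD. The snippet below is only an illustrative sketch: the headline, author, date, and citation URL are placeholders, and the listed entities would be tailored to the actual article.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "How AI Overviews Reshape Organic Traffic",
      "author": { "@type": "Person", "name": "Jane Doe", "jobTitle": "Head of SEO" },
      "datePublished": "2025-01-15",
      "about": [
        { "@type": "Thing", "name": "Large language model",
          "sameAs": "https://en.wikipedia.org/wiki/Large_language_model" },
        { "@type": "Thing", "name": "Search engine optimization",
          "sameAs": "https://en.wikipedia.org/wiki/Search_engine_optimization" }
      ],
      "citation": "https://example.com/research/ai-traffic-study"
    }
    </script>

Crawlers and AI systems that read schema.org markup can then associate the page with those entities and their relationships rather than guessing from keywords alone.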
E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) has evolved from an abstract guideline into a non-negotiable practical requirement. To optimize for AI recognition:
  • Cite authoritative sources to bolster credibility
  • Substantiate claims with credible, verifiable data
  • Prioritize depth over volume; a small number of high-quality articles outperform dozens of superficial pieces
Optimization has also shifted from a keyword-centric approach to an entity-centric one. AI interprets meaning through entities and their contextual relationships, rather than isolated terms. To thrive in this environment, content should:
  • Explicitly articulate relationships, use cases, and constraints
  • Elaborate on why a topic matters and the specific conditions under which it applies

Part 2: How to Defend Against AI’s Threats to Business Models and Data Sovereignty

Not all AI exposure is beneficial. In many cases, AI-generated summaries:
  • Diminish click-through rates
  • Undermine monetization efforts
  • Extract value without providing any form of compensation
Thus, a defensive strategy is just as critical as an adaptive one.
Key defensive measures include the following:

Avoiding AI Overview Triggers

Research, including Ahrefs' analysis of 146 million search results, indicates that certain query categories are less prone to AI summarization. These include shopping and product comparisons, real estate searches, local queries, time-sensitive content, and sports-related searches. While not definitive, this insight offers guidance on where SEO value remains more defensible.

Controlling Data Access

Tools like Cloudflare’s AI Crawler Blocking enable publishers to limit unauthorized scraping while preserving legitimate search engine indexing. When combined with stringent robots.txt enforcement and comprehensive crawler activity logging, this approach protects intellectual property and restores balance to the ecosystem.
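As a minimal sketch of the robots.txt side of this setup, the directives below disallow several publicly documented AI crawlers while leaving traditional search engine bots unrestricted. Crawler user-agent tokens change over time, so the list should be checked against current documentation before deployment, and enforcement still depends on crawlers honoring the file.

    # Disallow common AI training/answer crawlers; allow normal search indexing.
    User-agent: GPTBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: PerplexityBot
    Disallow: /

    # Traditional search engine crawlers remain unrestricted.
    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /

Edge-level blocking, for example via Cloudflare's AI crawler controls, then covers bots that ignore the file entirely.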

Leveraging llms.txt

Unlike robots.txt, which addresses traditional web crawlers, llms.txt is a proposed standard aimed specifically at large language models; a minimal example follows the list below. It gives site owners a way to:
  • Point AI systems to the URLs they should prioritize
  • Add descriptive summaries and groupings that clarify content intent
  • Complement existing metadata, such as Open Graph tags, that shapes how content is surfaced and presented
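Under the llms.txt proposal, the file itself is plain Markdown served at the site root. A minimal sketch might look like the following; the site name, section headings, and example.com URLs are placeholders, and the exact conventions should be checked against the current version of the spec.

    # Example Site

    > One-sentence summary of what the site covers and who it serves.

    ## Core guides

    - [Product overview](https://example.com/docs/overview.md): What the product does, its limits, and pricing
    - [Integration guide](https://example.com/docs/integration.md): Supported platforms and setup steps

    ## Optional

    - [Blog archive](https://example.com/blog/): Long-form commentary, lower priority for models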

Diversifying Content Formats

Formats such as images, video, and podcasts are harder for AI systems to extract and summarize, and they expand reach across alternative search and discovery platforms. This multi-channel approach reduces over-reliance on text-based organic traffic.

Enhancing Personalization and Brand Differentiation

Content featuring implied meanings, second-order reasoning, personal insights, or a strong editorial stance is harder for AI to generalize or commoditize. A distinctive brand voice, a unique conceptual framework, and genuine community engagement create defensible value that AI cannot easily replicate.

Semi-Hidden and Interactive Content

Unlocking insights through user engagement, rather than presenting everything statically, keeps value with human visitors instead of letting it be fully absorbed by AI. Narrative hooks, thematic tension, and a recognizable linguistic framework further reinforce this defensibility.
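As a rough sketch of what interaction-gated content can look like in the browser, the TypeScript below loads the full analysis only after a deliberate click, so it never appears in the statically served HTML that crawlers ingest. The element IDs and the /api/insights endpoint are hypothetical placeholders, not a real API.

    // Reveal the gated insight only after a deliberate user action.
    // "#reveal-insight", "#insight-container", and the endpoint are placeholders.
    const revealButton = document.querySelector<HTMLButtonElement>("#reveal-insight");
    const container = document.querySelector<HTMLElement>("#insight-container");

    if (revealButton && container) {
      revealButton.addEventListener("click", async () => {
        // Fetch the content on demand instead of shipping it in the page source.
        const response = await fetch("/api/insights/full-analysis");
        if (!response.ok) return;
        container.textContent = await response.text(); // visible only after engagement
        revealButton.remove();
      });
    }

The trade-off is that content loaded only on interaction is generally invisible to search indexing as well, so this pattern suits proprietary analysis rather than pages that still need to rank.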