Founding Fuel

The End of the Pageview: How Agentic AI is Forcing a Reckoning in Publishing

The media industry is confronting a stark new reality: survival in the AI era means abandoning the cybersecurity arms race and pricing content for machines instead of humans

13 April 2026 · 7 min read

TL;DR

The publishing industry faces a fundamental reckoning: agentic AI signals the 'end of the pageview' and a 'zero-click' internet. Traditional revenue models, reliant on human attention, are rapidly becoming obsolete. Instead of costly legal battles or an unwinnable cybersecurity arms race against AI bots, business leaders must pivot from an attention economy to a data supply economy. The core insight is that content's primary consumers are now machines, accessing information for Retrieval-Augmented Generation (RAG). Sustainable value lies in designing systems to price and capture revenue from these AI agents at machine speed. This necessitates focusing on distinctive, authoritative content, shedding generic SEO material. Embracing this shift offers a compelling, actionable path for enduring value creation in the AI era.

For decades, the publishing industry’s lifeblood has been human attention—measured in clicks, subscriptions, and time spent on site. But as generative AI evolves, that foundational model is coming under sustained pressure.

In a recent, tightly curated conversation on Zoom, hosted by Founding Fuel, about 22 senior editors, strategists, and founders from across the publishing ecosystem came together to confront this shift. The mood in the room had already moved past early excitement over generative tools. What remained were harder, structural questions: about traffic flows, monetisation, and, ultimately, the survival of the publishing business.

The industry’s traditional defences are already showing signs of strain. Toshit Panigrahi, co-founder and CEO of TollBit, a platform that monitors, measures and helps publishers monetise AI traffic, argued that many of the early responses—lawsuits over copyright, or one-time payouts for training data—are unlikely to offer durable solutions. The models, after all, have already been trained.

Similarly, relying on cybersecurity tools to block scraping bots risks becoming a costly arms race—one that ultimately trains those very bots to get better at evasion.

Where, then, does sustainable value lie? In what is increasingly being described as a “zero-click” ecosystem, it lies in Retrieval-Augmented Generation (RAG): AI systems continuously fetching publisher content to ground their responses and reduce hallucinations.

This shift forces a deeper rethink of where economic value accrues. Publishing is moving, uneasily, from an attention economy to something closer to a data supply economy—where the primary consumers of content are not humans, but machines. At scale, these agents could drive volumes of content access that dwarf anything seen in the pageview era.

Survival in such a world demands sharper choices. The SEO-driven “middle layer” of content—produced largely to capture search traffic—becomes increasingly redundant. What endures is content that is distinctive, authoritative, and difficult to replicate.

If there was one emerging consensus in the conversation, it is this: the path forward may not lie in building higher walls, but in designing systems that can price and capture value at machine speed—every time an AI agent queries the open web.

Here are the key takeaways from the conversation:

1. Framing the Shift

Indrajit Gupta: What structural shifts in publishing are we still underestimating, and where will value accrue going forward?

The Zero-Click Internet

The internet is rapidly moving toward a zero-click world where people may not visit publisher websites anymore, representing a foundational shift distinct from previous transitions like Google, social media, or mobile. AI bots scrape content to answer a user's question directly, eliminating the need for the user to click through to a link.

Emergence of AI as the Primary Consumer

A new autonomous AI visitor has emerged, and it will likely come to account for the majority of internet traffic, because bots do not get tired. Since AI platforms might read up to 30 articles just to answer a single question, bot traffic could surpass a trillion page views a day: replacement economics in a growing market.

From Training Economics to Retrieval (RAG) Economics

Publishers initially fixated on suing AI companies or securing one-time checks for training data. However, sustainable revenue lies in Retrieval-Augmented Generation (RAG) or inference, where AI models must continuously reference and load full text into their context to ground answers and avoid hallucinations.
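The retrieval loop that drives this economics can be sketched in a few lines. This is a toy illustration only: the keyword retriever and prompt format below are assumptions for clarity, not any AI platform's actual pipeline.

```python
# Minimal sketch of a Retrieval-Augmented Generation (RAG) loop.
# Illustrative only: the naive keyword retriever and the prompt format
# are invented here, not any AI platform's real implementation.

def _words(text):
    """Lowercase a text and strip trailing punctuation from each word."""
    return {w.strip(".,?!") for w in text.lower().split()}

def retrieve(query, corpus, k=3):
    """Rank articles by keyword overlap with the query; return top-k titles."""
    terms = _words(query)
    ranked = sorted(corpus, key=lambda t: len(terms & _words(corpus[t])), reverse=True)
    return ranked[:k]

def build_grounded_prompt(query, corpus):
    """Load retrieved full texts into the model's context to ground its answer."""
    context = "\n\n".join(f"[{t}]\n{corpus[t]}" for t in retrieve(query, corpus))
    return f"Answer using ONLY these sources:\n{context}\n\nQuestion: {query}"

corpus = {
    "Rate decision": "The central bank raised interest rates by 25 basis points.",
    "Film review": "The new drama premiered to mixed reviews this weekend.",
    "Market reaction": "Stocks fell after the interest rates announcement.",
}
prompt = build_grounded_prompt("What happened to interest rates?", corpus)
```

The point for publishers: every call to `retrieve` loads full article text into the model's context, so each answer can trigger several billable retrieval events, not a one-time training payout.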

Why the cybersecurity arms race to block AI bots is a losing battle that will ultimately destroy advertiser trust

2. Rethinking the Business Model

Indrajit Gupta: Where and how do you begin rethinking the business model as an independent media company or a large institution?

The Four Vectors of Premium Content

Early data shows that the price a piece of content commands is based on four vectors: freshness (breaking news versus older articles), brand name (marquee publishers establishing trust), paywalls (harder-to-scrape content carries higher liability for AI), and uniqueness.
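The four vectors could be combined into a per-retrieval price along the following lines. This is a toy model with invented weights and a made-up base rate, offered only to make the idea concrete; it is not TollBit's (or anyone's) actual pricing formula.

```python
# Toy model of per-retrieval content pricing along the four vectors
# described above (freshness, brand, paywall, uniqueness). All weights
# and the base rate are invented for illustration.

def price_per_retrieval(age_hours, marquee_brand, paywalled, uniqueness,
                        base_cents=0.1):
    """Return an illustrative price in cents for one AI retrieval.

    age_hours:     freshness -- breaking news commands a premium
    marquee_brand: bool -- trusted names establish higher value
    paywalled:     bool -- harder-to-scrape content carries higher liability
    uniqueness:    0.0 (commodity SEO copy) .. 1.0 (nobody else has it)
    """
    freshness = 4.0 if age_hours < 24 else (2.0 if age_hours < 24 * 7 else 1.0)
    brand = 1.5 if marquee_brand else 1.0
    paywall = 2.0 if paywalled else 1.0
    return round(base_cents * freshness * brand * paywall * (1.0 + uniqueness), 4)

# A breaking, paywalled scoop from a marquee brand vs. stale commodity SEO copy:
scoop = price_per_retrieval(age_hours=2, marquee_brand=True, paywalled=True, uniqueness=0.9)
commodity = price_per_retrieval(age_hours=2000, marquee_brand=False, paywalled=False, uniqueness=0.1)
```

Whatever the real coefficients turn out to be, the multiplicative structure captures the argument: the gap between a fresh, unique, paywalled scoop and commodity SEO copy compounds across all four vectors.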

The SEO Trap

Tactics that worked for SEO actively hurt publishers in an AI economy. If multiple publishers write the exact same articles targeting high-volume keywords, AI companies do not need to cite all of them; they will simply pay a few sources and ignore the rest. The highest leverage comes from authoritative, niche content that no other publisher has written and cannot be easily replaced.

How traditional SEO strategies of producing redundant content will actively hurt your publishing business in the AI era

3. Adoption Lag & Subscription Tension

Suprio Guha Thakurta: If this solution is perfect, why hasn’t everybody signed on, and will it negatively affect existing subscription businesses?

Seller's Remorse and the 2025 Inflection Point

Many publishers delayed adoption because they hoped to negotiate lucrative direct deals with large AI model trainers. However, those deals dried up, and publishers who did sign them experienced seller's remorse upon realising their content was being heavily scraped for RAG, not just one-time training. By early 2025, the industry reached an inflection point, accepting that RAG monetisation is the necessary future.

Navigating the Disruption

The publishing industry is navigating this disruption collectively. Implementing programmatic, machine-speed licensing allows publishers to build a sustainable revenue stream to complement, rather than completely replace, traditional human-centric subscription models.

4. Structural Disruption

Ajay Chacko: Is this just another wave of disruption that will wipe out existing publishing models, especially the generalist ones caught in the middle?

The Collapse of the Middle Layer

The industry is in for severe disruption, and existing cost structures, organisational setups, and employee headcounts will likely go through significant changes. Publishers caught in the middle with shallow, non-specialised content are highly vulnerable.

New Players and Survival

New players are emerging that do not even look like traditional publishers, writing content solely for AI bots to read. While the exact form factors and delivery methods will change, the fundamental societal role of publishers will survive this wave of disruption just as it survived print, radio, TV, and the internet.

5. Incumbent Adaptation & Leadership

Charles Assisi: Are there incumbents who clearly see the writing on the wall, and what is in their DNA that allows them to adapt?

Early-Moving Incumbents

Several major incumbents, including DotDash Meredith, Time Magazine, Axel Springer, and Hearst, are heavily leaning into experimentation to figure out the future economic model and what new content forms will look like.

Top-Down Leadership is Critical

What distinguishes these forward-thinking incumbents is that the initiative comes directly from top-level executives. They do not delegate this to mid-level product managers; leadership clears the path for experimentation because they understand this is an existential, directional shift for the media business.

6. Small Publishers & Regulatory Shifts

Arun Anant: How should small and niche publishers view this, and how do we expect copyright laws to change?

Expansion of Total Content Consumption

Because AI bots read multiple sources to answer a single query, the net amount of content consumption will increase, driving a premium for unique, long-tail content produced by niche publishers. The current technical challenge is ensuring this niche content is easily discoverable by AI systems.

Regulating Bots Over Copyright

Overhauling copyright and IP laws across every jurisdiction is highly complex and would open a can of worms. A much simpler and scalable regulatory solution is to make it strictly illegal for bots to masquerade as humans, accompanied by penalties for misrepresenting their intent.

The simple regulatory fix that could save the web: making it illegal for AI bots to masquerade as humans

7. Platform Dependency

Anuradha Sengupta: What happens to independent journalists whose content lives on freely available, third-party platforms like YouTube or Substack?

The Barrier to Scraping is Gone

Even massive tech entities like Google cannot fully secure their content against AI scraping. With AI agents, the technical barrier to entry has vanished; an agent can autonomously write code to bypass proxies and download videos from platforms like YouTube instantly.

Platform Monetisation Inevitability

Independent creators lack control over these infrastructure layers. However, the platforms themselves are realising the inevitability of this shift, and companies are beginning to work on machine monetisation solutions to compensate creators hosted on their networks.

8. Power Imbalance & Negotiation with LLMs

Deepak Ajwani: Given that LLMs already have the upper hand and the ability to scrape, will they ever agree to pay a toll, or will publishers just isolate themselves by demanding payment?

The Divide-and-Conquer Strategy

AI companies approach negotiations with a divide-and-conquer strategy, selectively doing deals with one or two marquee names in a specific region to prevent market consensus. They act as if they have the upper hand because they have already scraped the content, using this to force better prices.

Creating Leverage Without Scarcity

Publishers cannot rely on cybersecurity to block bots, as that is a losing battle that only leads to rising infrastructure costs. Instead, publishers must accept that scraping is inevitable and build "leverage without scarcity". By transitioning to a fully programmatic licensing system that operates at machine speeds, publishers can offer a reliable, zero-friction access method that AI companies are willing to pay for.

Why publishers must stop relying on firewalls and start building "leverage without scarcity" when dealing with AI giants

9. B2B Connectors vs. Zero-Shot Access

Akhilesh Tilotia: How should a startup that structures financial data for AI weigh the trade-off between building paid API connectors versus relying on a toll-based monetisation approach?

Dual Go-To-Market Strategies

For startups supplying highly specific data to known B2B clients (like financial institutions deploying AI agents), building direct API connectors or SaaS agreements remains a completely valid go-to-market strategy.

The Need for Real-Time Programmatic Tolls

However, a toll-based model is necessary for "zero-shot access". When a generic AI query happens in real-time and hits a paywall from a service it has no prior deal with, there must be a mechanism to programmatically negotiate a price and grant access instantly.
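One way to picture that mechanism is an HTTP 402-style handshake: the agent hits a paywall, receives a machine-readable quote, and retries with payment attached. The sketch below is hypothetical; the endpoint, header names, and quote format are invented, and HTTP 402 (Payment Required) is a real but rarely used status code rather than a deployed standard for this.

```python
# Hypothetical sketch of a machine-speed toll handshake for "zero-shot
# access". Endpoint behaviour, the X-Payment-Token header, and the quote
# format are all invented for illustration.

PRICE_CENTS = 0.5  # the publisher's asking price for one retrieval

def publisher_endpoint(headers):
    """Serve full text if payment accompanies the request; otherwise quote a price."""
    if headers.get("X-Payment-Token"):  # agent attached proof of payment
        return 200, "Full article text, loaded into the model's context."
    # No prior deal exists: respond 402 with a machine-readable quote.
    return 402, {"price_cents": PRICE_CENTS, "currency": "USD", "pay_at": "/toll"}

def agent_fetch(max_cents):
    """An AI agent negotiating access in real time, under a spending cap."""
    status, body = publisher_endpoint({})
    if status == 402 and body["price_cents"] <= max_cents:
        token = f"paid:{body['price_cents']}"  # stand-in for a real payment rail
        status, body = publisher_endpoint({"X-Payment-Token": token})
    return body if status == 200 else None
```

The whole negotiation completes in two requests with no human in the loop, which is what "programmatically negotiate a price and grant access instantly" implies in practice.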

10. Content Poisoning & Generative Engine Optimisation

Shrinath V: With LLMs scraping massive volumes, how do we prevent bad actors from writing "poison content" to damage brand reputations?

Generative Engine Optimisation (GEO)

Brands and political groups are actively investing in "Generative Engine Optimisation" to ensure they are presented favourably by AI. This includes creating synthetic articles or spinning up fake Reddit threads to positively "poison" or arbitrage the AI's understanding.

AI Platform Safeguards

AI platforms are constantly plugging these GEO hacks. Furthermore, negative poisoning is somewhat mitigated by the nature of RAG; an AI model bases its answer on multiple citations (e.g., 12 sources), meaning one poisoned article is generally outweighed by 11 uninfluenced citations.

11. The Pivot to Proprietary Data

Swarup Gupta: What is the future of legacy publishers pivoting away from text toward building proprietary data sets and indices for higher margins?

The Exact Needed Experimentation

Moving toward proprietary data sets, indices, and survey collections to secure higher margins is exactly the type of top-down experimentation publishers should be executing.

Surfacing Unique Value

By accepting that a zero-click world is arriving and that commoditised content will no longer drive traffic, publishers can redirect resources to figure out how to best surface unique data in high-margin form factors.

12. The Political Stakes & Bot Regulation

Dinesh Narayanan: At what point does pushback from Big Tech turn this from a business conversation into a political conflict over the rollback of democracy?

Democracy at Stake

If the economic model of publishing collapses and there is no way to monetise, the incentive to create and share information disappears entirely. Without reliable content generation, democracy itself is put at risk, forcing governments to weigh this threat against Big Tech lobbying.

Why the inability to monetise AI bot traffic poses an existential threat to the future of democracy

The Looming Traffic Explosion

The current volume of bot traffic is only the beginning. As companies like Tesla and Boston Dynamics deploy humanoid robots equipped with internet-connected, GPT-powered intelligence, these machines will constantly scrape the web for real-time information. Barring bots from misrepresenting themselves is critical before this exponential explosion in traffic occurs.

13. Reimagining Publisher Collaboration

Swaminathan Sivaraman: Does AI necessitate a new model of collaboration among publishers, rather than just focusing on individual monetisation?

AI as the New Aggregator

AI systems have essentially become the new publisher; they consume 10 different articles, synthesise them into a single answer, and hold the direct relationship with the consumer. These systems do not care about arbitrary brand distinctions between competing publishers.

From Silos to Structural Solutions

Because publishers are now trying to survive against powerful AI intermediaries rather than just competing with one another, they must abandon old practices like purely internal cross-linking. The industry needs to get in a room together and explore holistic, structural solutions to collectively serve consumer needs and monetise content.

14. The Value of Originality vs. AI Content

Vivek Y. Kelkar: Does the use of AI to generate original content enhance or take away from a publisher's business model?

The Persistent Demand for Human Quality

There will always remain a dedicated audience segment willing to pay for and read original, human-authored, long-form content from trusted publishers like The Atlantic or the Wall Street Journal.

The Packaging Matters

The controversy surrounding AI-generated content comes down to packaging. If a publisher passes off an AI article as human-written to fool the consumer, it damages trust. However, if AI is transparently used to productively process complex data sets into narratives—a task that would be prohibitively expensive for humans to do—it removes the stigma and enhances the business model.

Optimising for a Third Pillar

Publishers are no longer debating whether training or inference (RAG) is the future. The consensus is clear: machine monetisation is the necessary path forward. Publishers must fundamentally reconsider how they adapt and deliver content to a new, non-human audience. Just as publishers currently have distinct teams and economics for print and digital, they must now build a "third pillar" optimised specifically for an agentic AI audience that will vastly outnumber their human visitors.

Founding Fuel aims to create the new playbook of entrepreneurship. Think of us as a hub for entrepreneurs: the go-to place for ideas, insights, practices and wisdom essential to build the enterprise of tomorrow. It is co-founded by veteran journalists Indrajit Gupta and Charles Assisi, along with CS Swaminathan, the former president of Pearson's online learning venture.

