Study Finds AI Cut News Traffic After Mid-2024 but Left Newsroom Hiring Intact


Insider Brief

  • A large-scale economic study finds that generative AI has not replaced journalism but has reduced news publisher traffic since mid-2024 while reshaping how audiences discover content and how publishers design and monetize their sites.
  • The researchers show that traffic declines emerged only after August 2024 and that blocking AI crawlers often backfires for large publishers, cutting total and human traffic rather than simply eliminating automated visits.
  • The study finds no near-term collapse in newsroom hiring or explosion of low-quality text, instead documenting a shift toward richer, more interactive and advertising-heavy web pages rather than increased article volume.

A new economic study finds that large language models are reshaping the online news industry in quieter but more consequential ways than early fears suggested, reducing traffic to publishers after mid-2024 while pushing newsrooms toward richer page designs rather than mass layoffs or a flood of low-quality text.

The paper, posted to arXiv by researchers at Rutgers Business School and The Wharton School of the University of Pennsylvania, offers one of the most detailed empirical looks to date at how generative AI systems are altering news consumption and production. Drawing on high-frequency traffic data, hiring records and website architecture data, the team concludes that artificial intelligence is not yet replacing journalism. However, it is changing how audiences arrive, how publishers monetize attention and how news organizations structure their digital products.

The findings arrive amid mounting concern that AI chatbots and AI-powered search tools could hollow out the web by answering questions directly, reducing the incentive for users to click through to original reporting. The study suggests those fears were premature at first, but not unfounded over time.

Traffic Declines Appeared Late, But Stuck

The researchers find that online news traffic remained broadly stable for nearly two years after the public release of ChatGPT in late 2022, despite widespread predictions of immediate disruption. Using daily visit data from SimilarWeb across major news publishers, the study shows no statistically significant collapse in traffic through mid-2024.

That pattern changes in the second half of 2024. The researchers identify a persistent break beginning in August 2024, after which traffic to news publishers falls by roughly 13 percent relative to large retail websites used as a control group. Earlier declines detected in late 2023 appear smaller and statistically uncertain once broader internet trends are taken into account.

The delayed timing matters, according to the researchers, because it suggests that generative AI’s impact on news consumption is not tied to a single product launch, but to gradual changes in how users interact with search engines, summaries and conversational interfaces as those tools become more prominent and reliable.

The study does not directly measure how often users consume news entirely inside AI systems. Instead, it infers substitution effects by tracking when aggregate visits drop in ways not explained by seasonality or broader web trends.

Blocking AI Crawlers Can Backfire

One of the most closely watched strategic responses by publishers has been the decision to block AI crawlers using the robots.txt file, a voluntary standard that signals which automated systems are allowed to access a website.
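To make the mechanism concrete, here is a minimal sketch of how such rules can be written and checked, using Python’s standard library. The user-agent names GPTBot (OpenAI) and CCBot (Common Crawl) are real AI crawler identifiers, but the exact set of bots any publisher blocks varies, and the domain below is hypothetical.

```python
# Minimal sketch: checking robots.txt rules with Python's standard library.
# The policy below disallows two well-known AI crawlers (GPTBot, CCBot)
# while leaving the site open to all other bots, including search engines.
from urllib.robotparser import RobotFileParser

RULES = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

for agent in ("GPTBot", "CCBot", "Googlebot"):
    allowed = parser.can_fetch(agent, "https://news.example.com/article")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
# Output: GPTBot: blocked, CCBot: blocked, Googlebot: allowed
```

Note that robots.txt is purely advisory: compliant crawlers honor it, but nothing in the protocol enforces access, which is why the study treats blocking as a signal of publisher intent rather than a hard technical barrier.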

The researchers, Hangcheng Zhao of Rutgers Business School and Ron Berman of The Wharton School of the University of Pennsylvania, find that news organizations adopted this tactic far more aggressively than other industries. By 2023, roughly 80 percent of major publishers had blocked at least one AI-related crawler associated with large language models.

Using staggered difference-in-differences methods, the study finds that blocking these crawlers is associated with substantial traffic losses for large publishers. Total website visits fall by about 23 percent after blocking, according to SimilarWeb data. Crucially, human traffic also declines — by nearly 14 percent — when measured using household-level browsing data from Comscore.
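In spirit, a staggered difference-in-differences design compares publishers that start blocking at different times against those that have not yet blocked, after absorbing each site’s baseline level and common time trends. The sketch below shows the kind of two-way fixed-effects regression that underlies this idea; the column names and input file are hypothetical, and the paper’s actual estimator may differ, since modern staggered designs often use estimators more robust than plain two-way fixed effects.

```python
# Hedged sketch of a two-way fixed-effects difference-in-differences
# regression, NOT the paper's exact specification. Assumes a panel
# DataFrame with columns: site (publisher id), week (time period),
# log_visits (outcome), and treated (1 after a site blocks AI crawlers,
# 0 before and for never-blocking sites). All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("publisher_panel.csv")  # hypothetical input file

# log_visits ~ treated + site fixed effects + week fixed effects;
# the coefficient on `treated` estimates the average change in (log)
# traffic after blocking, relative to not-yet- and never-treated sites.
model = smf.ols("log_visits ~ treated + C(site) + C(week)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["site"]})
print(result.params["treated"])  # e.g. -0.26, i.e. roughly a 23% drop
```

Because sites adopt blocking at different dates, the comparison of early adopters against not-yet-treated sites is what makes the design “staggered.”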

This result challenges the assumption that blocking AI bots simply removes automated scraping while leaving real readers unaffected. Instead, the findings suggest that AI systems also act as discovery channels, indirectly sending users to news sites or shaping broader visibility.

The effects are not uniform: smaller publishers sometimes see neutral or even positive outcomes from blocking, which indicates that scale matters and that AI-mediated discovery may disproportionately benefit large, well-known outlets.

No Evidence of a Newsroom Hiring Collapse

Despite persistent anxiety that AI tools would replace journalists or sharply reduce newsroom employment, the study finds no evidence of a near-term contraction in editorial hiring.

Analyzing job postings from Revelio Labs, the researchers track demand for editorial and content-production roles relative to other positions. They find that while overall hiring fluctuates with broader economic conditions, the share of editorial roles does not decline after the introduction of generative AI. In some periods, it increases.

This suggests that publishers have not responded to AI pressure by cutting reporting and editing staff. Instead, labor adjustments appear limited or offset by new needs, including digital production, audience engagement and product development.

The researchers caution that job postings reflect labor demand rather than actual employment levels, but the data undermine claims that AI has already displaced newsroom workers at scale.

Publishers Shift to Richer Pages, Not More Text

Perhaps the clearest evidence of strategic adaptation appears in how publishers design their websites.

Using page-level data from the HTTP Archive and historical snapshots from the Internet Archive, the researchers examine whether publishers responded to AI by producing more articles or flooding the web with automated text. They find no such increase. Article volume declines modestly over time.

Instead, publishers are making pages heavier, more interactive and more commercial. Interactive elements such as buttons and forms increase sharply. Advertising and targeting technologies grow by roughly 50 percent relative to retail sites. Image-related content expands significantly, while text output remains flat or declines.

This pattern suggests a pivot toward formats that are harder for AI systems to summarize cleanly and that generate more revenue per visit. Rich media, interactive features and embedded advertising appear to be the primary adjustment margin, not mass text production.

The findings also counter fears of an immediate surge in AI-generated “slop.” At least among large publishers, there is little evidence of scaled-up low-quality text output.

Limits and What Comes Next

The team stresses that the results reflect an early phase of generative AI adoption. Traffic measures rely on modeled estimates and U.S.-focused browsing panels, and the study does not directly observe consumption inside AI chat interfaces. Robots.txt rules signal intent rather than enforceable access, and blocking decisions may coincide with other strategic changes such as paywalls or site redesigns.

Even so, the study provides one of the clearest empirical snapshots yet of how AI is reshaping the news business. The core message is less apocalyptic than early commentary suggested, but no less consequential.

Generative AI has not replaced journalism, the researchers find, but it is altering how attention flows, how publishers experiment with access control, and how digital news products are built. For publishers, the trade-offs are already real—and in some cases, unexpectedly costly.

Future research will determine whether these patterns intensify as AI tools become more deeply embedded in search and discovery.

For a deeper, more technical dive, please review the paper on arXiv. It’s important to note that arXiv is a pre-print server, which allows researchers to receive quick feedback on their work. However, neither the paper nor this article is an official peer-reviewed publication. Peer review is an important step in the scientific process to verify results.

Matt Swayne

With a background in journalism and communications spanning several decades, Matt Swayne has worked as a science communicator for an R1 university for more than 12 years, specializing in translating high tech and deep tech for a general audience. He has served as a writer, editor and analyst at The Space Impulse since its inception. In addition to his work as a science communicator, Matt develops and teaches courses to improve the media and communications skills of scientists.
