Cloudflare CEO says people aren’t checking AI chatbots’ source links

Companies that develop generative AI always make it a point to say that they include links to websites in the answers that their chatbots generate for users. But Cloudflare CEO Matthew Prince has revealed to Axios that search traffic referrals keep plummeting. Publishers are facing an existential threat, he said, because people aren’t clicking through those chatbot links and are relying more and more on AI summaries without digging deeper.

Prince told Axios that 10 years ago, Google sent a publisher one visitor for every two pages it had crawled. Six months ago, the ratio was one visitor for every six pages, and now it’s one for every 18. OpenAI sent one visitor to a publisher for every 250 pages it crawled six months ago, while Anthropic sent one visitor for every 6,000 pages. These days, OpenAI sends one visitor to a publisher for every 1,500 pages, whereas Anthropic sends one visitor for every 60,000 pages.

People have come to trust AI chatbots more over the past few months. The problem for publishers is that they don’t earn advertising revenue if people don’t click through to their websites, which is why Prince is encouraging them to take action to make sure they’re fairly compensated. Prince said Cloudflare is currently working on a tool to block bots that scrape content for large language models even when a web page already carries a “no crawl” instruction. Several outlets reported in 2024 that AI companies had been ignoring websites’ Robots Exclusion Protocol, or robots.txt, files and taking their content anyway to train their technologies.
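The Robots Exclusion Protocol mentioned above is just a plain-text robots.txt file served from a site’s root. A minimal sketch of a “no crawl” instruction aimed at AI crawlers might look like the following (GPTBot and ClaudeBot are the user-agent names OpenAI and Anthropic have published for their crawlers, but names change, so verify the current ones before relying on this):

```
# robots.txt — served at https://example.com/robots.txt

# Disallow specific AI crawlers by user-agent
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Allow all other crawlers everywhere
User-agent: *
Disallow:
```

As the reporting cited here notes, these directives are advisory: a crawler that chooses to ignore the file can still fetch the pages, which is what Cloudflare’s blocking tool is meant to address.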

Cloudflare has been looking for ways to block scrapers since last year, but it was only in March that it officially introduced AI Labyrinth, which uses AI-generated content to “slow down, confuse, and waste the resources of AI Crawlers and other bots that don’t respect ‘no crawl’ directives.” It works by linking an unauthorized crawler to a series of AI-generated pages that look convincing but don’t contain the actual content of the site the tool is protecting. That way, the crawler ends up wasting time and resources.
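The general idea behind a crawler “labyrinth” can be sketched in a few lines: every URL returns a procedurally generated decoy page whose links lead only deeper into more decoys, so a bot that ignores “no crawl” directives burns its crawl budget on worthless content. This is a toy illustration of the concept only, not Cloudflare’s implementation; all names and parameters here are invented for the example.

```python
# Toy sketch of a crawler tarpit: every path yields a stable,
# unique decoy page containing links deeper into the maze.
# Illustrates the labyrinth concept; not Cloudflare's AI Labyrinth.
import hashlib
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def decoy_page(path: str, n_links: int = 5) -> str:
    # Seed the RNG from the path so each URL always renders the
    # same page (crawlers re-fetching a URL see consistent content).
    seed = int(hashlib.sha256(path.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    words = ["data", "report", "analysis", "archive", "notes", "update"]
    body = " ".join(rng.choice(words) for _ in range(60))
    # Each link points one level deeper into the maze.
    links = "".join(
        f'<a href="{path.rstrip("/")}/{rng.randrange(10**6)}">more</a> '
        for _ in range(n_links)
    )
    return f"<html><body><p>{body}</p>{links}</body></html>"

class LabyrinthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        page = decoy_page(self.path)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page.encode())

if __name__ == "__main__":
    # Serve the maze locally; a real deployment would only route
    # suspected bad bots here, never ordinary visitors.
    HTTPServer(("127.0.0.1", 8080), LabyrinthHandler).serve_forever()
```

A real system would also generate plausible-sounding prose (Cloudflare says it uses AI-generated content for this) and gate the maze behind bot detection so legitimate users and well-behaved crawlers never see it.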

“I go to war every single day with the Chinese government, the Russian government, the Iranians, the North Koreans, probably Americans, the Israelis, all of them who are trying to hack into our customer sites,” Prince said. “And you’re telling me, I can’t stop some nerd with a C-corporation in Palo Alto?”


