this post was submitted on 19 Jul 2025
162 points (96.6% liked)

Tech


Around the beginning of last year, Matthew Prince started receiving worried calls from the bosses of big media companies. They told Mr Prince, whose firm, Cloudflare, provides security infrastructure to about a fifth of the web, that they faced a grave new online threat. “I said, ‘What, is it the North Koreans?’,” he recalls. “And they said, ‘No. It’s AI’.”

Those executives had spotted the early signs of a trend that has since become clear: artificial intelligence is transforming the way that people navigate the web. As users pose their queries to chatbots rather than conventional search engines, they are given answers, rather than links to follow. The result is that “content” publishers, from news providers and online forums to reference sites such as Wikipedia, are seeing alarming drops in their traffic.

As AI changes how people browse, it is altering the economic bargain at the heart of the internet. Human traffic has long been monetised using online advertising; now that traffic is drying up. Content producers are urgently trying to find new ways to make AI companies pay them for information. If they cannot, the open web may evolve into something very different.

Archive : https://archive.ph/nhrYS

[–] chromodynamic@piefed.social 10 points 1 day ago (5 children)

Perhaps some kind of fediweb that allows sites to rank other sites for trustworthiness. Then as a user you mark a few sites as trusted, and use their judgement to find more sites.

[–] nthavoc 17 points 1 day ago* (last edited 1 day ago) (4 children)

You mean kind of like how the web was when it first started in the 90's with curated websites and when Yahoo was a thing?

[–] chromodynamic@piefed.social 1 point 17 hours ago (1 children)

Kind of, but with automation. So if you trust site A 90%, and site A trusts site B 90%, then from your PoV, site B has 81% trust* (which you can choose to replace with your own trust rating, if you want).

Could have applications in building a new kind of search engine even.

  • I'm just guessing how the maths would work; it probably requires a somewhat more sophisticated system than that, such as starting sites at 50% and only increasing or decreasing the rating based on sites you already trust.
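The multiplicative propagation described above (trust 90% of 90% = 81%) could be sketched roughly like this. This is just an illustrative sketch, not anything from the thread: the graph representation, the best-path rule, and the function names are all my assumptions. Propagating along the single most-trusted path (a Dijkstra-style search) is one way to keep cycles from inflating scores.

```python
import heapq

def propagate_trust(edges: dict[str, dict[str, float]], you: str) -> dict[str, float]:
    """Compute each site's trust as seen from `you`, multiplying ratings
    along the most-trusted path. Best-path search means a cycle can never
    raise a score (trust only shrinks as paths get longer)."""
    best = {you: 1.0}
    heap = [(-1.0, you)]  # max-heap via negated trust
    while heap:
        neg, site = heapq.heappop(heap)
        trust = -neg
        if trust < best.get(site, 0.0):
            continue  # stale heap entry; a better path was already found
        for neighbour, rating in edges.get(site, {}).items():
            candidate = trust * rating
            if candidate > best.get(neighbour, 0.0):
                best[neighbour] = candidate
                heapq.heappush(heap, (-candidate, neighbour))
    return best

# The thread's example: you trust site A 90%, A trusts B 90%.
edges = {"you": {"siteA": 0.9}, "siteA": {"siteB": 0.9}}
scores = propagate_trust(edges, "you")
# scores["siteB"] comes out to 0.81, i.e. 81%
```

A manual override (the "replace with your own trust rating" idea) would just mean writing your own value into `best` before, or instead of, the propagated one.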
[–] nthavoc 2 points 17 hours ago

You've got a good idea if you can mitigate other malicious automated processes that will fudge the numbers like what's going on with Google. If you can emphasize true self-policing in this idea like most things did pre-social media slop, you could potentially improve upon an old idea. There's also a search engine that looks really promising in that it filters out corporate or AI sponsored content. I was pleasantly surprised. Someone on Lemmy shared this with me: https://marginalia-search.com/

Lemmy is kind of self-policing in the sense that people can "de-federate". The problem is, not a lot of people want to "de-federate" and treat it like Reddit. Back in the day, you literally got kicked off your ISP for doing something really stupid. Everyone knew who "that guy" was, and that person was pretty much left to the wasteland of the internet if they could even get a connection back. I don't know how to get that level of self-policing back, but it's worth a shot.
