this post was submitted on 11 Jun 2025
934 points (99.6% liked)

Technology

71355 readers
3638 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

Text to avoid paywall

The Wikimedia Foundation, the nonprofit organization which hosts and develops Wikipedia, has paused an experiment that showed users AI-generated summaries at the top of articles after an overwhelmingly negative reaction from the Wikipedia editors community.

“Just because Google has rolled out its AI summaries doesn't mean we need to one-up them, I sincerely beg you not to test this, on mobile or anywhere else,” one editor said in response to Wikimedia Foundation’s announcement that it will launch a two-week trial of the summaries on the mobile version of Wikipedia. “This would do immediate and irreversible harm to our readers and to our reputation as a decently trustworthy and serious source. Wikipedia has in some ways become a byword for sober boringness, which is excellent. Let's not insult our readers' intelligence and join the stampede to roll out flashy AI summaries. Which is what these are, although here the word ‘machine-generated’ is used instead.”

Two other editors simply commented, “Yuck.”

For years, Wikipedia has been one of the most valuable repositories of information in the world, and a laudable model for community-based, democratic internet platform governance. Its importance has only grown in the last couple of years during the generative AI boom, as it’s one of the only internet platforms that has not been significantly degraded by the flood of AI-generated slop and misinformation. As opposed to Google, which since embracing generative AI has instructed its users to eat glue, Wikipedia’s community has kept its articles relatively high quality. As I reported last year, editors are actively working to filter out bad, AI-generated content from Wikipedia.

A page detailing the AI-generated summaries project, called “Simple Article Summaries,” explains that it was proposed after a discussion at Wikimedia’s 2024 conference, Wikimania, where “Wikimedians discussed ways that AI/machine-generated remixing of the already created content can be used to make Wikipedia more accessible and easier to learn from.” Editors who participated in the discussion thought that these summaries could improve the learning experience on Wikipedia, where some article summaries can be quite dense and filled with technical jargon, but that AI features needed to be clearly labeled as such and that users needed an easy way to flag issues with “machine-generated/remixed content once it was published or generated automatically.”

In one experiment where summaries were enabled for users who have the Wikipedia browser extension installed, the generated summary showed up at the top of the article, which users had to click to expand and read. That summary was also flagged with a yellow “unverified” label.

[Image: An example of what the AI-generated summary looked like.]

Wikimedia announced that it was going to run the generated summaries experiment on June 2, and was immediately met with dozens of replies from editors who said “very bad idea,” “strongest possible oppose,” “Absolutely not,” etc.

“Yes, human editors can introduce reliability and NPOV [neutral point-of-view] issues. But as a collective mass, it evens out into a beautiful corpus,” one editor said. “With Simple Article Summaries, you propose giving one singular editor with known reliability and NPOV issues a platform at the very top of any given article, whilst giving zero editorial control to others. It reinforces the idea that Wikipedia cannot be relied on, destroying a decade of policy work. It reinforces the belief that unsourced, charged content can be added, because this platforms it. I don't think I would feel comfortable contributing to an encyclopedia like this. No other community has mastered collaboration to such a wondrous extent, and this would throw that away.”

A day later, Wikimedia announced that it would pause the launch of the experiment, but indicated that it’s still interested in AI-generated summaries.

“The Wikimedia Foundation has been exploring ways to make Wikipedia and other Wikimedia projects more accessible to readers globally,” a Wikimedia Foundation spokesperson told me in an email. “This two-week, opt-in experiment was focused on making complex Wikipedia articles more accessible to people with different reading levels. For the purposes of this experiment, the summaries were generated by an open-weight Aya model by Cohere. It was meant to gauge interest in a feature like this, and to help us think about the right kind of community moderation systems to ensure humans remain central to deciding what information is shown on Wikipedia.”

“It is common to receive a variety of feedback from volunteers, and we incorporate it in our decisions, and sometimes change course,” the Wikimedia Foundation spokesperson added. “We welcome such thoughtful feedback — this is what continues to make Wikipedia a truly collaborative platform of human knowledge.”

“Reading through the comments, it’s clear we could have done a better job introducing this idea and opening up the conversation here on VPT back in March,” a Wikimedia Foundation project manager said. VPT, or “village pump technical,” is where The Wikimedia Foundation and the community discuss technical aspects of the platform. “As internet usage changes over time, we are trying to discover new ways to help new generations learn from Wikipedia to sustain our movement into the future. In consequence, we need to figure out how we can experiment in safe ways that are appropriate for readers and the Wikimedia community. Looking back, we realize the next step with this message should have been to provide more of that context for you all and to make the space for folks to engage further.”

The project manager also said that “Bringing generative AI into the Wikipedia reading experience is a serious set of decisions, with important implications, and we intend to treat it as such,” and that “We do not have any plans for bringing a summary feature to the wikis without editor involvement. An editor moderation workflow is required under any circumstances, both for this idea, as well as any future idea around AI summarized or adapted content.”

top 50 comments
[–] Sam_Bass@lemmy.world 24 points 2 days ago (4 children)

Why is it so damned hard for corporate to understand most people have no use or need for AI at all?

[–] explodicle@sh.itjust.works 23 points 2 days ago (1 children)

"It is difficult to get a man to understand something, when his salary depends on his not understanding it."

— Upton Sinclair

[–] AnyOldName3@lemmy.world 16 points 2 days ago

Wikipedia management shouldn't be under that pressure. There's no profit motive to enshittify or replace human contributions. They're funded by donations from users, so their top priority should be giving users what they want, not attracting bubble-chasing venture capital.

[–] BombOmOm@lemmy.world 268 points 2 days ago (9 children)

Why the hell would we need AI summaries of a wikipedia article? The top of the article is explicitly the summary of the rest of the article.

[–] GregorGizeh@lemmy.zip 127 points 2 days ago (5 children)

Even beyond that, the "complex" language they claim is confusing is the whole point of Wikipedia. Neutral, precise language that describes matters accurately for laymen. There are links to every unusual or complex related subject and even individual words in all the articles.

I find it disturbing that a major share of the userbase is supposedly unable to process the information provided in this format, and needs it dumbed down even further. Wikipedia is already the summarized and simplified version of many topics.

[–] thedarkfly@feddit.nl 64 points 2 days ago* (last edited 2 days ago)

There's also a "simple english" Wikipedia: simple.wikipedia.org

[–] ricecake@sh.itjust.works 21 points 2 days ago (1 children)

A page detailing the AI-generated summaries project, called “Simple Article Summaries,” explains that it was proposed after a discussion at Wikimedia’s 2024 conference, Wikimania, where “Wikimedians discussed ways that AI/machine-generated remixing of the already created content can be used to make Wikipedia more accessible and easier to learn from.” Editors who participated in the discussion thought that these summaries could improve the learning experience on Wikipedia, where some article summaries can be quite dense and filled with technical jargon, but that AI features needed to be clearly labeled as such and that users needed an easy way to flag issues with “machine-generated/remixed content once it was published or generated automatically.”

The intent was to make more uniform summaries, since some of them can still be inscrutable.
Relying on a tool notorious for making significant errors isn't the right way to do it, but it's a real issue being examined.

In thermochemistry, an exothermic reaction is a "reaction for which the overall standard enthalpy change ΔH° is negative."[1][2] Exothermic reactions usually release heat. The term is often confused with exergonic reaction, which IUPAC defines as "... a reaction for which the overall standard Gibbs energy change ΔG° is negative."[2] A strongly exothermic reaction will usually also be exergonic because ΔH° makes a major contribution to ΔG°. Most of the spectacular chemical reactions that are demonstrated in classrooms are exothermic and exergonic. The opposite is an endothermic reaction, which usually takes up heat and is driven by an entropy increase in the system.

This is a perfectly accurate summary, but it's not entirely clear and has room for improvement.

I'm guessing they were adding new summaries so that they could clearly label them and not remove the existing ones, not out of a desire to add even more summaries.

[–] azertyfun@sh.itjust.works 23 points 2 days ago (3 children)

Wikimedians discussed ways that AI/machine-generated remixing of the already created content can be used to make Wikipedia more accessible and easier to learn from

The entire mistake right there. Look no further. They saw a solution (LLMs) and started hunting for a problem.

Had they done it the right way round there might have been some useful, though less flashy, outcome. I agree many article summaries are badly written. So why not experiment with an AI that flags those articles for review? Or even just organize a community drive to clean up article summaries?

The questions are rhetorical of course. Like every GenAI peddler they don't have an interest in the problem they purport to solve, they just want to play with or sell you this shiny toy that pretends really convincingly that it is clever.

[–] johnlukepeckard@lemmy.wtf 42 points 2 days ago

Why would anyone need Wikipedia to offer the AI summaries? Literally all chat bots with access to the internet will summarize Wikipedia when it comes to knowledge based questions. Let the creators of these bots serve AI slop to the masses.

[–] nutsack@lemmy.dbzer0.com 43 points 2 days ago (2 children)

when wikipedia starts to publish ai generated content it will no longer be serving its purpose and it won't need to exist anymore

[–] UnderpantsWeevil@lemmy.world 7 points 2 days ago (2 children)

Too late.

With thresholds calibrated to achieve a 1% false positive rate on pre-GPT-3.5 articles, detectors flag over 5% of newly created English Wikipedia articles as AI-generated, with lower percentages for German, French, and Italian articles. Flagged Wikipedia articles are typically of lower quality and are often self-promotional or partial towards a specific viewpoint on controversial topics.
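The calibration the quoted study describes can be sketched in a few lines. This is a hypothetical illustration (the function names and toy scores are mine, not from the study): pick the detector's score cutoff so that only 1% of known-human, pre-GPT-3.5 articles would be flagged, then measure what fraction of new articles exceed that cutoff.

```python
def calibrate_threshold(human_scores, target_fpr=0.01):
    """Return the score cutoff that flags roughly `target_fpr` of human text."""
    ranked = sorted(human_scores)
    # Keep (1 - target_fpr) of the known-human articles below the cutoff.
    cutoff_index = int(len(ranked) * (1 - target_fpr))
    return ranked[min(cutoff_index, len(ranked) - 1)]

def flagged_rate(scores, threshold):
    """Fraction of articles whose detector score exceeds the threshold."""
    return sum(s > threshold for s in scores) / len(scores)

# Toy data: detector scores in [0, 1], higher = "more AI-like".
human_scores = [i / 1000 for i in range(1000)]   # uniform spread of human scores
threshold = calibrate_threshold(human_scores)
print(round(threshold, 3))  # → 0.99

new_scores = [0.2, 0.5, 0.995, 0.999, 0.7, 0.998, 0.1, 0.3, 0.992, 0.4]
print(flagged_rate(new_scores, threshold))  # → 0.4
```

The study's ">5% flagged" finding is exactly this second number being several times larger than the 1% false-positive rate the threshold was tuned for.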

[–] kassiopaea@lemmy.blahaj.zone 5 points 2 days ago (1 children)

Human posting of AI-generated content is definitely a problem; but ultimately that's a moderation problem that can be solved, which is quite different from AI-generated content being put forward by the platform itself. There wasn't necessarily anything stopping people from doing the same thing pre-GPT, it's just easier and more prevalent now.

[–] LWD@lemm.ee 3 points 2 days ago

At least it's only an issue for new articles, which probably have the least editor involvement.

People creating self-promotion on Wikipedia has been a problem for a long time before ChatGPT.

[–] explodicle@sh.itjust.works 4 points 2 days ago

Well, something like it will still need to exist. In which case we can fork because it's all Creative Commons.

[–] jjjalljs@ttrpg.network 95 points 2 days ago (1 children)

I'm so tired of "AI". I'm tired of people who don't understand it expecting it to be magical and error free. I'm tired of grifters trying to sell it like snake oil. I'm tired of capitalist assholes drooling over the idea of firing all that pesky labor and replacing them with machines. (You can be twice as productive with AI! But you will neither get paid twice as much nor work half as many hours. I'll keep all the gains.). I'm tired of the industrial scale theft that apologists want to give a pass to while individuals who torrent can still get in trouble, and libraries are chronically under funded.

It's just all bad, and I'm so tired of feeling like so many people are just not getting it.

I hope wikipedia never adopts this stupid AI Summary project.

[–] laranis@lemmy.zip 25 points 2 days ago

People not getting things that seem obvious is an ongoing theme, it seems. We sat through a presentation at work by some guy who enthusiastically pitched AI to the masses. I don't mean that's what he did, I mean "enthusiasm" seemed to be his ONLY qualification. Aside from telling folks what buttons to press on the company's AI app, he didn't know SHIT. And the VP got on before and after and it was apparent that he didn't know shit, either. Someone is whispering in these people's ears and they're writing fat checks, no doubt, and they haven't a clue what an LLM is, what it is good at, nor what to be wary of. Absolutely ridiculous.

[–] Rooty@lemmy.world 44 points 2 days ago

Good, we don't need LLMs crowbarred into everything. You don't need a summary of an encylopedia article, it is already a broad overview of a complex topic.

[–] FarraigePlaisteach@lemmy.world 120 points 2 days ago

I know one study found that 51% of summaries that AI produced for them contained significant errors. So AI-summaries are bad news for anyone who hopes to be well informed. source https://www.bbc.com/news/articles/c0m17d8827ko

[–] nutsack@lemmy.dbzer0.com 41 points 2 days ago (1 children)

there's a summary paragraph at the top of each article which is written by people who have assholes probably. it's the whole reason to use wikipedia at this point

[–] vithigar@lemmy.ca 29 points 2 days ago (1 children)

This was my very first thought as well. The first section of almost every Wikipedia article is already a summary.

[–] filcuk@lemmy.zip 7 points 2 days ago

Yes, but we didn't emit nearly enough CO₂ on that one

[–] LWD@lemm.ee 66 points 2 days ago (1 children)

"Pause" and not "Stop" is concerning.

Is it just me, or was the addition of AI summaries basically predetermined? The AI panel probably would only be attended by a small portion of editors (introducing selection bias) and it's unclear how much of the panel was dedicated to simply promoting the concept.

I imagine the backlash comes from a much wider selection of editors.

[–] otp@sh.itjust.works 35 points 2 days ago

If I wanted an AI summary, I'd put the article into my favourite LLM and ask for one.

I'm sure some LLMs can even take a link directly.

And if Wikipedia wanted to include it directly into the site...make it a button, not an insertion.

[–] slacktoid@lemmy.ml 62 points 2 days ago

I like that they are listening to their editors, I hope they don't stop doing that.

[–] lapping6596@lemmy.world 19 points 2 days ago* (last edited 2 days ago) (1 children)

I get that the simple language option exists, and I definitely think I'm not qualified to really argue what Wikipedia should or should not do. But I wanted to share what my lemmy feed looked like when I clicked into this post, and I gotta say, I sorta get it.

[–] stabby_cicada@lemmy.blahaj.zone 20 points 2 days ago* (last edited 2 days ago)

The United States is transitioning into a post-literate society. Teaching kids to read was too hard, and had the ugly side effect of encouraging critical thinking, and that led to liberalism, or worse, Marxism.

So we're using technology to eliminate reading entirely. After all, if you can ask a LLM any question and get a simple answer read to you out loud in simple vocabulary, what more do you need? Are you going to read for pleasure? To fact check? To better yourself? Sounds like ivory tower liberal elitism to me.

[–] drspod@lemmy.ml 38 points 2 days ago (14 children)

Who at Wikimedia is so out of touch that they thought that this was a good idea? They need to be replaced.

[–] genuineparts@infosec.pub 13 points 2 days ago

Who could have known, in this day and age, that this would be met with backlash? Truly an unprecedented occurrence.

[–] altkey@lemmy.dbzer0.com 14 points 2 days ago

I mean, the LLM thing has a proper field for deployment - it can handle the translation of articles that just don't exist in your language. But it should be a button a person clicks with their consent, not an article they get by default, and not content presented under Wikipedia's own name. Nowadays, it's done by browsers themselves and their extensions.

[–] baltakatei@sopuli.xyz 27 points 2 days ago

The main issue I have as an editor is that there is no straightforward way to retrain the LLM to correct faulty training as directly or revertably as the existing method of editing an article's wikicode. Already, much of my time updating Wikipedia is spent parsing puffery and removing phrases like “award-winning” or “renowned”, inserted by malicious advertisers trying to use Wikipedia as a free billboard. If a Wikipedia LLM began making subjective claims instead of providing objective facts backed by citations, I would have to teach myself machine learning and get involved with the developers who manage the LLM's training. That raises the bar for editor technical competency which Wikipedia historically has been striving to lower (e.g. Visual Editor).

[–] danc4498@lemmy.world 13 points 2 days ago (1 children)

On the one hand, it’s insulting to expect people to write entries for free only to have AI just summarize the text and have users never actually read those written words.

On the other hand, the future is people copying the url into chat gpt and asking for a summary.

The future is bleak either way.

[–] winkerjadams@lemmy.dbzer0.com 26 points 2 days ago (3 children)

On the third hand some of us just want to be able to read a fucking article with information instead of a tiktok or ai generated garbage. That's wikipedia, at least it used to be before this garbage. Hopefully it stays true

[–] Quik@infosec.pub 24 points 2 days ago (8 children)

Summaries for complex Wikipedia articles would be great, especially for people less knowledgeable of the given topic, but I don't see why those would have to be AI-generated.

[–] theunknownmuncher@lemmy.world 110 points 2 days ago (1 children)

the Top section of each wikipedia article is already a summary of the article

[–] TheTechnician27@lemmy.world 75 points 2 days ago* (last edited 2 days ago)

Fucking thank you. Yes, experienced editor to add to this: that's called the lead, and that's exactly what it exists to do. Readers are not even close to starved for summaries:

  • Every single article has one of these. It is at the very beginning – at most around 600 words for very extensive, multifaceted subjects. 250 to 400 words is generally considered an excellent window to target for a well-fleshed-out article.
  • Even then, the first sentence itself is almost always a definition of the subject, making it a summary unto itself.
  • And even then, the first paragraph is also its own form of summary in a multi-paragraph lead.
  • And even then, the infobox to the right of 99% of articles gives easily digestible data about the subject in case you only care about raw, important facts (e.g. when a politician was in office, what a country's flag is, what systems a game was released for, etc.)
  • And even then, if you just want a specific subtopic, there's a table of contents, and we generally try as much as possible (without harming the "linear" reading experience) to make it so that you can intuitively jump straight from the lead to a main section (level 2 header).
  • Even then, if you don't want to click on an article and just instead hover over its wikilink, we provide a summary of fewer than 40 characters so that readers get a broad idea without having to click (e.g. Shoeless Joe Jackson's is "American baseball player (1887–1951)").

What's outrageous here isn't wanting summaries; it's that summaries already exist in so many ways, written by the human writers who write the contents of the articles. Not only that, but as a free, editable encyclopedia, these summaries can be changed at any time if editors feel like they no longer do their job somehow.

This not only bypasses the hard work real, human editors put in for free in favor of some generic slop that's impossible to QA, but it also bypasses the spirit of Wikipedia that if you see something wrong, you should be able to fix it.

[–] MysticKetchup@lemmy.world 17 points 2 days ago

Yeah this screams "Let's use AI for the sake of using AI". If they wanted simpler summaries on complex topics they could just start an initiative to have them added by editors instead of using a wasteful, inaccurate hype machine

[–] TropicalDingdong@lemmy.world 16 points 2 days ago

Wikipedia is already the processed food of more complex topics.

[–] vane@lemmy.world 11 points 2 days ago (2 children)

I bet they will try again.

[–] madjo@feddit.nl 16 points 2 days ago (3 children)

Good! I was considering stopping my monthly donation. They better kill the entire "machine-generated" nonsense instead of just pausing, or I will stop my pledge!

[–] jeeva@lemmy.world 8 points 2 days ago

If they have enough money to burn on LLM results, they clearly have enough and I don't need to keep donating mine.
