this post was submitted on 25 Aug 2024

Technology


Language models generate text based on statistical probabilities. In this case, that led Microsoft's Copilot to level serious false accusations against a veteran court reporter.
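To make the "statistical probabilities" point concrete, here is a toy sketch (not a real LLM; the tokens and probabilities are invented for illustration) of text generation as repeated sampling from a conditional next-token distribution:

```python
import random

# Toy next-token distribution: invented probabilities for illustration only.
# A real model learns such distributions from training data.
next_token_probs = {
    ("the", "reporter"): {"covered": 0.6, "wrote": 0.3, "committed": 0.1},
}

def sample_next(context, probs, rng):
    """Sample one continuation token according to its probability."""
    dist = probs[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random()
token = sample_next(("the", "reporter"), next_token_probs, rng)
# The sampler has no notion of truth: "committed" can be emitted
# simply because it is statistically plausible in context.
```

The point of the sketch is that nothing in this loop checks facts; a damaging continuation is just another plausible sample, which is exactly how a reporter who covered trials can end up described as having committed the crimes.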

nicerdicer@feddit.org 3 points 3 weeks ago (last edited 3 weeks ago)

This is really bad. It shows that LLMs cannot be trusted, since there is no corrective authority above them. According to this video (German), Microsoft, which was questioned about the incident, claims that the Copilot feature should be seen as entertainment rather than as a search engine or any kind of serious source. Although the search results concerning the victim were deleted after he reached out to Microsoft, the same false results reappeared a few days later.

With the rise of fake news over the last decade, combined with a growing lack of media literacy, such a feature can destroy lives, especially when people tend to ignore the sources. A victim has hardly any way to prove that these claims are wrong and merely the output of a hallucinating LLM. And even if they could, the internet doesn't forget: fake news keeps circulating just like legitimate news.

Edit: According to Copilot, among the crimes he supposedly committed is child molestation. That makes it even worse!