this post was submitted on 12 Nov 2024
1057 points (96.6% liked)

Technology

[–] NaibofTabr@infosec.pub 252 points 2 weeks ago (8 children)

Is "dragged" the new "slammed"?

[–] sunzu2@thebrainbin.org 65 points 2 weeks ago (1 children)

Gen Z journalism entered the chat?

[–] AdamEatsAss@lemmy.world 86 points 2 weeks ago (3 children)

Reporters threw Elon Musk off Hell In A Cell, and plummeted 16 ft through an announcer's table after his chatbot admitted he spread lies.

[–] WindyRebel@lemmy.world 39 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Holy fuck. I miss shittymorph just for his creative responses using this.

[–] hemmes@lemmy.world 26 points 2 weeks ago

Yeah man. Those were the good ol’ days, when X was still called Twitter lol. Musk was absolutely spreading misinformation back when it was still Twitter too, before he even owned it. I remember when he started talking complete rubbish about Dogecoin, making its price oscillate all over the place that whole week. One of his fanboys bought in…like, hard. A 30-something-year-old who put his whole life savings into Doge at its peak, only to lose it all the night it was revealed that in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table.

[–] RedditRefugee69@lemmynsfw.com 13 points 2 weeks ago (4 children)

Chat, is Elon cooked? No cap?

[–] pHr34kY@lemmy.world 21 points 2 weeks ago

Where I'm from, "dragged" means to be removed against your will.

You know, like "the pitcher got dragged after the first inning".

[–] CheeryLBottom@lemmy.world 19 points 2 weeks ago

It's a refreshing change of pace

[–] cdf12345@lemm.ee 13 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

Yeah, you know, like “Dragon Deez”

[–] zqps@sh.itjust.works 13 points 2 weeks ago (2 children)

I was hoping a horse was involved.

[–] glimse@lemmy.world 10 points 2 weeks ago (5 children)

I feel like "dragged" predates "slammed" as slang, but it definitely wasn't popular headline material.

[–] db2@lemmy.world 156 points 2 weeks ago (40 children)

Implying he gives a shit. The thing about people who lack any empathy is that they're immune to embarrassment, even when they're the most embarrassing human on the planet.

[–] Sam_Bass@lemmy.world 69 points 2 weeks ago (5 children)

misinformation? just call it lies. reads easier and just as accurate.

[–] lvxferre@mander.xyz 36 points 2 weeks ago (3 children)

Even more accurately: it's bullshit.

"Lie" implies that the person knows the truth and is deliberately saying something that conflicts with it. However, the sort of people who spread misinfo don't really care what's true or false; they only care about whether it reinforces their claims.

[–] andyortlieb@lemmy.sdf.org 56 points 2 weeks ago (2 children)

Chatbots can't "admit" things. They regurgitate text that just happens to be information a lot of the time.

That said, the irony is ironclad.

[–] MushuChupacabra@lemmy.world 56 points 2 weeks ago (9 children)

The ultra-powerful see us as NPCs, and nothing more.

Your anger is barely a pop-up window in the game they're playing.

[–] Zementid@feddit.nl 31 points 2 weeks ago (16 children)

Well, then they will have to train their AI with incorrect information... politically incorrect, scientifically incorrect, etc.... which renders the outputs useless.

Scientifically accurate and as close to the truth as possible never equals conservative talking points... because those talking points are scientifically wrong.

[–] LustyArgonianMana@lemmy.world 22 points 2 weeks ago (13 children)

And we have to ask ourselves WHY he'd want to spread misinformation. What is he trying to do?

[–] ATDA@lemmy.world 22 points 1 week ago

He lies to assert power. In his company, yes-men say yes because he pays their checks. To the rest of us he generally looks like a loon.

It's obvious even to a daft AI.

[–] sunzu2@thebrainbin.org 20 points 2 weeks ago (2 children)

In Texas, we call this lying... I don't know when the goalposts got moved, but these parasites have always been lying to us peons.

Why do peasants accept or listen to these clowns? They are your enemy; treat them as such.

But now... pleb has his daddy who is good, and other pleb's daddy is bad 🤡

"me daddy strong, me daddy kick ur daddy ass"

ADULT FUCKING PEOPLE IN 2024

[–] Cort@lemmy.world 17 points 2 weeks ago (3 children)

"I don't know when the goalposts got moved"

January 22nd, 2017, when Kellyanne Conway used the term "alternative facts".

[–] theluddite@lemmy.ml 15 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

This is an article about a tweet with a screenshot of an LLM prompt and response. This is rock fucking bottom content generation. Look, I can do this too:

Headline: ChatGPT criticizes OpenAI

[–] brucethemoose@lemmy.world 11 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

To add to this:

All LLMs absolutely have a sycophancy bias. It's what the model is built to do. Even wildly unhinged local ones tend to 'agree' or hedge, generally speaking, if they have any instruction tuning.

Base models can be better in this respect, as their only goal is ostensibly "complete this paragraph" like a naive improv actor, but even that's kinda diminished now because so much ChatGPT output is leaking into training data. And users aren't exposed to base models unless they're local LLM nerds.
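
A minimal sketch of that base-vs-instruct difference, assuming the Hugging Face transformers pipeline API (the model names are just illustrative stand-ins, not anything specified above):

```python
# Minimal sketch, assuming the Hugging Face "transformers" pipeline API.
# Model names are illustrative stand-ins.
from transformers import pipeline

prompt = "Misinformation spreads on social media because"

# Base model: pure next-token completion with no instruction tuning,
# so it just continues the text like the "naive improv actor" above.
base = pipeline("text-generation", model="gpt2")
print(base(prompt, max_new_tokens=40, do_sample=True)[0]["generated_text"])

# Instruction-tuned chat model: fine-tuned on human preference data,
# which is where the agreeable, hedging "sycophancy bias" tends to creep in.
chat = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
messages = [{"role": "user", "content": prompt + " ... right?"}]
print(chat(messages, max_new_tokens=40)[0]["generated_text"])
```

The chat pipeline wraps the message in a chat template and generates the "assistant" reply it was tuned to give, which is exactly where the agreeable tone comes from; the base model just keeps writing.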

[–] AceFuzzLord@lemm.ee 15 points 1 week ago (1 children)

Come on guys, this was clearly the work of the Demtards hacking his AI and making it call him names. We all know his superior intellect will totally save the world and make it a better place, you just gotta let him go completely unchecked to do it.

/s

[–] Roflmasterbigpimp@lemmy.world 13 points 2 weeks ago (2 children)

Damn, that's hard. And Melon Husk will soon be the new Chef of NASA!

[–] HawlSera@lemm.ee 12 points 2 weeks ago (4 children)

Actually, they made a new department of "Government Oversight" for him...

Which sounds scummy, but it's basically just a department that looks for places to cut the budget and reduce waste... not a bad idea, except it's right-wingers running it, so "food" would be an example of frivolous spending and "planes that don't fly" would be what they keep the cash flowing on.
