this post was submitted on 30 Jan 2024
83 points (86.7% liked)

Technology


ChatGPT is leaking passwords from private conversations of its users, Ars reader says | Names of unpublished research papers, presentations, and PHP scripts also leaked.

top 10 comments
[–] Lemminary@lemmy.world 22 points 7 months ago* (last edited 7 months ago) (2 children)

No it's not, it's the site. Please stop reposting this clickbait or at least fix the title.

[–] Sanctus@lemmy.world 9 points 7 months ago (2 children)

It would have had to have been trained on their passwords and shit for this to be even possible. It can't even remember the story points it gave me for a DnD session within the same chat. No way is it spitting out passwords fed to it by one user to another, because it's not storing them.
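This is the key point: chat LLM endpoints are stateless, so any "memory" exists only because the client resends the conversation with each request. A conceptual sketch (a stand-in function, not the real OpenAI API) of why nothing carries over between requests:

```python
# Conceptual sketch (hypothetical stand-in, NOT the real OpenAI API):
# a stateless chat endpoint can only "know" what appears in the
# messages list it was handed on this one request.

def fake_chat_completion(messages):
    """Pretend model reply that just echoes what context it received."""
    seen = " ".join(m["content"] for m in messages)
    return f"(model sees only: {seen!r})"

# Turn 1: the client sends the conversation so far.
history = [{"role": "user", "content": "My story has 12 plot points."}]
reply1 = fake_chat_completion(history)

# A fresh request that omits that history has no access to it --
# the "12 plot points" never persisted server-side between calls.
reply2 = fake_chat_completion([{"role": "user", "content": "How many plot points?"}])
```

In this toy model, `reply1` contains the "12" because it was resent in the request, while `reply2` does not, which is why one user's passwords don't leak to another user's chat at inference time.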

[–] cheese_greater@lemmy.world 7 points 7 months ago (1 children)

It would have had to have been

Wow, never realized we had such a weird grammatical construction. What tense is that even called?

[–] jcg@halubilo.social 6 points 7 months ago* (last edited 7 months ago) (1 children)

It's not a single tense (would have: past conditional; had to: past modal; have been: pluperfect); it's a hypothetical past state caused by a hypothetical past event, but the trick here is that the "past state" part is omitted because it's understood from context. With full context it would read: "If it was spitting out sensitive information, it would have had to have been trained on it."

Take that, ESL learners!

[–] TheRealKuni@lemmy.world 1 points 7 months ago

"If it was spitting out sensitive information, it would have had to have been trained on it."

I can make it even worse, because the first part of this should be in the subjunctive mood. So it should be:

“If it were spitting out sensitive information, it would have had to have been trained on it.”


[–] notfromhere@lemmy.ml 7 points 7 months ago (1 children)

I’m not following… what site is leaking the information?

[–] kurwa@lemmy.world 3 points 7 months ago

chat.openai.com, I'm assuming. But the article even says that OpenAI looked into it, and they think it's someone stealing the guy's account and using it, not other users' conversations being seen by him.

[–] 1984 10 points 7 months ago

So people post their private stuff to ChatGPT? I always edit out the personal data.

[–] hedgehog@ttrpg.network 4 points 7 months ago

The user, Chase Whiteside, has since changed his password, but he doubted his account was compromised. He said he used a nine-character password with upper- and lower-case letters and special characters.

Yes, because obviously a nine-character password that's probably a word or two with special characters swapped in, and no mention of 2FA, is sooo secure /s

To be clear, I’m not saying that means his account was compromised. That bit just stuck out to me.
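For a sense of scale, here's the arithmetic behind that skepticism. The numbers below are illustrative assumptions (94 printable ASCII characters, an offline attacker at 10 billion guesses per second against a fast hash), not figures from the article:

```python
import math

# Illustrative assumptions: 9 characters drawn from the 94 printable
# ASCII characters (upper, lower, digits, specials).
charset = 94
length = 9
keyspace = charset ** length  # total possible passwords, ~5.7e17

# Hypothetical offline attacker: 10 billion guesses/second
# (ballpark for a GPU rig against a fast unsalted hash).
guesses_per_second = 10_000_000_000
worst_case_days = keyspace / guesses_per_second / 86_400

print(f"keyspace: {keyspace:.2e}")
print(f"entropy:  {math.log2(keyspace):.1f} bits")   # ~59 bits
print(f"worst-case crack time: {worst_case_days:.0f} days")  # ~663
```

About 59 bits is decent against online guessing but within reach of a determined offline attack, and the figure assumes a truly random password; "a word or two with special characters swapped" has far less effective entropy than this upper bound.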