this post was submitted on 30 Jan 2024
503 points (93.4% liked)

Technology

[–] [email protected] 208 points 9 months ago (21 children)

If you paste plaintext passwords into ChatGPT, the problem is not ChatGPT; the problem is you.

[–] [email protected] 65 points 9 months ago (14 children)

Well tbf, ChatGPT also shouldn't remember and then leak those passwords lol.

[–] [email protected] 60 points 9 months ago (4 children)

Did you read the article? It didn't. Someone received someone else's chat history appended to one of their own chats. No prompting, just appeared overnight.

[–] [email protected] 35 points 9 months ago (1 children)

........ That shouldn't be happening, regardless of chat content.

[–] [email protected] 9 points 9 months ago (1 children)

Well, yeah, but the point is, ChatGPT didn't "remember and then leak" anything, the web service exposed people's chat history.

[–] [email protected] 2 points 9 months ago

Well, that depends. Do you mean GPT, the specific chunk of LLM code? Or do you mean GPT, the website and service?

Because while that nitpicked distinction matters to the programmers fixing it, how much does it matter to you or me, the laypeople using the site?
