this post was submitted on 31 Aug 2023
596 points (97.9% liked)

Technology

59658 readers
2632 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
 

I'm rather curious to see how the EU's privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn't have a paywall)

[–] [email protected] 8 points 1 year ago (1 children)

This is an article about unlearning data, not about not consuming it in the first place.

LLMs are not storing learned data in its raw, original form. They are ingesting it and building an understanding of language based on it.

Attempting to peel that knowledge back out would be incredibly difficult, if not impossible, because there's really no way to identify where it lives in the model.
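To make the point concrete, here's a minimal sketch (my own toy example, not anything from the article): even a two-parameter linear model trained by gradient descent smears every training example's influence into the same shared parameters, so "deleting" one example's contribution after the fact amounts to retraining without it.

```python
# Toy illustration: a 2-parameter linear model (y ≈ w*x + b) trained on
# three (x, y) pairs. Each gradient step folds every example's influence
# into the SAME two numbers, so no parameter "contains" any one example.

def train(data, steps=2000, lr=0.01):
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in data:
            err = (w * x + b) - y   # prediction error on this example
            w -= lr * err * x       # every example nudges the same w
            b -= lr * err           # ...and the same b
    return w, b

full = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]
w_full, b_full = train(full)

# "Unlearning" the middle example exactly means retraining without it;
# there is no separable component of (w, b) you could simply delete.
w_wo, b_wo = train([full[0], full[2]])

print(w_full, b_full)
print(w_wo, b_wo)
```

Scale that entanglement up to billions of parameters and trillions of tokens, and per-example removal without full retraining is an open research problem (which is what "machine unlearning" papers like the one in the article try to address).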

[–] [email protected] 4 points 1 year ago (2 children)

And we're saying that if peeling out knowledge that someone has a right to have forgotten is difficult or impossible, that knowledge should not have been used to begin with. If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo, let me find a Lilliputian to build a violin for me to play.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Okay, I get it, but that's a different argument. Starting fresh only gets you so far. Once an LLM exists and is exposed to the public, users can submit any data they like, and the LLM has no idea of its source.

You could argue, then, that these models shouldn't be able to use user-submitted data, but that would be a devastating restriction on the technology, and it starts to become a question of whether we want this tech to exist at all.

[–] [email protected] 0 points 1 year ago

If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo

A) this article isn't about a big tech company, it's about an academic researcher. B) he had consent to use the data when he trained the model. The participants later revoked their consent to have their data used.