this post was submitted on 18 Sep 2024
444 points (94.2% liked)

Technology

59398 readers
2735 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech-related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
MODERATORS
(page 2) 50 comments
[–] [email protected] 10 points 1 month ago

Oh no. Anyways...

[–] [email protected] 10 points 1 month ago

Anyone who has made copies of videotapes knows what happens to the quality of each successive copy. You're not making a "treasure trove." You're making trash.
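
The generation loss this comment describes is easy to reproduce numerically. A toy sketch, not a model of any real VCR: the circular moving average below is a stand-in for the high-frequency detail an analog dub loses, and every name in it is made up for illustration.

```python
import math

def lossy_copy(signal, window=5):
    # One "dub": a circular moving average, a crude stand-in for the
    # high-frequency detail an analog copy loses each generation.
    n, half = len(signal), window // 2
    return [sum(signal[(i + j) % n] for j in range(-half, half + 1)) / window
            for i in range(n)]

def rms_error(a, b):
    # Root-mean-square difference between two equal-length signals.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# The "master tape": a pure tone.
master = [math.sin(2 * math.pi * i / 50) for i in range(1000)]

copy, errors = master, []
for _ in range(20):  # copy the copy, twenty times over
    copy = lossy_copy(copy)
    errors.append(rms_error(master, copy))

# Each successive dub drifts further from the master: errors only grows.
```

Every pass attenuates the tone a little more, so the distance from the master increases monotonically; the loss compounds instead of averaging out.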

[–] [email protected] 10 points 1 month ago (9 children)

Kind of like how true thoughts and opinions on complex topics get boiled down into digestible concepts for others to understand. Those people then perpetuate the concepts without understanding them, the meaning degrades, and we don't think anymore; we just repeat stuff in social media comments.

Side note... this article sucks and seems like it was AI-generated. Repetitive, and no author credit? It just says it was originally posted elsewhere.

Generative AI isn't in danger of being killed, as this clickbait title suggests... just hindered.

[–] [email protected] 9 points 1 month ago

Having now flooded the internet with bad AI content, AI is, not surprisingly, eating itself. Numerous projects that aren't AI are suffering too as the overall quality of text degrades.

[–] [email protected] 8 points 1 month ago (4 children)

Is it not relatively trivial to pre-vet content before they train on it? At least with AI-generated text, it should be.

[–] [email protected] 7 points 1 month ago* (last edited 1 month ago) (1 children)

If mainstream blogs are writing about it, what would make someone think that AI companies haven't thoroughly dissected the problem and are already working on filtering out AI fingerprints from the training data set? If they can make a sophisticated LLM, chances are they can find methods to XOR out generated content.
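
For what it's worth, the basic shape of such a filter is not exotic, whatever the labs actually do internally. A hedged sketch: `synthetic_score`, `Document`, and everything else here are hypothetical stand-ins, not any vendor's real pipeline.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Document:
    text: str

def filter_corpus(docs: List[Document],
                  synthetic_score: Callable[[str], float],
                  threshold: float = 0.5) -> List[Document]:
    # Keep only documents the detector considers likely human-written.
    # `synthetic_score` stands in for any AI-text classifier that
    # returns a probability-like score in [0, 1].
    return [d for d in docs if synthetic_score(d.text) < threshold]

# Dummy detector for illustration only: flags one obvious tell.
def dummy_score(text: str) -> float:
    return 0.9 if "as an ai language model" in text.lower() else 0.1

corpus = [Document("The mitochondria is the powerhouse of the cell."),
          Document("As an AI language model, I cannot do that.")]
clean = filter_corpus(corpus, dummy_score)
# clean now holds only the first document
```

The thresholding is trivial; the hard part is a `synthetic_score` that is actually reliable at web scale, which is precisely what's in dispute.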

[–] [email protected] 9 points 1 month ago (1 children)

What would make me think that they haven't "thoroughly dissected" it yet is that I'm a skeptic, and since I'm a skeptic I don't immediately and without evidence believe that every industry is capable of identifying, dissecting, and solving every problem with its products.

[–] [email protected] 7 points 1 month ago

It's like a human centipede where only the first person is a human and everyone else is an AI. It's all shit, but it gets a bit worse every step.

[–] [email protected] 6 points 1 month ago

Deep-fried AI art sucks and is a decade late to the party

[–] [email protected] 5 points 1 month ago

Good riddance.

[–] [email protected] 5 points 1 month ago (1 children)

I was very interested in the thumbnail of this post, so I did a little digging and found this: the PDF of the paper the whole picture is from.

[–] [email protected] 3 points 1 month ago

Wow, it's amazing that just 3.3% of the training set coming from the same model can already start to mess it up.
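
One proposed mechanism for that sensitivity is tail loss: each generation samples from a truncated version of the previous distribution, so rare content disappears and never comes back. A toy sketch under stated assumptions, not the paper's actual experiment: the 100-token Zipf "vocabulary" and the top-p truncation standing in for sample-then-retrain are both made up for illustration.

```python
def truncate_top_p(probs, p=0.9):
    # Keep the smallest set of highest-probability tokens whose mass
    # reaches p, then renormalize -- a crude stand-in for nucleus
    # sampling followed by retraining on the model's own output.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = {}, 0.0
    for tok, pr in ranked:
        kept[tok] = pr
        total += pr
        if total >= p:
            break
    return {tok: pr / total for tok, pr in kept.items()}

# A toy 100-token vocabulary with a Zipf-like long tail.
weights = {f"tok{i}": 1.0 / i for i in range(1, 101)}
z = sum(weights.values())
dist = {t: w / z for t, w in weights.items()}

sizes = [len(dist)]
for _ in range(30):  # thirty generations of training on own output
    dist = truncate_top_p(dist)
    sizes.append(len(dist))

# The set of tokens the model can still produce only ever shrinks.
```

Because truncation can drop tokens but never restore them, the support shrinks monotonically until only the most common tokens survive, which is the flavor of collapse the paper measures.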

[–] [email protected] 5 points 1 month ago (1 children)

"Model collapse" is just a fancy way of saying "our stupid ideas are bad and nobody wants them."

[–] [email protected] 3 points 1 month ago (1 children)

Usually we get an AI winter until somebody develops a model that can overcome the limitation of needing more and more data; in this case, for example, by having some basic understanding instead of just being a regurgitation engine. Of course, that model then runs into the limit of only having basic understanding, not advanced understanding, and again there is an AI winter.

[–] [email protected] 3 points 1 month ago (1 children)

Our wetware neural networks probably aren't supposed to engage with synthetic content like this either. In a few years we're gonna learn that overexposure to AI-generated content creates some sort of neurological problem in people, like a real-world "nerve attenuation syndrome" (Johnny Mnemonic).

[–] [email protected] 3 points 1 month ago

Sooner or later it's supposed to happen, but I don't think we're quite there... yet.
