this post was submitted on 21 Feb 2024
289 points (95.0% liked)


ChatGPT has meltdown and starts sending alarming messages to users: AI system has started speaking nonsense, talking Spanglish without prompting, and worrying users by suggesting it is in the room with them

[–] [email protected] 2 points 8 months ago (1 children)

The person who commented below kind of has a point. While I agree that there's nothing special about LLMs, an argument can be made that consciousness (or perhaps more precisely, ego) is itself an emergent mechanism that works to keep itself in predictable patterns in order to perpetuate survival.

The point being that the ability to predict outcomes is a cornerstone of intelligence as we currently understand it (socially, emotionally, and scientifically speaking).

If you were to say that LLMs are unintelligent because they operate to produce the most likely and therefore most predictable output, then I'd agree completely.
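
For anyone curious what "most likely output" means mechanically, here's a minimal sketch of greedy next-token prediction. It assumes the Hugging Face `transformers` library and the public `gpt2` checkpoint (neither is mentioned in the thread, and production chatbots typically sample rather than decode greedily), so treat it as an illustration of the idea, not how ChatGPT itself is run:

```python
# Greedy next-token prediction: at each step the model scores every token in
# its vocabulary and we simply keep the single most probable one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")   # small public model, for illustration only
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The ability to predict outcomes is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(10):
    with torch.no_grad():
        logits = model(input_ids).logits             # scores for every possible next token
    next_id = logits[0, -1].argmax()                 # greedy choice: the most predictable continuation
    input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Swapping the `argmax` for temperature-based sampling is what makes real deployments less repetitive, but the underlying objective is the same: pick continuations according to how likely they are.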

[–] [email protected] 2 points 8 months ago

The ability to make predictions is not sufficient evidence of consciousness. Practically anything that's alive can do that to one degree or another.