This post was submitted on 14 Aug 2024
-62 points (17.0% liked)

Technology

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago) (1 children)

These things are like arguing about whether or not a pet has feelings...

I'd say it's far more likely for a cat or a dog to have complex emotions and thoughts than for the human-made LLM to actually be thinking. It seems to me like the naivety of humankind that we even think we might have created something with consciousness.

I'm in the camp that thinks LLMs are by and large a huge grift built on extreme exaggeration of the facts (even if they can produce useful output for certain tasks), but maybe I'm wrong.

[–] [email protected] -3 points 2 months ago

These things are like arguing about whether or not a pet has feelings...

Mhm. And what's fundamentally wrong with such an argument?

I'd say it's far more likely for a cat or a dog to have complex emotions and thoughts than for the human-made LLM to actually be thinking.

Why?

I'm in the camp that thinks LLMs are by and large a huge grift built on extreme exaggeration of the facts (even if they can produce useful output for certain tasks), but maybe I'm wrong.

Why?

I too see how grifters use AI to further their scams. That's the case with any new tech that pops up. This, however, doesn't make LLMs not interesting.