this post was submitted on 25 Mar 2024
346 points (98.3% liked)
Technology
you are viewing a single comment's thread
Almost every time I ask a direct question, the two AI answers directly contradict each other. Yesterday I asked if vinegar cuts grease. I got explanations both for why it's an excellent grease cutter and for why it doesn't work because it's an acid.
I think this will be a major issue with AI. Just because it was trained on a huge wealth of knowledge doesn't mean that it was trained on correct knowledge.
I don't see any reason why being trained on writing informed by correct knowledge would cause it to be correct frequently, unless you're expecting it to just lift sentences verbatim from the training data.
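That's basically why you get both answers: the model samples a plausible continuation rather than looking up a stored fact, and if the training data argues both sides, both completions carry real probability. A toy sketch of that idea (the question, completions, and weights here are all made up for illustration, not taken from any real model):

```python
import random

# Toy illustration only: a language model samples from a distribution over
# plausible continuations instead of retrieving "the" answer. The completions
# and probabilities below are invented for this example.
completions = {
    "Yes, vinegar is an excellent grease cutter.": 0.55,
    "No, vinegar is an acid and doesn't cut grease well.": 0.45,
}

def answer(prompt: str) -> str:
    """Sample one completion for the prompt, weighted by probability."""
    texts = list(completions)
    weights = list(completions.values())
    return random.choices(texts, weights=weights, k=1)[0]

# Ask the same question a few times and get contradictory answers.
for _ in range(3):
    print(answer("Does vinegar cut grease?"))
```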