[–] [email protected] 5 points 5 months ago

I mean, it does learn, it just lacks reasoning, common sense or rationality.
What it learns is which words should come next, using a very complex and nuanced way of deciding that, which can very plausibly mimic the things it lacks, since the best sequence of next words is very often coincidentally reasoned, rational, or demonstrative of common sense. Sometimes it's just lies that fit the form of a good answer, though.

I've seen some people work on using it the right way, and it actually makes sense. It's good at understanding what people are saying and what type of response would fit best. So you let it decide that, and give it the ability to direct people to the information they're looking for, without actually trying to reason about anything. It doesn't know what your monthly sales average is, but it does know that a chart of data from the sales system, filtered to your user, a specific product, and a time range, is a good response in this situation.
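For what it's worth, here's roughly the shape of that in a stripped-down Python sketch. The model call is stubbed out, and the intent labels, dataset names, and filters are all made up for illustration; the point is just that the LLM picks *which* canned response fits, and plain code fetches the actual data:

```python
# Minimal sketch of the "LLM as router" idea: the model only classifies the
# request and picks a data view; deterministic code supplies the actual numbers.
# `call_llm`, the intent labels, and the filters are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class ChartRequest:
    dataset: str      # which system to query, e.g. "sales"
    user: str         # filter to the requesting user
    product: str      # specific product filter
    time_range: str   # e.g. "last_12_months"


def call_llm(prompt: str) -> str:
    """Hypothetical model call: returns only a label, never a number."""
    # In a real system this would hit your LLM provider; stubbed here so the
    # sketch stays runnable on its own.
    return "monthly_sales_chart"


def route(query: str, user: str) -> ChartRequest | str:
    intent = call_llm(
        "Classify this request into one of: monthly_sales_chart, "
        f"inventory_lookup, other. Request: {query}"
    )
    if intent == "monthly_sales_chart":
        # The model never computes the average; it only picked this view.
        # In a real system it would also extract these filters from the query;
        # they're hard-coded here to keep the example self-contained.
        return ChartRequest(dataset="sales", user=user,
                            product="widget-a", time_range="last_12_months")
    return "Sorry, I can only help with sales or inventory questions."


if __name__ == "__main__":
    print(route("What's my monthly sales average?", user="alice"))
```

The nice part of that split is that the model never touches the numbers, so it can't hallucinate a sales figure; the worst it can do is pick the wrong chart.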

The only issue with Google insisting on jamming it into the search results is that their entire product was already just providing pointers to the "right" data.

What they should have done was keep the "information summary" stuff to its "quick fact" lookup role and only let it look at Wikipedia and curated lists of trusted sources (Mayo Clinic, CDC, National Park Service, etc.), and then give it the ability to ask clarifying questions about searches, like "are you looking for product recalls, or recall as a product feature?", which would then disambiguate the query.
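A toy version of that flow might look like the sketch below. The domain allow-list, the ambiguous-term table, and the helper names are all invented for the example; in a real system the model would generate the clarifying question itself rather than read it from a hand-made dict:

```python
# Rough sketch of the approach above, with hypothetical helpers: the model is
# only allowed to summarize an allow-list of trusted sources, and ambiguous
# queries trigger a clarifying question instead of a guess.

TRUSTED_SOURCES = {"wikipedia.org", "mayoclinic.org", "cdc.gov", "nps.gov"}

AMBIGUOUS_TERMS = {
    # term -> clarifying question (hand-curated just for this example)
    "recall": "Are you looking for product recalls, or recall as a product feature?",
}


def clarify_or_none(query: str) -> str | None:
    """Return a clarifying question if the query contains an ambiguous term."""
    for term, question in AMBIGUOUS_TERMS.items():
        if term in query.lower():
            return question
    return None


def summarize_from_trusted(query: str, results: list[dict]) -> str:
    """Only pass allow-listed results to the summarizer; never the open web."""
    allowed = [r for r in results if r["domain"] in TRUSTED_SOURCES]
    if not allowed:
        return "No trusted source found; showing regular results instead."
    # A real call_llm(...) summarization step would go here; stubbed to keep
    # the sketch self-contained.
    return f"Summary built from: {', '.join(r['domain'] for r in allowed)}"


if __name__ == "__main__":
    q = "smart speaker recall"
    question = clarify_or_none(q)
    print(question or summarize_from_trusted(q, [
        {"domain": "cdc.gov", "url": "..."},
        {"domain": "randomblog.example", "url": "..."},
    ]))
```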