this post was submitted on 06 Sep 2023
I do wonder why Cortana, Siri, Alexa, and Google Assistant are lagging so far behind these LLMs. I personally don't use them for anything other than setting timers, but it's annoying to even consider using them and then realize they're not nearly as usable or helpful as ChatGPT.
The big thing that’s holding Apple back regarding Siri is that they aim to have all their AI-driven functions processed on the user’s hardware, for security/privacy. So they not only need the software component, they want to have the hardware capable of running it inside the individual phones.
eh... sounds like privacy theater to me. Only the audio transcription may be processed on the device.
src
They might aim to have a full blown LLM on the device, but it'll never be as good as the others with these limitations.
Many teams are currently working on striking the right balance between fine-tuning and model size. Most aren't targeting phones yet, but rather PCs running off-network.
It is entirely possible to have an LLM run "closed loop", but obviously Google and Apple want in that loop
For "impressive" general reasoning and conversation, these LLMs currently require pretty beefy hardware. You're either lugging a GPU around or calling out to an API.
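To see why the hardware matters, here's some back-of-envelope arithmetic (a rough sketch; the 7B parameter count and bit widths are illustrative assumptions, and this counts only the weights, not activations or KV cache):

```python
def weights_gib(params_billion: float, bits_per_param: int) -> float:
    """Approximate memory (GiB) needed just to hold a model's weights,
    ignoring activations, KV cache, and runtime overhead."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# A hypothetical 7B-parameter model at 16-bit precision:
print(f"7B @ 16-bit: {weights_gib(7, 16):.1f} GiB")  # ~13 GiB
# The same model quantized to 4 bits per weight:
print(f"7B @ 4-bit:  {weights_gib(7, 4):.1f} GiB")   # ~3.3 GiB
```

Even aggressively quantized, that's a big chunk of a phone's RAM, which is why on-device assistants either use much smaller models or fall back to an API.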
Aren't these current personal assistants already relying on API calls for their responses?
Like Siri? Yes, though my point pertained to the hardware needed for LLMs specifically.
I know Apple's developing their own LLM, which will hopefully be used in Siri. There's no guarantee, but I can't imagine it would be too hard to add Bard into Google Assistant. Cortana, on the other hand, was canceled by Microsoft and is being replaced by Bing Chat. I believe Amazon is also halting Alexa development.
They're working on it, but it takes time. Especially making it reliable.
The current crop of LLMs will happily answer with nonsense or even do dangerous things.