this post was submitted on 13 Jun 2024
156 points (98.1% liked)
Technology
So you would rather submit your non-anonymized data? Because those bastards will find a way to de-anonymize it. Is Apple doing the right thing or not?
What? No. I would rather use my own local LLM where the data never leaves my device. And if I had to submit anything to ChatGPT I would want it anonymized as much as possible.
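For what "my own local LLM" can look like in practice, here's a rough sketch assuming a locally running Ollama server on its default port; the endpoint and model name are just illustrative, not anything Apple or OpenAI ships:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for a local generation request.

    Everything stays on this machine: the request only ever targets
    localhost, so the prompt is never shipped to a third-party API.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(prompt: str) -> str:
    """POST the prompt to the local server and return the response text."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The privacy property here is structural: there's no API key and no remote hostname anywhere in the code path.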
Is Apple doing the right thing? Hard to say; any answer here will just be an opinion. There are pros and cons to this decision, and it's up to the end user to decide whether the benefits of using ChatGPT are worth the cost of their data. I can see some useful use cases for this tech, and I don't blame Apple for wanting to strike while the iron is hot.
There's not much you can really do to strip out identifying data from prompts/requests made to ChatGPT. Any anonymization of that part of the data is on OpenAI to handle.
Apple can obfuscate which user is asking for what, as well as specific location data, but if I'm using the LLM and I tell it to write up a report while including my full name in my prompt/request... that all goes directly into OpenAI's servers and logs, which they can eventually use to help refine/retrain their model at some point.
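To illustrate why client-side anonymization of prompt content is a losing game: a scrubber can catch *patterned* identifiers, but it has no way to recognize an arbitrary name typed into free text. A toy sketch (the regexes are deliberately naive, not a real PII tool):

```python
import re

# Naive client-side scrubber: matches patterned identifiers only.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace patterned identifiers with placeholder tags."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

print(scrub("Email me at jane.doe@example.com or call 555-867-5309."))
```

A prompt like "write a report for Jane Doe" passes through completely untouched, because a name is just ordinary text. That's the gap only the receiving end (or the user) can close.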
Do you have proof they’re sending it to OpenAI?
I believe I heard it’s done on device or on iCloud servers then deleted.
I mean, that’s the claim at least
https://security.apple.com/blog/private-cloud-compute
IIRC they demonstrated an interaction with Siri where it asks the user for consent before enriching the data through ChatGPT. So yeah, that seems to mean your data is sent out (if you consent).
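The demoed flow boils down to a per-request consent gate before anything leaves the device. A minimal sketch of that control flow, with both handlers stubbed out as placeholders (nothing here is Apple's actual implementation):

```python
def handle_locally(prompt: str) -> str:
    # Placeholder for whatever on-device handling exists.
    return f"[local answer to: {prompt}]"

def send_to_chatgpt(prompt: str) -> str:
    # Placeholder for the external call that would leave the device.
    return f"[ChatGPT answer to: {prompt}]"

def answer(prompt: str, consent_fn) -> str:
    """Consent gate: the external service is only reached when the
    user explicitly approves this specific request."""
    if consent_fn(prompt):
        return send_to_chatgpt(prompt)
    return handle_locally(prompt)

# Declining consent keeps the request local:
print(answer("summarize my notes", lambda p: False))
```

The key design point is that consent is asked per request, not granted once globally.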