this post was submitted on 10 Jun 2024
Technology
you are viewing a single comment's thread
view the rest of the comments
You think your iPhone isn’t collecting data on you? Is that what you’re saying?
The phone is, Apple isn’t. They outline everything in the keynote if you are interested.
Their keynotes are irrelevant; official privacy policies and legal disclosures take precedence over marketing claims made in keynotes or presentations. Apple's privacy policy states that the company collects the data necessary to provide and improve its products and services. An OS-level AI would fall under that category, letting Apple collect data the AI processes in order to improve its functionality and models. Keynotes and marketing materials carry no legal weight when it comes to data practices. And because the AI operates at the OS level, it likely has access to a wide range of user data, including text inputs, conversations, and potentially other sensitive information.
Unless you are designing and fabricating your own chips for processing, networking, etc., privacy today is about trust, not technology. There's no escaping it. I know my iPhone, and Apple, are collecting data about me; I currently trust them the most with how they use it.
Running FOSS and taking control of your own network does far more for the privacy-vs-convenience trade-off than most people imagine.
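To make "taking control of your network" concrete: the simplest version is a hosts-file-style blocklist that keeps devices from resolving telemetry domains. This is just a sketch of the matching logic, and the hostnames in it are made-up examples, not a real blocklist:

```python
# Sketch: hosts-file-style domain blocking, the simplest form of
# network control. The telemetry hostnames below are hypothetical.
BLOCKLIST = {
    "telemetry.example.com",
    "metrics.example.net",
}

def is_blocked(hostname: str) -> bool:
    """True if the hostname or any parent domain is on the blocklist."""
    parts = hostname.lower().split(".")
    # Check the name itself and every parent domain, e.g.
    # a.telemetry.example.com -> telemetry.example.com -> example.com
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(is_blocked("a.telemetry.example.com"))  # True
print(is_blocked("example.org"))              # False
```

In practice this is what Pi-hole or a DNS sinkhole does network-wide, so every device benefits, including ones you can't install software on.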
There are degrees of trust though. You can trust the developers and people who audited the code if you have no skill/desire to audit it yourself, or you can trust just the developers.
And even closed systems' behavior can be monitored and analyzed.
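For example, even a fully closed-source device's traffic can be captured and summarized to see who it talks to. A minimal sketch of that kind of analysis, where the capture lines are fabricated stand-ins for real tcpdump output:

```python
from collections import Counter

# Sketch: summarizing which hosts a closed-source device contacts.
# These capture lines are fabricated examples, not real traffic.
capture = [
    "IP phone.local.54021 > 17.253.144.10.443: Flags [S]",
    "IP phone.local.54022 > 17.253.144.10.443: Flags [S]",
    "IP phone.local.54023 > 151.101.1.6.443: Flags [S]",
]

def destinations(lines):
    """Count destination IPs (text after '>', with the trailing port stripped)."""
    counts = Counter()
    for line in lines:
        dst = line.split("> ")[1].split(":")[0]   # e.g. "17.253.144.10.443"
        ip = dst.rsplit(".", 1)[0]                # drop the port suffix
        counts[ip] += 1
    return counts

print(destinations(capture))
```

This tells you nothing about what's inside encrypted payloads, but it does reveal endpoints and traffic volume, which is how closed systems get caught phoning home.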
Yes, definitely. Apple claims that their privacy measures can be independently audited and verified; we'll have to wait and see what's actually behind that claim.
How? The only way to verify that to a 100% degree is if it were open source, and I highly doubt Apple would do that, especially given its OS-level integration. At best, they'd probably offer a self-report mechanism, which would likely be proprietary and therefore not verifiable either.
They have designed a very extensive solution for Private Cloud Computing: https://security.apple.com/blog/private-cloud-compute/
All I have seen from security people reviewing it is that it will probably be one of the best solutions of its kind: they do almost everything correctly, and extensively so.
The only critique I've seen is that they could have provided more source code and easier ways for third parties to verify their claims, though it's understandable that they didn't.
As stated above, Private Cloud Compute has nothing to do with the OS-level AI itself. ರ_ರ That's in the cloud, not on the device.
As stated here, it still has the same problem of not being 100% verifiable: they only publish the code snippets they deem "security-critical," which doesn't let us verify how user data is actually handled.
Adding to that: if the on-device AI is compromised in any way, whether by an attacker or by Apple itself, then PCC is rendered irrelevant regardless of whether PCC is open source.
Additionally, I'll raise the issue that this entire post is just that: a blog post. Nothing stated there is legally binding, so any claims about how they handle user data can easily be dismissed as marketing.
Security and privacy in 2024 is unfortunately about trust, not technology, unless you are able to isolate yourself or design and produce all the chips you use yourself.
Yeah, and Apple is as untrustworthy as any other corporation; my point exactly. Idk about you, but I'll stick to what I can verify the security and privacy of for myself, e.g. Ollama, GrapheneOS, Linux, Coreboot, Libreboot/Canoeboot, etc.
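Part of why Ollama is verifiable in this sense: inference runs entirely against a local endpoint (http://localhost:11434 by default), so you can firewall or packet-capture it yourself. A sketch of talking to that local API with only the standard library; the model name is just an example, and `ask()` obviously requires a running Ollama instance:

```python
import json
import urllib.request

# Ollama's default local endpoint; no traffic leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a POST request for the local Ollama generate API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt: str) -> str:
    """Send the prompt to the local Ollama instance and return its reply."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Because everything goes to localhost, a single firewall rule (or pulling the network cable) is enough to prove nothing phones home, which is exactly the kind of verification you can't do with an opaque cloud service.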
Ok, I just don't see the relevance to this post then. Sure, you're free to rant about Apple in any thread you want; it's just not particularly relevant to AI, which was the technology in question here.
I hear good things about GrapheneOS but just stay away from it because of all the stranger. I love Olan’s.
We're discussing Apple's implementation of an OS level AI, it's entirely relevant.
GrapheneOS has technical merit and is completely open source; in fact, many of the security improvements in Android/AOSP came from GrapheneOS.
Who?
Lol thank you autocorrect. Ollama.