At its annual Worldwide Developers Conference (WWDC) on June 10th, 2024, Apple revealed its new artificial intelligence system called “Apple Intelligence.” This AI upgrade will add smart features to the iPhone, iPad, and Mac operating systems, powered by Apple’s own technology combined with AI tools from OpenAI.
Apple’s AI appears, at least in its initial releases, more limited than Microsoft’s extensive, system-wide implementations.
The main capabilities of Apple Intelligence will include summarizing information, suggesting replies to messages, and giving Siri deeper control over apps. However, Apple is focusing on AI with wide, mainstream appeal rather than cutting-edge features such as AI-generated images or video.
To protect user privacy while delivering powerful AI, Apple says it will use a special algorithm that automatically decides whether to process AI tasks locally on the device itself or send them to Apple’s cloud servers. Any data processed remotely will run on servers built with Apple’s M2 chips and Secure Enclaves, which Apple claims keeps cloud processing as private as on-device processing.
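To make that local-versus-cloud decision concrete, here is a minimal Swift sketch of how such a routing step could work in principle. Every name in it (AITask, DeviceCapabilities, route, the memory threshold) is a hypothetical illustration, not Apple’s actual algorithm or API.

import Foundation

// Hypothetical illustration only: none of these types, fields, or rules are
// Apple's actual Apple Intelligence API or algorithm.

enum ExecutionTarget {
    case onDevice
    case privateCloud
}

struct AITask {
    let name: String
    let estimatedModelSizeMB: Int      // rough memory footprint of the model the task needs
    let containsPersonalContext: Bool  // does the task read the user's personal data?
}

struct DeviceCapabilities {
    let availableMemoryMB: Int
    let hasNeuralEngine: Bool
}

/// Prefer running on the device; fall back to the cloud only when the local
/// hardware cannot fit the model. Note that in this sketch, personal context
/// does not block cloud routing, which is exactly the concern raised below.
func route(_ task: AITask, on device: DeviceCapabilities) -> ExecutionTarget {
    let fitsLocally = device.hasNeuralEngine
        && task.estimatedModelSizeMB <= device.availableMemoryMB
    return fitsLocally ? .onDevice : .privateCloud
}

// Example: a small summarization task stays local, a large rewrite goes to the cloud.
let device = DeviceCapabilities(availableMemoryMB: 3_000, hasNeuralEngine: true)

let summarize = AITask(name: "Summarize notification",
                       estimatedModelSizeMB: 1_500,
                       containsPersonalContext: true)
let longRewrite = AITask(name: "Rewrite long document",
                         estimatedModelSizeMB: 8_000,
                         containsPersonalContext: true)

print(route(summarize, on: device))    // onDevice
print(route(longRewrite, on: device))  // privateCloud

The point of the sketch is that the routing rule, not the user, decides when personal data leaves the device, which is the trust question this article returns to below.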
Bloomberg also reports that Apple will not create profiles of users based on their data, and that Apple will publish transparency reports to show it does not sell or even view individuals’ information.
The most advanced Apple Intelligence features may require an iPad or Mac with an M1 chip or newer. On iPhone, they could be limited to the iPhone 15 Pro models or the upcoming iPhone 16, due later in 2024. Importantly, users will be able to opt in to these AI capabilities rather than having them forced on by default.
By making AI an opt-in choice and restricting how much data processing happens in the cloud, Apple seems to be positioning itself as a privacy-focused alternative to Microsoft’s more invasive AI approach. While not as transparent or user-controlled as open source Linux, Apple’s carefully regulated AI may appeal to those concerned about corporate overreach into personal data.
Apple Intelligence Choosing What Data to Send to the Cloud Is Not Great for Privacy
While Apple claims its algorithm will securely handle what data gets processed locally versus in the cloud, having that decision made by an opaque system raises privacy concerns. Users may not have full visibility into why certain tasks are deemed safe for on-device AI while others require sending personal information to Apple’s servers. This “black box” approach could undermine trust in how private their data really is.
The Best Path to Privacy and Digital Freedom Is to Switch to Computers That Run Linux and FOSS Software
For those who truly value digital privacy and freedom, the best path is to use computers running open source Linux distributions and free/open source software (FOSS). On Linux and with FOSS programs like LibreOffice, there is complete transparency about what code is running on your machine and what data it accesses. You maintain full control instead of having to trust assurances from tech giants like Apple or Microsoft. While perhaps less convenient than mainstream platforms, embracing open source computing is the strongest way to protect your digital rights and civil liberties in an increasingly AI-driven world.