Apple has invited anyone to probe the security of its Private Cloud Compute (PCC) system, which handles the resource-intensive tasks behind Apple Intelligence queries. The company is prepared to pay up to $1 million for vulnerabilities discovered in it.
Apple has repeatedly emphasized that many Apple Intelligence features run locally, without data ever leaving your Mac, iPhone, or other device. More complex requests, however, are still sent to PCC servers built on Apple Silicon chips and running a purpose-built operating system. Many AI application developers process queries in cloud infrastructure but give users no way to evaluate how secure those cloud operations are. Apple, which has traditionally declared user privacy a priority, has called on independent experts to verify the security of its systems.
To make this possible, the company has provided:
- A security guide with technical details of how PCC operates;
- A Virtual Research Environment that allows you to analyze PCC on your own machine; it requires a Mac with an Apple Silicon chip, 16 GB of memory, and the latest macOS Sequoia 15.1 Developer Preview;
- Source code published on GitHub for “certain key PCC components that help implement security and privacy requirements.”
As part of the bug bounty program, Apple is offering rewards from $50,000 to $1 million for errors found in its products. iOS 18.1, which will debut the first Apple Intelligence features, is expected to be released next week. The developer beta of iOS 18.2, which adds Genmoji and ChatGPT integration, was released yesterday.