Apple has added a new feature in iOS 18.2 aimed at protecting the safety and mental health of minors. When the system detects a nude image, it blurs it, displays a warning, and requires confirmation – entering a passcode – before the material can be viewed. At the same time, the feature does not compromise end-to-end encryption and does not introduce a backdoor for authorities.
The feature is built on on-device machine learning. When it detects nudity, it automatically applies a blur effect, shows a message that the content may be sensitive, and offers options for what to do next: leave the personal or group conversation, block the contact, or open online safety resources. There is also an option to send an alert to a parent or guardian.
On iPhone and iPad, the feature analyzes images in Messages, AirDrop, Contact Posters, FaceTime video messages and “certain third-party apps” that can share photos or videos. Similar capabilities are coming to the Mac, Apple Watch and the Vision Pro headset – they require iOS 18, iPadOS 18, macOS 15 Sequoia or visionOS 2. The feature debuted in Australia, where the authorities intend to adopt regulations obliging tech giants to police terrorism-related and child abuse material – “where technically feasible.”
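The article does not describe the mechanism behind those third-party integrations, but Apple does ship a public SensitiveContentAnalysis framework (iOS 17 and later) that lets an entitled app run the same kind of on-device nudity check before displaying received media. Purely as an illustration – assuming an app holds the required entitlement and the user has enabled sensitive-content warnings – a SwiftUI view might blur a flagged image roughly like this; the view and its blur-until-confirmed behavior are hypothetical, not Apple's implementation of the feature described above.

```swift
import SwiftUI
import SensitiveContentAnalysis

/// Minimal sketch (hypothetical view, not Apple's implementation):
/// check a received image with the on-device SensitiveContentAnalysis
/// framework and blur it if the analyzer flags it as sensitive.
struct IncomingImageView: View {
    let imageURL: URL
    @State private var isSensitive = false

    var body: some View {
        AsyncImage(url: imageURL) { image in
            image
                .resizable()
                .scaledToFit()
                // Hypothetical policy: keep flagged images blurred until
                // the user explicitly confirms they want to see them.
                .blur(radius: isSensitive ? 30 : 0)
        } placeholder: {
            ProgressView()
        }
        .task { await analyze() }
    }

    private func analyze() async {
        let analyzer = SCSensitivityAnalyzer()

        // analysisPolicy is .disabled unless the user (or a parent via
        // Screen Time) has turned on sensitive-content warnings and the
        // app carries the Sensitive Content Analysis entitlement.
        guard analyzer.analysisPolicy != .disabled else { return }

        do {
            // The classification runs entirely on the device.
            let analysis = try await analyzer.analyzeImage(at: imageURL)
            isSensitive = analysis.isSensitive
        } catch {
            // A real app would choose its own fallback; this sketch
            // simply leaves the image unblurred and logs the failure.
            print("Sensitivity analysis failed: \(error)")
        }
    }
}
```

In this sketch the classification never leaves the phone, which is consistent with the article's point that the feature does not weaken end-to-end encryption: the image is analyzed locally, and only the blur and warning are shown to the user.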
In 2021, Apple planned to roll out an on-device system that would scan users' photos for known child sexual abuse material and flag possible matches for reporting, but critics warned that this would undermine user privacy and that authoritarian regimes could seek to abuse the mechanism. The company abandoned the initiative a year later.