Apple has agreed to pay $95 million to settle a class-action lawsuit accusing its Siri voice assistant of recording users’ private conversations without authorization and sharing them with advertisers. Mobile device owners said Apple routinely recorded their private conversations after Siri was activated accidentally, then shared that data with third parties, including advertisers.
According to Reuters, Apple has reached a preliminary settlement in the privacy case, under which users could receive up to $20 for each Siri-enabled device they own.
Voice assistants are usually activated with hot phrases such as “Hey Siri.” The plaintiffs, however, alleged that even casual mentions of certain brands or topics subsequently triggered targeted advertising. For example, two complainants reported that after discussing Air Jordan sneakers and Olive Garden restaurants, they were shown advertisements for those products. Another plaintiff said he received targeted medical advertising after a conversation with a doctor that he believed was completely private. The lawsuit covers the period from September 17, 2014, when the “Hey Siri” feature was introduced, through December 31, 2024.
The class is estimated to include tens of millions of people, who could receive up to $20 for each Siri-enabled device, such as an iPhone or Apple Watch. Apple, for its part, denies any wrongdoing but agreed to settle to avoid further litigation. Apple representatives and the company’s lawyers did not immediately respond to requests for comment.
The plaintiffs’ lawyers also declined to comment, but they are expected to seek up to $28.5 million in fees and $1.1 million in expenses from the settlement fund. Notably, the $95 million set aside for the settlement amounts to only about nine hours of Apple’s profit.
It’s worth noting that a similar lawsuit on behalf of users of Google’s voice assistant is pending in court, and the plaintiffs in both cases are represented by the same law firms. The case against Apple, filed under case number 19-04577, could set an important precedent on issues of privacy and the use of personal data by voice assistants.