Data Privacy Week: What Apple’s Wiretap Settlement Means for the Future of Privacy

In January 2025, Apple agreed to pay $95m to settle a class action lawsuit alleging the smartphone behemoth violated wiretap and privacy laws.

The lawsuit claimed that the Siri assistant feature surreptitiously recorded conversations through iPhones and other Apple devices equipped with the virtual assistant for more than a decade, even when people didn't seek to activate Siri. This information was then allegedly shared with third parties for advertising purposes.

This is not the first time a technology company has been accused of, or found to have engaged in, eavesdropping on conversations and using that data for commercial purposes. It is not even the first time Apple has been accused of recording conversations using Siri without a person’s knowledge.

While the jaded among us may be resigned to companies selling our personal data, Apple, a company whose entire brand was built on privacy as a fundamental human right, was presumed to be above these nefarious practices.

How did a company willing to incur the wrath of then-US Attorney General Bill Barr by refusing to unlock the cell phones of suspected terrorists come to be accused of recording private conversations for ad dollars?

What Apple’s Case Means for Consumer Privacy

Apple is not admitting to any wrongdoing and denies selling Siri-sourced data for any purpose. However, whether Apple is legally liable or engaged in the alleged practice is not the crux of the issue.

The more pressing concern is what it means for consumer privacy when a company’s brand and market saturation become so indelible that failing to uphold its core tenet – caring about your privacy – is not a brand dilution event.

Whether we realize it or not, businesses give a lot of weight to how their reputation will be impacted by any one product decision and whether that reputational impact will have negative or lasting financial consequences. This requires having control of your brand’s quality and distinctiveness and how the brand is viewed.

When a brand loses that control, the result is known as brand dilution. A brand risks dilution in three main ways:

  • Stretching capacity too thin, i.e., losing control over quality
  • Introducing unrelated services or products
  • Losing control of the brand, i.e., having negative actions accurately or inaccurately attributed to you

Companies devote entire departments and budgets to preventing any of these events, and yet what we’re witnessing with this Apple settlement is a product strategy that risks at least two, if not all three, of them occurring.

But will it? Is Apple so ubiquitous and synonymous with privacy after decades of (presumed) brand consistency that it no longer needs to nurture those concepts to reap the financial and market benefits? If that is true, what does that mean for consumer privacy going forward?

Apple has exceedingly high brand loyalty, built in large part on its commitment to consumer privacy, and holds a nearly 60% share of the US smartphone market. It has so far escaped the scrutiny faced by its rivals, perhaps partly because it purveys a strong message of privacy, famously taking a dig at its competitors with a billboard that read: “What happens on your iPhone, stays on your iPhone.”

Realistically, though, even for those who choose Apple products for their privacy features, this settlement is unlikely to make a dent in the form of brand abandonment. Switching would require millions of people to pay for a new phone, often tied to an existing phone plan package, migrate all of their data and replace (expensive) devices that may not yet have a comparable competitor.

"We should not normalize any company getting so big that their promises to prioritize privacy no longer matter"

Moreover, because change is difficult and costly, and the association is so ingrained, this settlement may not be linked to Apple as a brand for longer than a news cycle (if its users even hear about it). If there are no meaningful or long-term financial or branding repercussions, will each product or feature be evaluated as rigorously for harm reduction over potential profit or business need?

How to Ensure Companies Prioritize Privacy Protections

Privacy harm reduction is essential at a time when location surveillance or health trackers can lead to real consequences, such as criminal prosecution or discrimination based on profiles created through generative AI models.

This is particularly concerning in light of Apple’s intention to move forward with its plan to integrate OpenAI’s technology into Siri on iPhones and other Apple devices. For most users, these devices contain private, sensitive, personal and potentially embarrassing or harmful information.

There is currently no overarching regulation of the use of AI to collect and process personal and sensitive data, and the US has no comprehensive federal privacy law. Combine that with the all-encompassing ability of Siri-enabled devices to interface with every aspect of a consumer’s life (e.g., providing access to personal medical or identity-related information), and the storage of recordings, which is unnecessary to the functionality of Siri-enabled devices, creates a risk of hacking or other unauthorized exploitation of consumer data.

While Apple stated that only 1% of recordings are used to improve Siri’s responses to user requests and to measure when the device is activated accidentally, this does not alleviate the concern raised: with an estimated 500 million Siri-enabled devices in use, even 1% represents a significant number of impacted consumers (1% of 500 million is five million devices).

Moreover, even in the rare cases when damages are awarded, each class member collects a few measly dollars. This instance is no different: court documents indicate that, if approved, each consumer could receive up to $20 per Siri-equipped device covered by the settlement, a payment that could be reduced or increased depending on claim volume.

I can hear you now: isn’t $95m enough to ensure Apple continues to invest in privacy? Maybe. However, given the number of users and the damages provisions in the statutes under which the lawsuit was brought, not to mention the legal costs of litigating it, Apple could have been found liable for billions had the case gone to a jury trial.

In light of Apple’s profits, $95m seems like a discount, especially if there are no further (potential) reputational harms from a public trial. The settlement also does nothing to address the privacy harms to non-Apple customers captured in those recordings.

Where do we go from here? As privacy professionals, we should continue to advocate for and implement privacy by design, particularly in the absence of regulations specifically targeting AI or the personal data of individuals not covered by existing legislation. As consumers, we have to hold companies accountable by being vocal about the changes we want to see and making meaningful purchasing decisions whenever we can.

This settlement should not be buried in an unyielding news cycle; it should be treated as the reputational harm it is, precisely because it runs so contrary to Apple’s brand. We should not normalize any company getting so big that their promises to prioritize privacy no longer matter.

The opinions expressed in this article are those of the author and not her employer or any affiliated firms, investors, clients or others.

Image credit: Koshiro K / Shutterstock.com
