Meta is making headlines again for its opaque data practices. This time the social media giant is under fire for unlawfully forcing users to accept personalized advertising under its sweeping terms and conditions. In effect, Meta has been telling its users: if you want control over how we use and handle your data, you can’t use our services. Meta has tried its best to explain its position in a blog post.
The regulators aren’t happy about that, and rightly so. Ireland’s Data Protection Commission (DPC) has ruled that Meta’s lack of transparency about how it collects and handles its users’ data breaches EU privacy rules, and the company has been fined almost €400m ($413m) as a result. In response, Meta is changing how it processes information for behavioral advertising. Whether these changes represent a meaningful move toward fully transparent data practices remains to be seen.
This is not the first time a large corporation has been caught using unpalatable data practices, and it surely won’t be the last. Regulators are coming after big businesses that have played fast and loose with their customers’ data for too long. BetterHelp is in trouble for sharing customer data with social media companies despite promising that this data would stay private. In France, TikTok has been fined €5m ($5.4m) for its approach to cookie handling, which falls foul of European data protection laws.
More regulations and laws dictating the rules around corporate data practices are coming into effect: California’s and Virginia’s own versions of GDPR went live on January 1, 2023. Colorado, Connecticut and Utah will begin enforcing their statutes this year, and Iowa will join them in January 2025.
An Inflection Point for Marketing
However, we shouldn’t lose sight of what’s important here. This is not just a story of a social media giant falling foul of new rules around dodgy data practices. Instead, it should inspire a discussion about how brands approach advertising, marketing and data collection. These stories should make brands realize that the best, and perhaps only, way to build meaningful, equitable relationships with consumers is to respect how their data is collected, used, stored, shared and even retired. Could this be an inflection point for Meta? Will the company make a renewed commitment to robust data protection, setting an example for other social media companies and global brands when it comes to data, privacy and consumer trust?
Perhaps, but that may not be popular with the company’s executives and advertising chiefs. These stakeholders will be reluctant to part with one of the business’s most prized assets: the wealth of data held on billions of users across the globe that can be dangled in front of hungry brands and advertisers. Without it, a key pillar of Meta’s business model could be badly affected.
The best-case scenario following the DPC ruling is for Meta to take the route of full transparency. The company should go back to basics and build a set of data controls with trust at the core from the outset. In practice, that would mean a complete, clearly articulated set of data collection purposes defined for every channel and interaction within the Meta ecosystem. This would be a dramatic turnaround for Meta, but it is exactly where the market is headed. In fact, even Meta’s leadership team is making the right noises about brands’ collective responsibility to consumers regarding data.
Unfortunately, every corporation has a different idea of what transparency actually means. If, say, Meta concludes that transparency would require unveiling more of its historic bad data practices, the business may look for workarounds instead.
Consumers are put off when they discover that a brand has exploited or illegally used their data, and some may flatly refuse to let Meta keep using theirs, even if the company cleans up its practices.
These concerns aside, Meta really does have an opportunity here to step out of the shadows and build a data-driven relationship with its users that could prove to be a long-term boon for the company (as well as for consumers everywhere).
With the market slowly changing in favor of a privacy-first approach to data practices, advertisers are coming around to the idea that an opt-out advertising policy can ultimately improve the quality of the audiences they target.
For too long, far too much weight has been given to the quantity of data collected rather than its quality. If you’re an advertiser who has been granted access to a highly specific piece of information shared by your core target market, you can begin to build well-targeted campaigns. Higher-quality data can also be instrumental in building those prized life-long relationships with consumers.
This ruling may also level the playing field for brands, especially those with large followings and consumer bases. They will now all have to compete in the ads-driven market with rich, first-party data collected directly from customers who have given explicit permission for their personal insights to be used. A more equitable market for brands will necessarily mean greater privacy and protection for consumers.
As this latest ruling demonstrates, it’s no longer good enough to bury sweeping data privacy statements in your terms and conditions, stripping customers of control over their data and information.
Trust has to be the fabric of any enterprise data strategy, and the business leaders who take that on board sooner rather than later will realize the gains. The corporations that continue to disregard the wishes of consumers and regulators alike when it comes to data practices will be left behind, wondering why their ad revenue and loyal customer base have abandoned them for competitors.