4 failings of today’s data privacy legislation
Increasingly stringent – but often opaque – privacy laws inhibit both the optimization of the user experience and effective marketing, writes Verve Group exec Aviran Edery.
Although the marketing and adtech industry is built around consumer data and the ongoing quest to deliver targeted, relevant messaging, few leaders in the industry will deny that enhanced protections for consumers’ privacy and personal information are sorely needed. And yet, very few privacy-related bills and initiatives have emerged that the industry can fully stand behind.
For the industry to advance to a state of true maturity, improved and consistent consumer data protections are essential. However, the wrong legislation could prove disastrous – for the industry, as well as for consumers.
Unfortunately, given the complexity of the current data-driven technology landscape, today’s regulatory bodies are ill-prepared to deliver the nuance needed to reliably enhance consumer protections while preserving the value exchange that consumers want – and increasingly demand – within the digital economy.
Many privacy-related initiatives are falling short. Achieving desirable and sustainable progress requires pivoting in a few key ways.
Understanding what is “reasonably necessary” under the law
Several of our legislators understand that data collection, to some extent, is required to keep digital environments functioning as desired. At the same time, they want to ensure companies aren’t extending their data collection protocols beyond what’s “necessary.” Therein lies a challenge.
Consider the proposed American Data Privacy and Protection Act (ADPPA), for example, which many view as an American counterpart to Europe’s far-reaching General Data Protection Regulation (GDPR). The legislation specifies that data collection should be limited to what’s “reasonably necessary,” but does little to explicitly define what that phrase means. Several state-level laws include similar language.
This becomes a serious problem when it comes to putting a sound idea into practice. How “necessary” data is defined will ultimately determine whether the measure has its intended effect, yet policymakers appear to be leaving critical determinations like these open to interpretation.
Softness where decisiveness is needed
Beyond language that requires greater precision, we’re also seeing broad latitude in legislative language where none should be tolerated.
One such area has to do with the privacy of minors. ADPPA, for example, specifies that policies should “consider” mitigating privacy risks related to minors – notably soft language on what should be one of the primary goals of strong privacy legislation.
If legislation doesn’t adequately provide for the protection of minors’ privacy, it will be incumbent upon marketers and publishers themselves to ensure this need is met. We already see acknowledgment of this need among brands like Disney+ and Netflix, which have determined that they won’t show ads to, or collect data from, children on their ad-supported tiers. Such policies should become the norm, even if not mandated under new privacy regulations.
Fragmentation limits effectiveness
The continued – and, in fact, increasing – differences among state, national and international privacy laws represent another tremendous challenge. Already, companies are investing significant resources trying to keep pace with the vastly varied faces of “compliance” from one place to another, and the path forward only looks more onerous.
This continued fragmentation in expectations simply isn’t sustainable, nor is it in the best interests of consumers – who should have a reasonable expectation of certain baseline protections.
Lack of nuanced understanding
Above all else, many regulations suffer from the same shortcoming: a lack of nuanced understanding. Most privacy legislation is well-intentioned, but very few initiatives are likely to accomplish what they set out to do because their authors simply don’t understand the systems that underpin the technology and services they hope to regulate.
Consider the EU’s Digital Markets Act, which may force Apple to allow sideloading on its devices (that is, letting users install software from sources outside the device’s approved app store or software distribution channel). On the surface, the intent is a good one: to encourage a more open ecosystem and healthy competition. But it also opens the door to nefarious activity, including cloned apps and other unintended side effects that Apple works hard to prevent through its current policies and processes.
This is a much bigger issue than people realize, given that sideloaded apps can be bundled with viruses and spyware, and under the new legislation, no one would be vetting the code any longer. The malicious apps in question could range from fake apps used to phish login credentials – such as an app that mimics a legitimate bank’s app but isn’t one – to a genuine app that a third party has decompiled and recompiled with bundled spyware. Ironically, this only serves to endanger consumers. Allowing app stores to screen the apps loaded onto devices ensures that someone vouches for each app’s authenticity and security.
Ultimately, today’s legislation needs to be less prescriptive about the technicalities of how outcomes are achieved and more focused on ensuring the outcomes themselves are prioritized. After all, in the end, legislators and reputable industry players all share the same goal: to create a digital ecosystem that preserves consumer trust and ensures optimal user experiences. Only with such a foundation in place can we all hope to move forward together.
Aviran Edery is senior vice-president and general manager of marketplace at Verve Group.