
With data and privacy in a post-Roe v Wade world, intent and impact are different beasts

By Arielle Garcia, Chief privacy officer

June 30, 2022 | 9 min read

In light of the US Supreme Court’s historic decision to overturn Roe v Wade – the landmark 1973 decision that legalized abortion across the country – the ad industry is inheriting a new set of data privacy responsibilities, writes UM Worldwide chief privacy officer Arielle Garcia.

Marketers must take the bull by the horns when it comes to data and privacy / Adobe Stock

At the International Association of Privacy Professionals’ Global Privacy Summit in April, a few months after the introduction of a US bill aiming to ban “surveillance advertising,” US Federal Trade Commission chair Lina Khan asserted that notice and choice for data privacy is “outdated and insufficient” to address current market realities. She suggested that it’s time to consider “substantive limits rather than just procedural protections” that circumvent “fundamental questions about whether certain types of data collection and processing should be permitted in the first place.”

There is no more poignant example of the spirit of the latter assertion than the aftermath of the US Supreme Court’s decision to overturn Roe v Wade. As Cannes Lions attendees celebrated a new partnership on mitigating bias to drive equity and inclusion in advertising technology, the US Supreme Court revoked federal protections for a previously held constitutional right – a woman’s right to choose – a decision that has direct consequences not only for women but also for the data-driven advertising industry, privacy, data ethics and the future of privacy regulation at large.

Unpacking the tensions between intent and impact

We cannot ignore the reality of the role that the advertising industry plays in the collection, creation and proliferation of the type of data that enables tracking and profiling beyond the context of marketing. We cannot frame the issue as one of “intent” and “reasonable use” at the expense of considering impact. Sensitive user data, in the hands of those who intend to use it in ways that contradict the reasonable expectations of users or the corporate values of the data collector, risks harming people in a way that undermines civil liberties and could contravene human rights.

Apple, Google and others spoke about privacy as a core component of their environmental, social and governance commitments at the RSA Conference in San Francisco earlier this month. Delivering on these commitments requires an honest look at the real-world impact of sensitive data collection and use. It is not enough to contemplate responsible collection and use of data as intended by our industry. It is incumbent upon all of us to consider the collection, use and sale of data in the context of the impact it can have on individuals and society, on rights, on freedoms and on lives – especially when it is used in ways other than intended.

Now that there is real risk that precise location data and search history will be purchased or obtained via legal request to investigate and prosecute those seeking or aiding in what are now, in some states, criminalized abortions, we must face a critical question: is some data inherently too sensitive, too high-risk to create or collect? And where it is collected to provide a service – as is the case with menstrual cycle tracking apps – how can we mitigate the risk of its use, sale or subpoena outside of its intended purpose?

While it is fair to point to the need for federal privacy law to codify protections, it does not change today’s reality. Nor does it address the reality that incidentally engaging in or contributing to the market for such data sits at stark odds with stated corporate values – with efforts to protect access to care and to promote equity, inclusion and safety. Marketers and brands need to ask themselves: are they facilitating and endorsing a high-risk data collection practice that stands in contradiction to their public stances on women’s rights and reproductive care?

Collectively, the industry must remain vigilant and be proactive about understanding the potential for data misuse. These risks, of course, are not new (one need only look to the Cambridge Analytica scandal of 2018 for proof).

Upon the leak of the Supreme Court draft opinion, UM proactively surveyed media, data and technology partners to understand their organizations’ plans as they relate to sensitive health information, known and inferred, collected and shared. This included sensitive point-of-interest data as well as sensitive website visits and search history data that could be used in investigation or enforcement of abortion bans, or by anti-abortion activist groups. We took this step not only because of the human rights issues at play, but because there is a fundamental brand safety risk too.

From our research, the following lessons for marketers and brands capture key areas of opportunity for resolving the disconnect between intention and impact.

1. Establish new data standards

Working with health publishers and app developers, upstream and downstream platform partners – including demand-side and supply-side platforms – and search and social platforms, define sensitive websites, keywords and apps, and the standards that should govern data collection from these properties. Apply an approach similar to the “sensitive points of interest” defined by the Network Advertising Initiative (NAI) and the organization’s guidance on health audience segments. For example, establish a standard to prevent or limit data shared through pixels and software development kits (SDKs) from pre-defined ‘sensitive’ properties.
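
To make the idea of a property-level standard more concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not an NAI specification or any partner’s actual implementation: the category and keyword lists, the Property fields and the payload shape are all assumptions chosen for the example. The point is simply that a tag manager or SDK wrapper can check a property against an agreed definition of “sensitive” before any data leaves the page.

```python
# Hypothetical sketch of a sensitive-property standard applied at tag-firing time.
# Category and keyword lists are illustrative assumptions, not an industry list.
from dataclasses import dataclass, field

SENSITIVE_CATEGORIES = {"reproductive_health", "abortion_services", "fertility_tracking"}
SENSITIVE_KEYWORDS = {"abortion", "abortion clinic", "plan b", "mifepristone"}


@dataclass
class Property:
    """A website or app placement, as seen when a pixel or SDK is about to fire."""
    url: str
    declared_categories: set = field(default_factory=set)
    page_keywords: set = field(default_factory=set)


def is_sensitive(prop: Property) -> bool:
    """Flag a property whose declared categories or page keywords match the standard."""
    if prop.declared_categories & SENSITIVE_CATEGORIES:
        return True
    return bool({k.lower() for k in prop.page_keywords} & SENSITIVE_KEYWORDS)


def build_pixel_payload(prop: Property, event: dict):
    """Return the data a pixel/SDK would share, or None to suppress sharing entirely."""
    if is_sensitive(prop):
        # Standard: no identifiers, URLs or location data leave a sensitive property.
        return None
    return {"url": prop.url, "event": event.get("type"), "user_id": event.get("user_id")}


# Example: a visit to a reproductive-health page produces no outbound payload.
clinic_page = Property(
    url="https://example-health-site.org/find-care",
    declared_categories={"reproductive_health"},
)
assert build_pixel_payload(clinic_page, {"type": "page_view", "user_id": "abc"}) is None
```

A real implementation would sit inside the tag manager, consent platform or SDK itself, but the decision point is the same: classify the property first, then decide what, if anything, may be shared.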

2. Minimize collection of sensitive information

Decrease the collection and retention of sensitive browsing, search and location data to what is necessary for the expected purpose. Use data minimization techniques such as de-identification and encryption, alongside other safeguards. Adopt policies governing responses to law enforcement requests and the disclosure of such requests, and prohibit the sale of such data.
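
As a rough illustration of what minimization can look like in practice, the Python sketch below coarsens location precision, pseudonymizes the user identifier and enforces a retention window before a record is stored. The field names, the two-decimal precision, the 30-day window and the salted hashing are all assumptions made for the example; they are not a recommended or legally sufficient configuration.

```python
# Hypothetical data-minimization pass over a location/search log record.
# Field names, precision and retention window are illustrative assumptions.
import hashlib
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=30)  # assumed retention limit for illustration


def coarsen_location(lat: float, lon: float, decimals: int = 2):
    """Round coordinates to roughly 1 km precision so specific venues can't be inferred."""
    return round(lat, decimals), round(lon, decimals)


def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a raw identifier with a salted hash (de-identification, not anonymization)."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()


def minimize_record(record: dict, salt: str):
    """Strip or coarsen sensitive fields before storage; drop records past retention.

    Assumes record["timestamp"] is an ISO-8601 string with a UTC offset.
    """
    ts = datetime.fromisoformat(record["timestamp"])
    if datetime.now(timezone.utc) - ts > RETENTION_WINDOW:
        return None  # outside the retention window: do not store
    lat, lon = coarsen_location(record["lat"], record["lon"])
    return {
        "user_ref": pseudonymize(record["user_id"], salt),
        "lat": lat,
        "lon": lon,
        "timestamp": ts.isoformat(),
        # search query text and precise place names are deliberately not retained
    }
```

Encryption of data at rest and in transit would sit alongside this, as would the law-enforcement response and disclosure policies described above.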

3. Appraise the practices of key business partners

Brands and agencies should ask their partners how they are addressing these issues and look beyond surface considerations such as contracts and tick-the-box compliance to consider the true impact of practices as they relate to sensitive data. In doing so, marketers can learn a lot. As one partner put it, for example, “In the bid stream, we receive geographic location data and IP addresses – we ... pass this data to third-party DSPs indiscriminately for their targeting.”

Select partners that provide substantive responses like the above – rather than pointing to privacy policies and terms that volley responsibility to upstream or downstream partners – as this demonstrates accountability and a willingness to improve. Prioritize partners who have implemented or are actively developing policies and controls beyond contractual measures to prevent collection or transmission of sensitive data – even where incidental, such as in the bid stream example above.

An industry poised to adapt

It’s somewhat reassuring that, in the days since the Supreme Court’s draft opinion was leaked, the industry has opened its eyes to the data risks in a post-Roe v Wade world. The NAI last week announced a new voluntary code for precise location information that, while long overdue, focuses on defining and minimizing the collection and monetization of data related to a defined set of “sensitive points of interest.” Commitments made by the code’s three inaugural partners (Cuebiq, Foursquare and Precisely PlaceIQ) should serve as an example for the rest of the adtech ecosystem.

Throughout our outreach at UM, we have seen similarly encouraging engagement, finding that there are certainly some proactive partners in the ecosystem, and many more who are receptive and committed to improving their practices to protect their audiences and brands.

Several partners thanked us for bringing gaps to their attention and providing an impetus for important internal discussions. Others advised that they are developing new policies following the discussions that stemmed from the exercise. For example, one publisher partner has advised that it will expand its policy on Sensitive Personal Information to address “sensitive interests” on the basis of our questions.

There remains important work to be done as an industry to protect users and retain their trust that the data they share with us will not be used to cause them harm. As the stakes grow higher, we cannot and should not be caught flat-footed about the inappropriate use of data. Above all, the greatest opportunity to mitigate harm, respect privacy and protect people in a post-Roe v Wade world lies in collective willingness to engage in candid dialogue – to foster awareness, to share challenges, to discuss gaps and to collaborate on standards and solutions that serve the best interests of the people who trust us with their data.

Arielle Garcia is chief privacy officer at UM Worldwide.
