From the rise of AI to new state and federal legislation, big changes are coming for the privacy industry — but amid all the think pieces and regulatory updates, it’s often hard to find actionable advice and practical guidance.
Here are three important steps practitioners should take to face upcoming privacy changes more effectively:
Understand data privacy impact assessments (DPIAs)
DPIAs are daunting, but they’re really a way of telling a story about the organization’s data use. DPIAs are more than just regulatory requirements — they are an opportunity to develop a comprehensive overview of the organization’s risk exposure, spanning data practices, privacy governance, cross-team and cross-disciplinary engagement, and diligence processes with partners and vendors. To manage DPIA obligations effectively, here’s what privacy leaders need to know:
- Where: As of this year, DPIAs are required in Virginia, Colorado, and Connecticut — plus California, though the state hasn’t yet finalized its regulations. With privacy laws in Montana, Tennessee, Texas, and Indiana also requiring DPIAs, it’s a fair bet that most other states will follow the trend, imposing DPIA obligations on most U.S. companies.
- When: Common DPIA triggers include targeted advertising and the sale of personal data, and also profiling that involves sensitive data or carries a risk of injury or disparate treatment. Rules on DPIA submission and review vary from state to state: California requires proactive submission of DPIAs on a “regular basis,” while other states take an on-demand approach with attorneys general empowered to demand DPIAs from businesses.
- What: Requirements vary by jurisdiction, but generally include a good-faith effort to articulate the risks and benefits of data processing. Colorado requires a “genuine, thoughtful” analysis. Organizations should clearly state their processing activity, list specific risks and mitigation efforts, including technical measures and training, and communicate the benefits to individuals. It’s also important to document the data types involved and the number and types of individuals affected, plus the technologies used and the underlying purpose of the processing activity.
- How: As DPIAs grow more common, there’s a risk of fatigue setting in unless the company puts efficient systems in place to reduce the burden. Fortunately, a single DPIA can cover a number of materially similar data-processing activities, as long as it’s updated if processing activities change or the risk profile evolves over time. Bear in mind that states are still figuring out how to account for sensitivities such as trade secrets and attorney-client privilege, so organizations should stay cautious as they build out DPIA processes.
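To make the “one DPIA covers materially similar activities” approach workable, it helps to track each assessment in a structured register that flags when a review is due. The sketch below is purely illustrative — the field names, fingerprinting approach, and one-year review cadence are assumptions, not requirements from any statute:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical DPIA register entry: one assessment can cover several
# materially similar processing activities, but should be revisited when
# the activities or their risk profile change, or when it grows stale.
@dataclass
class DpiaRecord:
    title: str
    activities: list[str]        # materially similar processing activities
    data_types: list[str]        # e.g. "email", "precise geolocation"
    risks: list[str]             # identified risks
    mitigations: list[str]       # technical measures, training, etc.
    benefits: str                # benefits to individuals and the business
    last_reviewed: date
    risk_profile_hash: str       # fingerprint of the inputs above

    def needs_update(self, current_hash: str, max_age_days: int = 365) -> bool:
        """Flag the DPIA for review if the risk profile changed or it's stale."""
        stale = (date.today() - self.last_reviewed).days > max_age_days
        return stale or current_hash != self.risk_profile_hash
```

A register like this also doubles as the documentation trail regulators look for: the data types, affected individuals, and mitigation measures live in one reviewable place.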
Get the real dirt on “clean rooms”
Clean rooms are an important part of the new privacy landscape, but they’re also poorly understood. That’s a dangerous combination because it fuels marketing spin, potentially leading organizations to make costly mistakes or poor investment choices. Here are some points to consider:
- What clean rooms are: A clean room is a framework for collaborative data analysis in a controlled environment, with restrictions on how users can view and export data, and a range of tools to support privacy and data protection. Using a clean room, it’s possible to collaborate with partners while still ensuring the privacy and security of first-party data, bolstering consumer trust and helping to secure consent for continuing data use.
- What clean rooms aren’t: Clean rooms are powerful, but they aren’t magical places where ordinary privacy laws don’t apply. The legal requirements differ based on the use-cases and processing involved, but they never go away entirely. It’s better to think of a clean room as a way of getting work done while still complying with relevant laws and regulations — not a way of sidestepping the statute book altogether.
- What’s still TBD: Third-party cookie deprecation and the rise of new privacy laws underscore the need for products that allow continued collaboration around permissioned first-party data. But while there’s a clear need, questions remain: we’re still waiting for an industry consensus to emerge around the way clean rooms should operate. Data-processing standards, canonical use-cases, and regulatory perspectives are all still evolving.
- What’s happening under the hood: There are different kinds of clean rooms, from pure-play clean room providers to data warehouses and walled gardens run by big tech and media companies. It’s important to understand the type of clean room in question, and the kinds of infrastructure and privacy-enhancing technologies it uses to protect company data. If a security team doesn’t fully understand what goes into a clean room, how that data is acted upon, and what comes out, then it doesn’t know enough about how the clean room actually works.
- What regulators watch for: The specific regulations impacting clean-room use will vary by jurisdiction and use-case. When reviewing privacy and compliance programs, it’s important to consider which direction the data flows, what types of data get used, and what limitations are being imposed on the way data can be acted on and extracted from the clean room.
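One concrete example of the kind of export restriction described above: many clean-room setups release only aggregate results, and only when the underlying cohort is large enough that no individual can be singled out. The sketch below is a minimal illustration of that idea — the function name and the threshold of 50 are assumptions for demonstration, not a standard:

```python
# Illustrative clean-room-style control: collaborators see only aggregate
# counts, and only for cohorts above a minimum size, so no individual
# record can be singled out. The threshold here is arbitrary.
MIN_COHORT_SIZE = 50

def aggregate_overlap(partner_ids: set[str], first_party_ids: set[str]):
    """Return the size of the audience overlap, or None if the cohort is
    too small to release without re-identification risk."""
    overlap = len(partner_ids & first_party_ids)
    return overlap if overlap >= MIN_COHORT_SIZE else None
```

Controls like minimum cohort sizes, query restrictions, and noise injection are exactly the “limitations on the way data can be acted on and extracted” that compliance reviews should examine.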
Build guardrails for generative AI
Artificial intelligence has become a top concern for many privacy leaders. When it comes to AI, running an effective privacy strategy means staying focused on real-world consequences. There aren’t yet many specific regulations covering data privacy and AI, so regulators will use consumer protection laws banning unfair and deceptive practices to police AI data privacy. That means organizations will be held accountable not just for what’s going on inside their algorithms, but also for the inputs flowing into them, and the outputs they generate. To stay safe, there are a few guardrails organizations should put in place:
- Track inputs and outputs: If personal information is fed into the company’s AI algorithms, the privacy team needs to know about it and ensure the inputs are properly permissioned for the specific use-case in question. It’s also important to monitor the sensitivity of the data that’s used, and the degree to which such data persists in algorithms or their outputs. As a practical matter, it’s often easier for privacy teams to pre-approve some specific use-cases while developing policies and monitoring systems to manage other applications of AI technologies.
- Develop a strategy for internal use-cases: While it’s generally less challenging to use AI internally, the organization needs to make sure that “internal” really means internal. Deploy clear frameworks for sourcing training data and ensuring that no personal or confidential information gets used in inappropriate ways, even for internal purposes. Remember, it’s very easy for internal tools and data to seep into external applications, creating serious privacy vulnerabilities.
- Develop a strategy for external use-cases: The stakes are undoubtedly higher when working with commercial or public-facing AI technologies, so conduct risk assessments and put mitigation strategies in place. With third-party solutions, make sure the team does enough to vet and monitor the incoming data streams. Remember, the goal isn’t just compliance with regulations — it’s creating systems that customers can trust, both to protect the organization’s brand reputation and to ensure a continuing stream of permissioned data for the company’s algorithms.
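The pre-approval pattern described above — allowing specific vetted use-cases while gating everything else — can be enforced programmatically at the point where data enters an AI pipeline. This sketch is hypothetical: the field names, use-case labels, and consent model are illustrative assumptions, not a real API:

```python
# Hypothetical guardrail: before personal data reaches an AI pipeline,
# reject use-cases the privacy team hasn't pre-approved, and filter out
# records that lack consent for the specific use-case in question.
APPROVED_USE_CASES = {"internal_analytics", "support_summarization"}

def permitted_records(records: list[dict], use_case: str) -> list[dict]:
    """Return only the records consented for this pre-approved use-case."""
    if use_case not in APPROVED_USE_CASES:
        raise ValueError(f"Use-case {use_case!r} has not been pre-approved")
    return [r for r in records if use_case in r.get("consented_uses", ())]
```

A gate like this gives the privacy team a single choke point to monitor which data flows into which models, rather than auditing each AI project after the fact.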
Find a path to privacy maturity
With new DPIA requirements, the rise of clean rooms, and the emergence of AI, privacy professionals will have their work cut out for them in the coming months. To succeed, they need to build mature privacy programs capable of adapting to emerging challenges without stifling innovation, jeopardizing consumer data dignity, or overly complicating day-to-day business operations.
Privacy professionals need to take a pragmatic approach to meeting their organization’s evolving needs. Privacy leaders can’t focus solely on data privacy, risk mitigation, or even securing customer trust; we need to do all three, while also leveraging new technologies and building cross-functional operational efficiencies to deliver the effective privacy programs our organizations need.
Jonathan Joseph, head of solutions, Ketch