When we hosted a data privacy webinar with BDO earlier this year, the conversation focused on the patchwork nature of US privacy law and the importance of aligning promises with actual practice.
Now, as we near the end of 2025, that patchwork has only become more intricate.
From California’s next wave of CCPA updates to Maryland’s stricter state law—and with an Executive Order reshaping how US data moves globally—organizations are under more pressure than ever to modernize their compliance strategies and strengthen their data governance programs.
In part two of the webinar series, Navigating Data Privacy Trends in 2025, Relativity’s VP and general counsel, Beth Kallet-Neuman, and Taryn Crane, privacy & data protection practice leader at BDO, unpacked the latest developments and shared actionable insights for compliance teams.
The Executive Order: A New Lens on Familiar Principles
One of the most significant developments of the year has been Executive Order 14117, implemented through what’s better known as the DOJ Bulk Data Transfer Rule.
“The concepts aren’t necessarily new,” Taryn says, “but the development is unique because now we’re looking at it from the perspective of US data exiting and going into countries of concern.”
Unlike most privacy laws that regulate inbound data, this Executive Order shifts the focus outward—requiring organizations to assess how and where US data flows internationally.
To prepare, Taryn offered a few tips to organizations:
- Update data flow maps and inventories: Know where sensitive data lives and where it’s going.
- Revisit third-party contracts and conduct risk assessments: Understand your exposure from a vendor perspective and ensure your service providers are held accountable to the same requirements.
- Address security deficiencies compared to CISA requirements: Some of Taryn’s clients have even performed mock CISA audits to help prepare.
“The risks are getting higher to not have your data mapping in order,” Beth adds. “If you’re going to try for low-hanging fruit, figure out where you’re potentially sending data that might be in violation of this EO. If you haven’t done the work yet, now is a good time to start.”
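For teams that haven’t started, even a lightweight, automatable check against the data flow map can surface the riskiest transfers first. The sketch below is illustrative only, assuming a simple inventory of flows; the country list, data categories, and field names are placeholders for whatever your own inventory actually captures, and any real check should be validated against the DOJ rule with counsel.

```python
# Illustrative only: a rough first pass at flagging outbound data flows that may
# need review under the bulk data transfer rule. The country list, categories,
# and field names are placeholders, not legal definitions.
from dataclasses import dataclass

# Placeholder set; confirm the actual "countries of concern" and thresholds
# against the DOJ rule with counsel before relying on any automated check.
COUNTRIES_OF_CONCERN = {"country_a", "country_b"}
SENSITIVE_CATEGORIES = {"biometric", "precise_geolocation", "health", "genomic"}

@dataclass
class DataFlow:
    system: str               # source system or application
    vendor: str               # third party receiving the data, if any
    category: str             # data category from your inventory
    destination_country: str  # where the data is stored or accessible

def flag_for_review(flows: list[DataFlow]) -> list[DataFlow]:
    """Return flows that combine a sensitive category with a destination of concern."""
    return [
        f for f in flows
        if f.category in SENSITIVE_CATEGORIES
        and f.destination_country.lower() in COUNTRIES_OF_CONCERN
    ]

# Example: feed this from your data flow map or vendor inventory export.
flows = [
    DataFlow("crm", "vendor_x", "precise_geolocation", "country_a"),
    DataFlow("hr_portal", "vendor_y", "contact_info", "country_c"),
]
for f in flag_for_review(flows):
    print(f"Review: {f.system} -> {f.vendor} ({f.category}, {f.destination_country})")
```

Even a rough list like this gives the privacy team a concrete starting point for the vendor conversations Taryn describes.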
The Patchwork Gets Patchier: Meeting State Privacy Laws
With 18 states now having their own privacy laws—and more amending existing ones—the US landscape remains as fragmented as ever. Beth highlighted Maryland as one of the biggest departures from the norm, noting its stricter thresholds and broader definitions of biometrics and sensitive personal data, such as the inclusion of geo-targeted data. She gave several examples of the state’s heightened requirements, including:
- Processing limitations: Controllers may only process sensitive data where it’s “strictly necessary” to perform the service requested by the consumer.
- Enhanced service restrictions: Controllers face tighter rules when providing enhanced services to existing customers if those enhancements rely on sensitive data that is not “strictly necessary.”
- Sales prohibition: Controllers cannot sell sensitive personal data—even with consumers’ consent—unless it’s necessary to provide the product/service to the consumer and the consumer directs the controller to do so.
- Data minimization: Controllers may collect personal data only if it is reasonably “necessary and proportionate” to provide or maintain the specific product or service requested by the consumer.
The point? Maryland’s approach has effectively raised the national bar.
“Rooting down in those principles helps solve most of the issue. Maryland is an example of why you need to roll up your sleeves, figure out what the changes are, and update your program accordingly,” Taryn says.
“The gold standard will adjust,” Beth adds. “It’s a matter of how much and how ready you are. Readiness is key.”
If you needed more convincing, remember that the stakes are very real. Enforcement actions in Texas and California demonstrate that regulators are fully engaged. “Don’t mess with Texas applies to consumer privacy now too,” Beth says. The Texas Attorney General’s office has filed its first claim under the state’s privacy law and recently secured the largest state settlement to date with Google.
“We’re seeing a lot of arrows in the quiver,” Beth says. “The AGs are enforcing consumer rights when it comes to what they consider to be the excessive collection of data and excessive collection of sensitive data like geolocation data and biometrics data. We’re going to keep seeing that.”
California Continues to Lead the Way
On the opposite coast, California continues to raise expectations with new updates to the California Consumer Privacy Act (CCPA)—many of which align with GDPR-style requirements. Beth shared three key changes:
- Cybersecurity audit requirements: Businesses that hit revenue and processing thresholds are required to undergo annual independent security audits if their processing activities pose “significant risks” to consumers’ security.
- Risk assessment requirements: Similarly, businesses that hit revenue and processing thresholds are required to conduct detailed risk assessments before activities that present “significant risks” to consumers’ privacy.
- Regulation of automated decision-making technology (ADMT): If ADMT replaces or substantially augments the human role in “significant” decisions, consumers must be given notice, the opportunity to opt out, and the right to access information about the ADMT’s use and logic.
Taryn offered a few practical takeaways:
- Review and update privacy notices to ensure transparency.
- Conduct a readiness assessment for cybersecurity audits before they become mandatory.
- Inventory your automated decision-making tools so you can prepare for future notice and consent requirements.
“If you’re already doing these things across the board,” she said, “you’re in a much better position regardless of jurisdiction.”
AI as a Business Driver: Making Compliance Cool
The conversation then turned to AI, which is reshaping not just technology but organizational roles in privacy and compliance. AI has become a catalyst for better data hygiene, as organizations revisit data classification, retention, and sharing practices. For compliance professionals, this presents a moment to lead.
“Be part of the AI strategy,” Taryn urged. “Give yourself a seat at the table. It’s an opportunity to push data governance forward and make compliance cool.”
To ensure the business is adopting AI safely, Beth and Taryn offered a few tips:
- Use synthetic data sets for testing AI platforms.
- Have a clear data classification policy that defines what can and cannot be used in AI. “It can’t be guesswork,” Beth said. “We need to make it easy for employees to comply.”
- Conduct shadow AI assessments. These don’t just assess risk—they can influence AI strategy. “If you see people going to the same sites and using the same tools, there may be an opportunity to enable and give your business the tools they need, but doing so within the bounds of your organization,” Taryn said.
Another growing concern is third-party vendors adding AI features post-approval—sometimes without notifying customers. To address this, Beth offered two practical approaches:
- Classify and bucket data. Dig into what data you’re providing and apply a low-medium-high risk model, assigning risk assessments accordingly; a simple sketch of this bucketing follows the list.
- Assume data sensitivity and work backward. Start with the assumption that the data is sensitive. Then align with frameworks like NIST or the EU AI Act to ensure vendors meet requirements for high-risk data upfront. You won’t need to go back and forth with the business multiple times—your vendors will already meet the bar.
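To make those two approaches concrete, here is one way they might look in practice. This is a minimal sketch; the buckets, data categories, and review steps are placeholders rather than a prescribed framework or anything drawn from NIST or the EU AI Act.

```python
# Illustrative only: mapping data categories into low/medium/high risk buckets and
# the depth of vendor review each bucket triggers. The categories and review steps
# are placeholders to show the shape of the approach, not a prescribed framework.

RISK_BUCKETS = {
    "high": {"biometric", "health", "precise_geolocation", "government_id"},
    "medium": {"contact_info", "purchase_history"},
    "low": {"aggregated_usage_stats"},
}

REVIEW_BY_BUCKET = {
    "high": "full risk assessment + contractual AI-use restrictions before any new AI feature",
    "medium": "questionnaire on AI features and data retention",
    "low": "standard vendor review",
}

def bucket_for(category: str) -> str:
    """Default to 'high' for anything unclassified, per the assume-sensitive approach."""
    for bucket, categories in RISK_BUCKETS.items():
        if category in categories:
            return bucket
    return "high"

# Example: what review does a vendor's new AI feature trigger for the data it touches?
for category in ["purchase_history", "biometric", "something_new"]:
    bucket = bucket_for(category)
    print(f"{category}: {bucket} risk -> {REVIEW_BY_BUCKET[bucket]}")
```

The default-to-high fallback reflects the assume-sensitivity-and-work-backward approach: unclassified data gets the strictest review until someone affirmatively downgrades it.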
At the end of the day, the data regulatory landscape may feel like a cross-country puzzle, but organizations that invest in readiness, transparency, and partnership will find a way to solve it—while enabling their teams at the same time.
Graphics for this article were created by Kael Rose.