Why Tech Leaders are Making Data Privacy and Trust a Priority in 2025

Data privacy is evolving at an unprecedented pace. As an IT leader, staying ahead of regulatory changes, building robust compliance frameworks, and safeguarding customer data are critical to your role.

The following Tech in Motion event discussion will help equip you with the tools, strategies, and insights needed to navigate these complexities in an increasingly connected world.

The conversation included the moderator/host David Shipley, CEO of Beauceron Security, and panelists Jermaine Oldham, VP of Tech Ops and Information Security at Echo Global Logistics; Cristina Bartolacci, Head of Sales Engineering at Thoropass; and Gavin Anthony Grounds, CEO and co-founder of Mercury Risk and Compliance, Inc.

Continue reading for highlights of the conversation, or watch the video below for all of our panelists' full remarks:

David: What role should leadership play in establishing a companywide data privacy-first culture?

Jermaine: Leaders should communicate the importance of data privacy as a core value by setting a strong example and prizing data privacy in all business discussions and decisions because that will signal significance to the entire organization.

The second thing is developing and endorsing policies. Leadership should be actively involved in the development and endorsement of comprehensive data privacy policies. Their commitment to these policies helps ensure that they're taken seriously throughout the organization.

Third, leaders need to allocate adequate resources, including budget tools and personnel, to support data privacy initiatives. This includes investing in necessary technology and training programs.

Fourth, foster a culture of accountability. Leadership should establish clear accountability and responsibilities for data privacy at all levels, encouraging a culture where employees feel responsible not only for protecting data but also for understanding the consequences of failing to do so.

Fifth, promote education. Leaders should champion ongoing training and awareness programs to ensure all employees understand data privacy risks and responsibilities, including regularly updated training in response to new threats and regulations.

Finally, leaders should embed data privacy into the overall business strategy rather than treating it as a separate initiative or concern. This means considering privacy implications in every aspect of planning and development. Leaders should oversee regular audits and assessments to ensure compliance with data privacy laws and internal policies and hold departments accountable for privacy-related performance metrics.

David: Of the things you suggested, which do you think leaders struggle with most?

Jermaine: When it comes to leaders and what they struggle with the most, one of the challenges I've had and other leaders I've seen have is incorporating privacy into business strategy. The business is running fast, and they are moving along with their initiatives. Sometimes, it's hard to think retrospectively or in the moment about data privacy.

David: How do developments in generative AI privacy policies influence the data privacy landscape for businesses?

Cristina: I think we're going to see a really big emphasis on how we're actually integrating and using AI, and the ethical use and implementation of it is going to be a major focus for a long time.

This isn't new to Europe, for example, with the implementation of GDPR, or even in the United States, where states like California have implemented their own version with CCPA. We're starting to see additional states adopt similar laws as well.

There's going to be a heavy push toward informing buyers of how a model is actually being trained and what information is being used to train it. It is important for employees to have a policy on when, where, and how to use AI tools to do their jobs. As a company, you must have a policy that's widely accepted and clearly published for employees to understand.

David: Can institutions offload privacy compliance burdens by outsourcing to a third party that manages issues like consent and identity management, or is it better to maintain in-house control?

Gavin: It’s incumbent on us that if we are under a particular law, we have to comply with certain citations. This applies not just to privacy, it is anything that's regulated or anything where there are policies and standards that we need to meet. We cannot outsource accountability. What we can do is break it down into specific functions where it might be more operationally efficient and even financially efficient to have a third party do those.

Could we use AI to enhance the performance of some of those processes? We cannot outsource our accountability, but we can take specific functions and see how we can drive efficiency in the way we operate those sub-functions.

David: What are the key challenges organizations continue to face when navigating global privacy regulations?

Cristina: The goalposts have continued to move a little bit, so understanding what the baseline is, what it will be going forward, and what you need to adhere to has been difficult for companies trying to meet specific criteria.

GDPR has already been in place for a long time, and CCPA as well, but individual laws with additional requirements have been popping up state by state. So, if you're a company, you need a solid understanding of what's in front of you from a compliance and risk management perspective, because the requirements are constantly changing.

Even if you're assigning that task to a specific GRC manager, risk professional, or security analyst, that's fabulous. But how is management assisting, and how do you make that a broader management buy-in goal?

Read More: Exploring the Future of Data and AI Trends

David: I have seen this before, where a senior executive learns about a regulatory requirement while figuring out how to notify various officials about a privacy breach. I would suggest that's probably the worst time to learn about your regulatory requirements.

Cristina: It gets even worse when you have a breach or are in violation of the law. We've talked about the economic impact, but if you get fined, these are not small fines. GDPR is one of the regulations that cracks down the hardest.

David: What tools or frameworks are essential for helping with that continuous compliance and also the treadmill of evolving data privacy laws?

Jermaine: Top of mind for me is data mapping software, which can help organizations map their data flows and identify what data has been collected, how it's used, and where it's stored, all of which is crucial for compliance. These tools can also provide functionalities for managing privacy policies, conducting assessments, and maintaining records of processing activities.

Risk assessment tools help organizations conduct privacy impact assessments and risk assessments to identify and mitigate risk. Incident management software can assist organizations in managing data breaches and privacy incidents, ensuring compliance and notification requirements. Online training and awareness solutions can provide data privacy training for employees, ensuring they understand their responsibilities and the importance of compliance. Data loss prevention tools can help monitor and protect sensitive data from unauthorized access or breaches.

As far as frameworks, GDPR compliance provides a structured approach to meeting the requirements of the regulation, including principles of transparency, data minimization, and user rights. The NIST Cybersecurity Framework provides guidance on identifying, protecting against, detecting, responding to, and recovering from data privacy incidents. ISO/IEC 27001 is an international standard that outlines best practices for information security management systems and helps organizations establish and maintain data protection protocols. The COBIT framework helps organizations manage and govern their IT and data privacy processes effectively. CCPA compliance adopts specific guidelines and frameworks tailored to the California Consumer Privacy Act to ensure compliance with state-specific data privacy laws.

David: Of all the frameworks you mentioned, which is the one you love the most, and which is the one you dislike the most?

Jermaine: I love the NIST Cybersecurity Framework (CSF). We've developed a defense-in-depth strategy built around identifying, protecting against, detecting, responding to, and recovering from cyber and privacy threats.

I love the framework because it allows you to map out where you are from a maturity perspective and to pinpoint where you need the most help and where to focus.

The one I like least is GDPR compliance. It's deep, wide, and complex; there's a lot of interpretation you must do, and if you breach it, the fines and penalties are substantial.

David: How can businesses harness foundational AI models while reducing the potential of leaking proprietary or sensitive information?

Gavin: A lot of companies, especially in cyber, are using AI already and don't know that they are. As an example, some of the vulnerability management tools that participants in the audience might be using are actually already using AI.

AI, for example, will take all of the frameworks and regulations that you are required to comply with, or desire to comply with, and will map them within minutes.

When we're using AI, whatever language model it relies on will learn not only new pieces of information; it will draw conclusions. What our eye should be on is what the learning model is and how we're protecting our IP.

So, let's say we're about to open up in a new market, a new industry, or a new geography. We want to be judicious about which AI tools we use, and in particular whether it is a private large language model or a public one. If it learns from my questions, is it going to disseminate that into a larger model that's learning from a thousand companies?

So, it truly comes down to understanding the learning model behind the large language models, and making sure that where we do use a public model, it's because that gives us the best result.
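One common way to reduce the leakage risk Gavin describes is to redact obviously sensitive values client-side before any prompt reaches a public model. Here is a minimal sketch; the patterns and placeholder tokens are illustrative assumptions, and a real deployment would need far broader PII and IP detection than two regexes.

```python
import re

# Illustrative patterns only; production systems need much broader detection.
REDACTION_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),       # email addresses
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD_NUMBER]"),  # card-like digit runs
]

def redact(prompt: str) -> str:
    """Replace sensitive-looking substrings before a prompt leaves your environment."""
    for pattern, placeholder in REDACTION_PATTERNS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

print(redact("Contact jane.doe@example.com about card 4111 1111 1111 1111"))
# → Contact [EMAIL] about card [CARD_NUMBER]
```

A gateway like this sits between employees and public AI tools; it does not replace the private-deployment option Gavin turns to next, but it narrows what a public model can ever learn from your prompts.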

David: What do you mean by private?

Gavin: It doesn't matter what platform you're on. You could be an AWS shop, you could be an Azure shop, or you could be a Google Cloud shop. Once you’ve pulled it down, it's now only learning from your environment, and it's only learning from your people.

That doesn't mean what it learned on the outside won't become a little stale. You must have a refresh cycle, but once you do that private implementation of an LLM, it is private, and you get to control that it will only learn going forward from your environment.

Then the challenge becomes refreshing what it would have learned if it was public.

David: How can organizations rebuild trust in the event of a data breach or privacy incident?

Cristina: Accountability is one of the biggest things. It’s the first step. You have to take responsibility and ownership for what happened.

One of the biggest things that happens when you see a large corporation that has a breach is that the quickest thing that they do is a finger-pointing game. That does so much more harm than good. Typically, it's never one person or one system's failure, it's a perfect storm that causes a catastrophic situation.

In the rebuilding phase, you need to show actionable, quantifiable steps that demonstrate the measures you are taking in response.

It may be as rudimentary as redefining or reworking your privacy policy and your information security policy. Having a publicly available trust center, for example, can also assist in having a place where companies can feel confident working with you again to know that what happened before isn't going to happen again.