Are You Putting the “P” in DLP?

Guest blogger: Derek Brink, Aberdeen Group

Data loss prevention (DLP) solutions are designed...well, to prevent the loss of enterprise data. Said a bit more formally: by “loss,” we mean the confirmed disclosure of an organization’s data assets to an unauthorized party—i.e., a data breach. Said still another way, DLP solutions are designed to reduce the risk of a data breach.

This raises an obvious question, which unfortunately doesn’t often get a crisp answer: just what is the risk of a data breach? To answer this question in a way that’s useful to an organization’s senior leadership team, security professionals and solution providers have to consider both the likelihood that a data breach will happen in a specified period of time and the resulting business impact if it actually does occur. That’s simply the proper definition of risk.

To help address this glaring need, Aberdeen continues to look for ways to leverage the growing body of publicly available data regarding the likelihood (e.g., Verizon DBIR), size (e.g., Thales eSecurity breachlevelindex.com), and business impact (e.g., Ponemon Cost of a Data Breach) of data breaches to quantify the annualized risk of a data breach, as risk is properly defined. That is, not as a falsely precise, single-point estimate, but as an estimate of a range of possible outcomes and their associated likelihoods.

For example, for the private sector as a whole (all industries), Aberdeen’s Monte Carlo analysis shows that the median total business impact of a data breach is about $500K. Even more important, however, is that there’s a 10% likelihood that the total business impact of a data breach exceeds $1.8B. This is the “long tail” of risk that’s so important for the senior leadership team to understand: it is the part of the risk exceedance curve that has the greatest influence on how business decisions ultimately get made.
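The shape of this kind of analysis can be sketched in a few lines of Python. This is only an illustration: the lognormal distribution and its parameters are assumptions chosen so the simulated median lands near $500K, not Aberdeen’s actual model, and the simulated tail will not reproduce the $1.8B figure.

```python
import math
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical lognormal model of total breach impact. The parameters are
# assumptions: MU puts the median near $500K; SIGMA controls how heavy the
# right tail is. This is NOT Aberdeen's actual model.
MU = math.log(500_000)   # the median of a lognormal is exp(MU)
SIGMA = 2.5              # log-scale spread

N = 100_000
impacts = sorted(random.lognormvariate(MU, SIGMA) for _ in range(N))

median = statistics.median(impacts)
p90 = impacts[int(0.90 * N)]  # 10% of simulated outcomes exceed this value

print(f"Median simulated impact:       ${median:,.0f}")
print(f"90th percentile ('long tail'): ${p90:,.0f}")
```

The point the sketch makes is the same one the analysis makes: the median and the tail of a heavy-tailed distribution are very different numbers, and reporting only a single-point estimate hides the tail.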

As a point of reference, Aberdeen’s analysis of publicly available data (e.g., NetDiligence Cyber Claims Study) also shows that the median payout of cyber data breach insurance claims is about $80K. In other words, cyber insurance payouts are covering less than 20% of the total business impact at the median ($80K out of $500K), and less than 2% of the total business impact at the long tail ($26.3M out of $1.8B).
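The coverage percentages follow directly from the figures cited above; a quick arithmetic check, using the post’s own numbers:

```python
# Figures quoted in the post (Aberdeen / NetDiligence analyses):
median_impact = 500_000       # median total business impact of a breach
tail_impact = 1_800_000_000   # impact at the 10%-likelihood "long tail"
median_payout = 80_000        # median cyber insurance claim payout
tail_payout = 26_300_000      # payout cited at the long tail

median_coverage = median_payout / median_impact
tail_coverage = tail_payout / tail_impact

print(f"Coverage at the median:    {median_coverage:.0%}")   # 16%
print(f"Coverage at the long tail: {tail_coverage:.1%}")     # 1.5%
```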

[Chart: Data breach cost analysis]

And that is why we need to address the issue of putting the “P” in DLP. How much of that risk is your organization’s senior leadership team willing to accept?

Using content-aware, monitoring/filtering technologies such as DLP to identify valuable or regulated data is necessary, but by itself that isn’t enough. Having the means on the front end to accurately identify and classify data that needs to be protected is an important prerequisite for the ultimate goal: a flexible, automated capability on the back-end to enforce security policies and protect the data.

Aberdeen’s research has shown that organizations with DLP initiatives generally use three high-level strategies to enforce their security policies and protect their data:


  • None/Passive: this approach corresponds to controls such as logging for audit purposes, and sending notifications to administrators, users, and/or managers. Many would refer to this as a “learn mode” approach, in the sense that it helps to provide a baseline of how valuable or regulated data is being used, without creating friction or disruption in the organization’s workflows.
  • User-based: this approach requires a human (an administrator, or a user) to make a decision about the data that has been identified, and the controls that should be invoked to enforce the organization's policies. For example: the DLP system might identify customer data that needs to be protected in compliance with data privacy regulations, and the user needs to decide to encrypt it (and how) before it moves to the next phase of the business process.
  • Automated: this approach refers to controls that are invoked automatically to protect data that content-aware technologies have identified as valuable or regulated. For example, PKWARE’s DLP inspection capability applies policy-driven encryption automatically, while still enabling DLP inspection of the encrypted data.
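As an illustration only, the three strategies can be sketched as a simple dispatch. The function and callback names here (`handle_sensitive_data`, `encrypt`, `notify`, `prompt_user`) are hypothetical stand-ins, not any vendor’s actual API:

```python
from enum import Enum

class EnforcementMode(Enum):
    PASSIVE = "passive"        # log and notify only ("learn mode")
    USER_BASED = "user-based"  # a human decides what to do
    AUTOMATED = "automated"    # policy-driven protection, no human in the loop

def handle_sensitive_data(path, mode, encrypt, notify, prompt_user):
    """Dispatch over the three enforcement strategies.

    `encrypt`, `notify`, and `prompt_user` are hypothetical callbacks
    standing in for real DLP/encryption integrations.
    """
    if mode is EnforcementMode.PASSIVE:
        notify(f"Sensitive data detected in {path}; logged, no action taken")
    elif mode is EnforcementMode.USER_BASED:
        if prompt_user(f"Encrypt {path} before it leaves the enterprise?"):
            encrypt(path)
    else:  # EnforcementMode.AUTOMATED
        encrypt(path)
```

The design point the sketch captures: the detection step is identical in all three modes; what differs is whether the response is a log entry, a human decision, or an automatic control.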

The time-tested strategy of “first crawl, then walk, then run” is a pragmatic approach for successful, enterprise-wide rollouts of data protection initiatives, and Aberdeen’s research has shown that DLP initiatives are no exception. Running a DLP solution in passive mode generates valuable visibility and insight into the current flow of information throughout the extended enterprise, and reduces the likelihood of inadvertently bringing the flow of information (and the business itself) to a halt.

As DLP initiatives mature, however, an automated approach to protecting valuable or regulated data is the key to putting the “P” in DLP—and helps to achieve the goal of reduced risk, along with support for higher scale at lower overall cost.

PKWARE’s DLP enhancement technology allows DLP systems to inspect encrypted data in outgoing traffic, so that organizations can enforce their security policies without breaking workflows or requiring manual intervention.

Derek E. Brink, CISSP

Vice President and Research Fellow, Aberdeen Group
Adjunct Faculty, Harvard University and Brandeis University
www.linkedin.com/in/derekbrink
