Anatomy of a data breach: Security from the inside out
Knowing exactly where an organization's proprietary data lies and having the confidence that it's safe has never been more important – or costly.
 
The cost of manually identifying, monitoring, auditing and protecting critical and highly sensitive proprietary information can run into the millions. It's a time-intensive process that isn't always feasible, especially in today's economy.

As organizations cut IT budgets and tighten resources across the enterprise, these data auditing approaches need to be repeatable, efficient, and ultimately, demonstrate a significant return on investment (ROI).

The “right” data auditing approach: Manual vs. automated

Internal audit standards and regulatory compliance, privacy, data governance, security and risk management requirements are driving today's organizations to use two types of data auditing processes: manual and automated.

Both data auditing approaches address three core challenges:
  • Data discovery and risk assessment: Discovering the key databases and warehouses holding sensitive data within the enterprise, and quantifying risk levels (e.g., unencrypted personally identifiable information (PII)).
  • Data auditing: Keeping an audit trail for high-risk activities – like privileged user data operations – and access activity around compliance data. These activities are usually escalated in the form of periodic reports to auditors and compliance managers.
  • Data activity reviews/alerting for protection: Real-time monitoring and reviewing of high-risk data repositories, so that data non-compliance, theft, leakage or compromise can be detected and mitigated. Most compliance regulations require reviews, but doing reviews across millions of activities is laborious. Additionally, large-scale data theft is usually traced back to databases, so having a real-time form of alerting and risk mitigation is desirable from a security standpoint.
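As a rough illustration of the first challenge, a discovery scan can flag columns that look like unencrypted PII. The inventory, column names and name patterns below are purely hypothetical assumptions for the sketch – a real scan would also sample actual data values:

```python
import re

# Hypothetical inventory: database -> list of (column_name, is_encrypted)
inventory = {
    "crm_db": [("customer_ssn", False), ("email", False), ("order_id", True)],
    "hr_db":  [("salary", True), ("home_address", False)],
}

# Illustrative PII name patterns, not an exhaustive list
PII_PATTERNS = re.compile(r"ssn|email|address|phone|dob", re.IGNORECASE)

def assess_risk(inventory):
    """Return columns that look like PII and are stored unencrypted."""
    findings = []
    for db, columns in inventory.items():
        for name, encrypted in columns:
            if PII_PATTERNS.search(name) and not encrypted:
                findings.append((db, name))
    return findings

print(assess_risk(inventory))
```

Quantifying risk then becomes a matter of scoring the findings, for example by weighting the sensitivity of each flagged column.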
Organizations just getting their feet wet in data auditing usually start with manual approaches. Many learn, however, that this approach can impose a significant cost burden and can distract IT from its core mission.

An automated approach has emerged in the past few years. While it requires a small up-front investment, it can significantly reduce the cost and overhead of data auditing and free up enterprise resources in the long term.



The data auditing lifecycle
Before diving into a side-by-side comparison of costs, it's important that organizations understand the six steps of a data auditing lifecycle, common to both manual and automated approaches:

1. Scoping: This step involves identifying and classifying data stores that are in scope for compliance. It also involves identifying the specific activities, users and data relevant to the actual compliance requirement. In many enterprises, this step is done in an ad hoc manner, without a systematic discovery methodology that can identify and classify what is in scope. As a result, scoping becomes inaccurate and, in many cases, isn't repeatable across multiple quarters or years.
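What makes scoping repeatable is a deterministic rule applied to a maintained catalog, rather than ad hoc judgment. A minimal sketch, assuming a hypothetical catalog of data stores with metadata tags (the store names, tag names and scoping rule are all illustrative):

```python
# Hypothetical catalog of data stores with metadata tags
stores = [
    {"name": "payments_db", "tags": {"cardholder-data", "production"}},
    {"name": "test_db",     "tags": {"cardholder-data", "test"}},
    {"name": "finance_db",  "tags": {"financial-reporting", "production"}},
]

# Illustrative rule: in scope for PCI if it holds cardholder data in
# production. Deterministic, so the result is repeatable every quarter.
def pci_scope(stores):
    return sorted(s["name"] for s in stores
                  if {"cardholder-data", "production"} <= s["tags"])

print(pci_scope(stores))  # only payments_db qualifies
```

Because the rule is explicit, next quarter's scope can be re-derived and diffed against this quarter's, which is exactly what ad hoc scoping can't do.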

2. Collection: This step involves logging transactional data activities from databases or file servers. It's one of the hardest steps, since it typically requires figuring out how to turn on native auditing within the application to capture the required activities. Native auditing relies on extra logging by the database, which can significantly degrade application performance; additional capacity may need to be provisioned for this type of auditing. Some types of activities – like logging disclosure activity in an application – may not be captured by native auditing at all. Additionally, native auditing creates log data within the application, so there is an immediate data management problem: how often should this data be cleared out, and where should it go?
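Conceptually, collection wraps every statement with a log write, which is where both the performance overhead and the log-growth problem come from. A toy sketch using Python's built-in sqlite3 module stands in for the database's native audit facility here; the user names and statements are made up:

```python
import sqlite3
from datetime import datetime, timezone

audit_log = []  # in a real deployment this grows fast and must be managed

def audited_execute(cur, user, sql, params=()):
    """Execute a statement and record who ran what, and when."""
    audit_log.append((datetime.now(timezone.utc).isoformat(), user, sql))
    return cur.execute(sql, params)

con = sqlite3.connect(":memory:")
cur = con.cursor()
audited_execute(cur, "dba_alice", "CREATE TABLE customers (ssn TEXT)")
audited_execute(cur, "dba_alice", "SELECT ssn FROM customers")
print(len(audit_log))  # every statement adds an entry
```

Every execution pays the cost of the extra write, and the log list only ever grows – the two problems the paragraph above describes.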

3. Storage: All logged transactional activity must be stored for at least six months so auditors and security personnel have the opportunity to analyze it, if needed. It must also be on hand in case custom reports are required. This step requires enterprises to provision an additional database or file server where the data can be stored. Along with storage comes the requirement of security: it's absolutely vital that the data remains both intact and tamper-proof. Servers holding compliance logs are typically locked down with strong auditing and multilevel access controls.
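The retention requirement translates into a simple cutoff rule: keep everything newer than roughly six months, purge the rest (after archiving, if policy requires it). A sketch with illustrative dates:

```python
from datetime import date, timedelta

RETENTION = timedelta(days=183)  # roughly six months, per the requirement

def purge(entries, today):
    """Keep only audit entries newer than the retention cutoff."""
    cutoff = today - RETENTION
    return [(d, e) for d, e in entries if d >= cutoff]

entries = [(date(2009, 1, 5), "grant"), (date(2009, 8, 1), "select")]
kept = purge(entries, today=date(2009, 9, 1))
print(kept)  # only the August entry survives a September purge
```

In practice the purge job itself must be audited too, since a purge is exactly the operation an attacker would use to cover their tracks.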

4. Filtering: Once the transactional activity is available, enterprises must write home-grown filtering tools to reduce the data to compliance-relevant information for reporting. This requires non-trivial manual development effort. In some cases, native audit output may only need reformatting; in others, it may require complex correlation to reconstruct compliance-relevant activity. Obviously, all bets are off when native auditing misses relevant activity altogether.
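At its simplest, such a home-grown filter keeps privileged-user activity and anything touching sensitive objects, and drops the rest. The entries, user list and object list below are assumptions for illustration:

```python
# Hypothetical raw audit entries: (user, action, object)
raw = [
    ("dba_alice", "SELECT", "customers"),
    ("app_svc",   "SELECT", "orders"),
    ("dba_alice", "GRANT",  "customers"),
    ("app_svc",   "INSERT", "orders"),
]

PRIVILEGED = {"dba_alice"}  # assumed privileged-user list
SENSITIVE = {"customers"}   # assumed sensitive-object list

def compliance_filter(entries):
    """Keep privileged-user activity and any access to sensitive objects."""
    return [e for e in entries if e[0] in PRIVILEGED or e[2] in SENSITIVE]

print(compliance_filter(raw))  # the app_svc orders activity drops out
```

The real development cost lies in maintaining the PRIVILEGED and SENSITIVE sets and correlating entries across log formats – the rule itself is the easy part.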

5. Reporting: Auditors require reports, which are the basic currency of all data auditing. However, delivering reports that “work” isn't easy, and enterprises usually struggle to develop them. They must contain information at the right level so auditors can quickly understand the risk level of the specific IT control. Application owners are also expected to share their interpretation of, and context behind, the reports with auditors. The reports need to answer critical questions – who are the privileged users, what activities are they performing, which activities are high-risk, what critical data assets should be monitored and what constitutes a privacy violation? Customizing existing reporting and business intelligence platforms isn't easy because they lack compliance domain expertise. When all other options fail, enterprises sometimes dump all of their data on the auditors in “security through obscurity” or “compliance through exhaustion” approaches, which don't work.
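A report at "the right level" aggregates rather than dumps. As a sketch, the filtered entries can be rolled up into a per-user activity summary an auditor can scan in seconds (entry data is illustrative):

```python
from collections import Counter

# Filtered, compliance-relevant entries: (user, action)
entries = [("dba_alice", "SELECT"), ("dba_alice", "GRANT"),
           ("dba_bob", "SELECT"), ("dba_alice", "SELECT")]

def privileged_user_report(entries):
    """Summarize who did what, at a level an auditor can read quickly."""
    report = {}
    for user, action in entries:
        report.setdefault(user, Counter())[action] += 1
    return report

print(privileged_user_report(entries))
```

The summary answers "who are the privileged users and what are they doing" directly, instead of leaving the auditor to infer it from thousands of raw entries.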

6. Analytics and reviews: From a risk management standpoint, this last step is the most critical. All reports and compliance-relevant audit trail activity must be reviewed to assess risk. Any outliers, anomalies or suspicious activity must be flagged, and any violation of compliance policy must be tracked down. This means data auditing reports shouldn't just be generated and shipped; they must also be reviewed and attested. From a resource standpoint, this step is very expensive because it requires IT staff to manually thumb through thousands of audit entries every single day. For example, a retail employee might spend six hours each day going through PCI logs. While these compliance reviews are essential, they can distract IT from its core mission and reduce productivity.
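Even a crude automated pre-screen can shrink the daily review burden by surfacing only the outliers. A minimal sketch, where the baselines, counts and threshold factor are all assumptions:

```python
# Daily activity counts per user; baselines and threshold are assumptions
baseline = {"dba_alice": 40, "dba_bob": 35}
today = {"dba_alice": 42, "dba_bob": 300}  # bob is far above normal

def flag_outliers(today, baseline, factor=3):
    """Flag users whose activity exceeds `factor` times their baseline."""
    return [u for u, n in today.items() if n > factor * baseline.get(u, 0)]

print(flag_outliers(today, baseline))  # only dba_bob needs a closer look
```

Instead of thumbing through every entry, the reviewer starts from a short list of anomalies and attests to the rest.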

The real costs of a manual approach
Each of the above steps carries costs in the manual approach, depending on the scale and depth of the step. A departmental project, for example, requiring a manual data audit of 100 databases for a single compliance mandate like SOX might accrue costs of $2.2 million over a three-year period (see Figure 1).


Figure 1.

Assuming each database costs about $50,000 from an overall IT infrastructure perspective, the data auditing “tax” per database in this example is roughly $2.2 million divided by 100, or $22,000 – about 44 percent of the per-database cost. In other words, the enterprise is spending an extra 44 percent of its database budget and resources on compliance – certainly not an acceptable number.

It also means that IT staff are spending a disproportionate amount of time on compliance instead of the business-facing aspects of their applications, like availability, functionality and performance.
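The per-database tax arithmetic above can be checked in a few lines, using the figures from the example:

```python
total_cost = 2_200_000        # three-year manual audit cost (Figure 1)
databases = 100
db_infrastructure_cost = 50_000  # assumed per-database IT cost

tax_per_db = total_cost / databases
tax_pct = round(tax_per_db / db_infrastructure_cost * 100)
print(tax_per_db, tax_pct)  # $22,000 per database, 44 percent
```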
 
The real costs of an automated approach
Automated approaches, on the other hand, can significantly reduce the cost of data auditing by offering turnkey capabilities that encompass the same steps. Automated approaches typically involve installing data auditing appliances that can support multiple databases simultaneously. With the right ease of use and scalability, this appliance-based approach can enable organizations to cut costs dramatically.

Using an automated approach, the same 100-database departmental project would cost $586,000 – a fraction of the $2.2 million price tag associated with the manual approach. The productivity gains are also significant (see Figure 2).


Figure 2.
 
To further build the business case, it's helpful to package the benefits into three relevant metrics: ROI, payback period and cost reduction. ROI frames the relative benefit of an investment in avoiding a much larger organizational cost. Payback period indicates that while data auditing requires an up-front investment, it is relatively small in the scope of ongoing compliance costs being borne by the organization. Cost reduction is a way of representing the reduction in hard and soft costs of compliance. Depending on the organization, any one or combination of these metrics can be helpful in articulating the business case. These metrics are defined in Figure 3.


Figure 3.
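As an illustration of how such metrics combine, the sketch below applies commonly used definitions of ROI, payback period and cost reduction to the example figures from this article. The up-front investment amount is a hypothetical assumption, and these formulas are generic – the precise definitions are those given in Figure 3:

```python
manual_cost = 2_200_000     # three-year manual cost (Figure 1)
automated_cost = 586_000    # three-year automated cost (Figure 2)
investment = 200_000        # hypothetical up-front appliance investment

savings = manual_cost - automated_cost
roi_pct = savings / investment * 100            # common ROI definition
cost_reduction_pct = savings / manual_cost * 100
payback_years = investment / (savings / 3)      # assumes even savings per year

print(round(cost_reduction_pct))  # roughly 73 percent lower cost
```

Any one of these numbers – a large ROI, a payback period well under a year, or a cost reduction near three-quarters – can anchor the business case, depending on which framing the organization's decision-makers respond to.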