Earlier this year, many network security professionals were surprised to learn that data breach costs are actually on the decline. According to research from Symantec and the Ponemon Institute, the average cost of a data breach fell 24 percent in 2011, while the cost per compromised record dropped 10 percent from 2010 to $194, a five-year low.
The reason? Many experts agree that the decline in data breach costs is directly tied to the desensitization of end-users. Fewer customers defect when their sensitive personal data is stolen or improperly accessed. Corporate reputations aren’t taking the hit they once did following a major security incident. All in all, the general public has become more tolerant of data breaches, and the result is a smaller financial impact when they occur.
One could argue that this is good news for companies that have limited security budgets. More often than not, these companies are forced to ask themselves whether it costs less to weather a data breach than to prevent one from happening. The problem is that it sets the stage for apathy. More companies may choose to focus less on actually securing data and more on how to respond appropriately to inevitable attacks. Furthermore, they may lack the motivation (and a justified business case) to increase security measures in tandem with threat sophistication.
Lower data breach costs should be a warning bell to those of us in the information security community. It means we’re at a crossroads where we choose to either fight harder and smarter, or stoke the fires that burn us. The challenge is using resources wisely and right-sizing a network security strategy that prioritizes protection. Here are four tips for overcoming this challenge:
Know your weaknesses better than your strengths: If you don’t know where the weak spots are in your network, you can bet hackers will find them for you. In the balancing act of deciding on the proper level of security versus cost constraints, it’s imperative to understand the weakest links in your network so the appropriate countermeasures can be implemented. For some, this may take the form of educating workers on how to securely access cloud-based data; for others it may involve fortifying a particular area of the network. No matter where the weak spots lie, taking the time to identify them now will stave off disaster in the future.
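One way to make the weak spots visible is to rank assets by a rough exposure score. The sketch below is illustrative only: the asset names, attributes and weights are hypothetical, and a real audit would draw on scan results, patch levels and access logs rather than hand-entered estimates.

```python
# Minimal sketch (hypothetical data): rank network assets by a simple
# exposure score so the weakest links surface first.

# Each asset: (name, known_vulns, internet_facing, holds_sensitive_data)
assets = [
    ("vpn-gateway", 3, True,  False),
    ("hr-database", 1, False, True),
    ("public-web",  5, True,  False),
    ("file-server", 2, False, True),
]

def exposure_score(known_vulns, internet_facing, sensitive):
    # Weighting is illustrative only: internet exposure and data
    # sensitivity amplify the impact of each known vulnerability.
    score = known_vulns
    if internet_facing:
        score *= 2
    if sensitive:
        score *= 3
    return score

ranked = sorted(assets, key=lambda a: exposure_score(*a[1:]), reverse=True)
for name, *attrs in ranked:
    print(name, exposure_score(*attrs))
```

However crude the weights, the exercise forces the question the paragraph raises: which link would an attacker find first?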
“For most enterprises, it isn’t possible to provide the highest protection against every known threat.”
– Matt McKinley, U.S. director of product management, Stonesoft
Understand the costs of protection versus the cost of avoidance: In an ideal world, protections would be deployed at every point data traverses. Of course, this is prohibitively expensive and unrealistic for most companies. Here in reality, choices have to be made regarding the most important assets and how to protect them. In parallel, you must understand which assets are acceptable to put at risk. It is critical to have a clear understanding of the cost associated with a compromise of both critical and non-critical assets. When this understanding is reached, security devices and controls can be repositioned or reformulated to ensure that the most critical assets have the proper level of protection. Avoidance, on the other hand, may be necessary to ensure that other, more critical, assets are well protected. Beware, however, that avoidance, even when well calculated, is not without cost and risk. Make certain to accurately calculate the cost of a compromise of non-critical assets, and to account for any path they might provide to critical assets.
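The protection-versus-avoidance decision above reduces to simple expected-loss arithmetic: protection pays for itself when it costs less than the loss it prevents. The figures below are hypothetical placeholders, not the article's numbers, and a real analysis would also weigh the paths non-critical assets open to critical ones.

```python
# Hedged sketch: compare annual protection cost against annualized
# expected loss (estimated breach probability x estimated breach cost).
# All figures are hypothetical.

def expected_annual_loss(breach_probability, breach_cost):
    return breach_probability * breach_cost

def worth_protecting(protection_cost, breach_probability, breach_cost):
    # Protection is justified when it costs less than the expected loss.
    return protection_cost < expected_annual_loss(breach_probability, breach_cost)

# Critical asset: 20% annual chance of a $1,000,000 breach.
print(worth_protecting(150_000, 0.20, 1_000_000))  # True: expected loss is $200,000
# Non-critical asset: 5% annual chance of a $50,000 breach.
print(worth_protecting(150_000, 0.05, 50_000))     # False: expected loss is $2,500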
Develop a well-designed technical and public response plan: When compromise does happen, there are two aspects to recovery: a technical response plan and a public response plan. Once the weak spots are identified, the costs of protection or avoidance have been calculated, and you accept the reality of impending risk, you can develop a plan that will allow you to recover quickly. Hiding the compromise or disaster, much as in the case of Chernobyl, can be a PR nightmare and will only serve to exacerbate the problem. The old adage that honesty is the best policy certainly applies here. But your honesty should be calculated and, as much as possible, anticipated through careful planning.
Accept false causality: Finally, recognize that statistics have a certain value, but they should serve more as data points in multi-year trends than as individual indicators. This is especially true when it comes to security, where there is no shortage of research and expert opinion. Allowing statistics to dictate your decisions can be very dangerous. Statistics, even when they appear rigorously calculated, still reflect an element of chance. Moreover, the research may have left multiple variables unaccounted for. In the never-ending game of offense and defense in the world of security, would you want to leave anything to chance?
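The danger of unaccounted variables can be shown with a toy example. In the hypothetical data below, a hidden factor (company size) drives both security spending and breach counts, so the two correlate strongly even though neither causes the other; a naive reading of the statistic would suggest spending invites breaches.

```python
# Illustrative sketch with invented data: two variables correlate only
# because a hidden third variable drives both.

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from first principles.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

company_size = [10, 50, 100, 500, 1000]              # hidden driver
security_spend = [s * 2 for s in company_size]       # grows with size
breach_count = [s // 100 + 1 for s in company_size]  # also grows with size

r = pearson(security_spend, breach_count)
print(round(r, 2))  # strong positive correlation, yet no causal link
```

This is the trap the tip describes: a published figure can look rigorous while the variable that actually explains it was never measured.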
For most enterprises, it isn’t possible to provide the highest protection against every known threat. Even if they could, there is always a new, more dangerous and more costly threat on the horizon. The solution is understanding how to correctly prioritize network security strategies and tasks as new (and old) threats emerge within your unique security environment.
Matt McKinley is U.S. director of product management for Stonesoft. He writes and speaks frequently on how organizations can overcome the complexities of network security and the current threat landscape.