
Closing the gate: Data leak prevention

The Forty Thieves had a problem named Ali Baba. Stealthily penetrating their treasure lair, in the famous “Arabian Nights” tale, he made off with a load of gold coins and threatened to come back for more. Although the story has many a twist and turn, the thieves' draconian measures to protect their treasure in the face of this “security breach” ultimately failed.

It's a lesson for today's CISO, who needs security measures far more arcane and complex than a simple “Open Sesame” password to guard corporate treasures. Yet, as many have found, systems are always going to be breached, so additional focus needs to be placed on making data “exfiltration” far more difficult, whether the breach is accomplished through an insider or via undetected malware.

OUR EXPERTS: Stopping leaks


James Bindseil, president and CEO, Globalscape 

Anton Chuvakin, research VP,
security and risk management, Gartner

John Pescatore, director of emerging trends, SANS Institute 

Peter Tran, senior director, worldwide advanced cyber defense practice, RSA

Randy Trzeciak, technical manager of the CERT Insider Threat Center, Carnegie Mellon Software Engineering Institute 

Wade Williamson, director of product marketing, Vectra Networks

“Outbound traffic is the key enabler of modern attacks – it links internal malware to the outside attacker, allowing a near infinite ability for the attack to adapt and spread over time,” notes Wade Williamson, director of product marketing at Vectra Networks, a San Jose-based vendor of cyber attack detection technology. “In addition to the control functions, outbound channels represent the actual path of loss where key data and assets leave the target organization. In short, it's the source of both harm and complexity in modern attacks,” he says.

Of course, detecting outbound traffic is just a first step. A possible symptom of data leakage is increased use of external sites, and the most obvious means of detecting that leakage is to implement a network monitoring and data loss prevention (DLP) system, which can help to identify information leaking from the organization, says James Bindseil, president and CEO of Globalscape, a San Antonio, Texas-based provider of secure file transfer solutions. “More generically though, you need to make sure all of the different ways that leakage can occur are protected, and it is important that all communications mechanisms are a part of the DLP solution,” he says. For example, tools that can integrate into the broader security and DLP solutions, through methods such as internet content adaptation protocol (ICAP) integration, can provide warning signs that indicate a problem.
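
To make the idea concrete, the sketch below (Python, with hypothetical patterns and names) shows the kind of simple outbound content check a DLP hook – whether wired in via ICAP or any other integration point – might run against an outgoing payload. Production DLP engines use far richer fingerprinting, exact-data matching and contextual policies; this is only an illustration of the principle.

```python
import re

# Hypothetical, simplified patterns a DLP policy might flag; real engines use
# far richer fingerprinting, exact-data matching and contextual rules.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "internal_marker": re.compile(r"\bCONFIDENTIAL\b", re.IGNORECASE),
}

def scan_outbound_payload(payload: str) -> list:
    """Return the names of any sensitive-data patterns found in an outbound payload."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(payload)]

if __name__ == "__main__":
    body = "Quarterly numbers attached - CONFIDENTIAL - do not distribute"
    hits = scan_outbound_payload(body)
    if hits:
        print("Flag or block this transfer; matched policies:", hits)
```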

In fact, notes Peter Tran, senior director of the worldwide advanced cyber defense practice at RSA, a Bedford, Mass.-based network security company, a traditional perimeter-only defense approach is not effective any more given the overwhelmingly porous nature of networks today and the increasing requirement for global interconnectivity. That implies, in his view, crafting a strategy that combines different security methods. Thus, a risk-based approach to cyber defense is needed that considers which assets are most critical – with business context and a risk index tied to business impact or loss. “This approach should be implemented across multiple domain areas – such as incident response, cyber intelligence, analytic intelligence – to provide balanced capabilities across critical security operational areas in addition to traditional layered defense-in-depth,” Tran explains.

He says in most cases the first priority in detecting data exfiltration, or “leakage,” is anchored on an intelligence-driven security strategy he calls the “cyber defense triad” – an organization's capability across people, process and technology. To execute this strategy, organizations need the ability to identify the “people” who may be attacking and to understand how and why they are targeting the organization. Further, it is vital to understand the process and gain insight into the attacker via host and network behavioral analytics. Finally, it means having the right technology in place so that data never leaves the perimeter.

“The output of this analysis – combined with workflow and process automation – helps analysts in a security operations center (SOC) to establish a visualization of the threat infrastructures being used to compromise specific high-value areas of a given organization's network,” Tran says. Security practitioners can then perform infrastructure takedowns to disrupt these covert channels from communicating outbound, he adds.

That constitutes the basics, in Tran's view. But there's more, much more. He says it is also increasingly vital that organizations have the ability to monitor and detect pre-weaponized covert channels piggybacking on legitimate outbound communications to trusted partner or supply chain connections. This is commonly referred to as the inside-out agent challenge, he notes, and it happens when an attacker takes advantage of trust relationships between multiple entities and then uses legitimate channels as “data mules” to exfiltrate data by way of multiple hops and dead drops, called “switch targets.”

“These inside-out agents are extremely difficult to detect due to the lack of overt network anomalies,” he says. One approach to detection in these cases is to look for smaller deviations in data communication sizes, timing, artifact lateral movements, machine-to-machine (M2M) role-based authentication violations and failed login attempts. “In aggregate, you are able to build a risk profile and flag for the behavior within set parameters before successful outbound communication may occur,” he notes.
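
As a rough illustration of that aggregation idea, the following Python sketch (with made-up signals, baselines, weights and threshold) scores a host by how far several small signals – outbound volume, connection timing, failed logins – deviate from their own recent baselines, then sums the weighted deviations into a single risk figure that an analyst could act on.

```python
from statistics import mean, stdev

def zscore(value, history):
    """How many standard deviations a new observation sits from its own baseline."""
    if len(history) < 2:
        return 0.0
    sigma = stdev(history)
    return 0.0 if sigma == 0 else (value - mean(history)) / sigma

def risk_score(observations, weights):
    """Weighted aggregate of per-signal deviations into a single risk figure."""
    return sum(weights[name] * abs(zscore(value, history))
               for name, (value, history) in observations.items())

# Hypothetical host profile: outbound bytes per hour, seconds between outbound
# connections, and failed logins per day, each compared against its recent baseline.
obs = {
    "outbound_bytes":  (9.5e6, [1.2e6, 1.1e6, 1.4e6, 1.3e6]),
    "beacon_interval": (60.0,  [3600.0, 3500.0, 3700.0, 3650.0]),
    "failed_logins":   (14.0,  [0.0, 1.0, 0.0, 2.0]),
}
weights = {"outbound_bytes": 0.5, "beacon_interval": 0.3, "failed_logins": 0.2}

score = risk_score(obs, weights)
if score > 5.0:  # arbitrary review threshold for this sketch
    print(f"Flag host for analyst review, aggregate risk score = {score:.1f}")
```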

While the outside threat is paramount, insiders still represent a huge problem and can be merrily exfiltrating data without detection while IT is focusing its energy on malware and APTs.

The “people” problem can be thought of in two ways, says Tran. People can be one of an organization's best lines of defense, acting as a force multiplier (human intrusion detection). With the proper end-user security awareness training, they can spot and report suspicious activity in real time before any wires are tripped. On the other hand, they can be a serious risk, prone to social engineering attacks, poor IT hygiene or actual insider threats.

“Protecting data and systems from unauthorized access while, in parallel, making the right systems available to authorized personnel is the main objective of an effective cyber defense practice,” he explains. Simple passwords and basic data protection methods are becoming less effective, so technologies such as multifactor or adaptive authentication, biometrics, out-of-band PINs and even voice callbacks are being used as external threat triage and countermeasures. “This is a risk-based approach to prevention of the people problem by aligning the right technology instrumentation, policy and process,” he says.
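
One common building block behind such multifactor and out-of-band schemes is the time-based one-time password. The short standard-library Python sketch below computes an RFC 6238-style TOTP code; the shared secret shown is purely illustrative, and a real deployment would provision secrets per user and verify codes server-side with clock-drift tolerance.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6):
    """Time-based one-time password in the style of RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # 30-second time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Purely illustrative shared secret (base32); real secrets are provisioned per user.
print(totp("JBSWY3DPEHPK3PXP"))
```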

However, while technical solutions that block the transfer of data outside the organization and monitor network activity can be helpful, says Globalscape's Bindseil, they depend on predefined policies about which type of information needs to remain internal. “This kind of solution requires a complete knowledge of the information that is classified as opposed to what is publicly consumable,” he says.

In fact, security starts with knowing what your critical data assets are, says Randy Trzeciak, technical manager of the CERT Insider Threat Center at the Carnegie Mellon Software Engineering Institute. “If you don't know what they are and who has access, then it is hard to either detect or protect,” he says.

Thus, he notes, a solution – whether the threat is internal or external – starts with an organization implementing tools and configuring them for its environment, though manual tagging remains central to building an inventory of data assets. With an inventory in place, and with tools such as DLP focused on data movement within the organization or to the outside, security pros can begin to understand what is suspicious or anomalous, he says.
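
A toy Python sketch of that starting point might look like the following: a manually tagged inventory of assets, plus a check that flags movement of non-public data to an external destination or by a non-owner. All names, tags and the internal domain suffix are hypothetical; the point is only that detection presupposes knowing what the assets are and who may move them.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    classification: str                          # e.g. "public", "internal", "restricted"
    owners: set = field(default_factory=set)     # groups allowed to move or share it

# Hypothetical, manually tagged inventory; in practice it is built and maintained
# with the business units that own the data.
INVENTORY = {
    "customer_db_export.csv": DataAsset("customer_db_export.csv", "restricted", {"dba-team"}),
    "press_release.docx":     DataAsset("press_release.docx", "public", {"comms"}),
}

def movement_is_suspicious(filename, actor, destination):
    """Flag transfers of tagged, non-public assets that leave the organization or come from a non-owner."""
    asset = INVENTORY.get(filename)
    if asset is None:
        return False                                       # untagged assets fall to other controls
    external = not destination.endswith(".corp.example")   # hypothetical internal domain
    return asset.classification != "public" and (external or actor not in asset.owners)

print(movement_is_suspicious("customer_db_export.csv", "jsmith", "dropbox.com"))  # True
```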

Other experts warn that it is an illusion to believe that if an organization buys a DLP tool it will suffer no data loss as a result. “That has finally fizzled from most minds,” says Anton Chuvakin, a research vice president in the security and risk management division at information technology research and advisory firm Gartner. 

“Lately, I have spoken to people who claim that DLP cannot work at all against advanced attackers – like APT – exfiltrating stolen data,” he says. “I don't think that is true, as I am aware of examples where a DLP tool was useful for detecting such data theft by an outside party.” However, as more advanced attackers focus on data theft, DLP has to become smarter, or other tools (like those focused on network forensics and deeper analytics) may step in to fill the gap. As a result, he explains, DLP is one of the tools that organizations can use to discover where sensitive data is, monitor and occasionally block leaks, and detect when users handle the data in a risky manner. In other words, DLP is one of the data security tools, but having DLP definitely does not equate to automatically having data security.

Furthermore, he notes, attempting to make a more robust wall against exfiltration won't work either. “You are going to need to have a system of controls on data storage, data movement and data usage, coupled with robust processes and, of course, with skilled personnel,” he explains. In fact, one of the best practices for success is close involvement of business unit personnel and data owners (for all of policy definition, data classification and alert response). “Some consider this to be a foundational ingredient without which the entire DLP deployment will fail,” he says.

In other words, says Chuvakin, DLP products help protect data, not infrastructure. Thus, business unit and data owner involvement is an order of magnitude more critical than in other IT security projects. “Ignore it at your own peril,” he says.

Although DLP has been overhyped, it can be an important component, agrees John Pescatore, director of emerging trends at the SANS Institute. However, he thinks it is important to aim even higher – for encryption, which is perhaps the ultimate solution.

“Encryption is hard to do because it limits the free movement of data – someone on each end needs the key,” he says. “But since it may have to become more widespread, companies should consider piloting it now as a model for future control of exfiltration.”

For instance, keeping the focus on the most critical data, encryption could be used to secure sensitive communication among board members. The lessons learned from that experience could then gradually be applied more broadly, he says.
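
As a sketch of what such a pilot could build on, the snippet below uses the third-party Python cryptography package's Fernet recipe for symmetric, authenticated encryption. The hard part Pescatore points to – getting the key to each authorized recipient and nowhere else – is deliberately left out; the example only shows that the encrypt/decrypt mechanics themselves are simple.

```python
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

# In a real pilot the key would be generated once and handed securely to each
# board member; generating it inline here is only for illustration.
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"Board pre-read: draft terms, directors only")  # ciphertext, safe to transmit
print(f.decrypt(token).decode())                                   # only key holders can read it
```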

“Stopping data exfiltration is like insulating your home,” Pescatore says. “There are hundreds of places where heat can leak – on top of which there are times when someone accidentally leaves the door open. So as you aim for a solution, you need to keep your eye on each of the potential leakage points.”
