Raising information exchange

The tactics of today's cyber outlaws are constantly changing. To stay a step ahead of security threats, government IT managers must adopt a new way of thinking. What's required is the ability to analyze data as it's being collected, and simultaneously put this data into historical context. The goal is to decide in real time what's actionable and what's not. That's where the emerging capability of "intelligent information exchange" plays an enormous role.

In the past, information has typically been assembled through a well-defined collection, analysis and decision process. This process has been followed regardless of the eventual purpose or benefit to the recipient -- whether that recipient is a first responder, an intelligence community member or a civilian agency. Consequently, the freshness of this information is generally known and can be controlled to some degree.

However, the situation is different for much of the open source information that can be compiled from newspapers, bloggers and the like. Within these sources, the information carries its own temporal basis, and it may contain components that are trustworthy and others that are not. These degrees of veracity may change over time, as other information is collected or becomes available later.

Decision-makers are often put in the position of using available information to make rational decisions about how to act. That information can come from predictable processes as well as from more volatile and unpredictable ones. Its utility must be determined by the user and applied to the decision appropriately. Technology alone can't take the place of higher reasoning in that assessment, but it can assist in making decisions by applying analytical insight for greater perspective and deeper understanding.

Original information, as it arrives for analysis, is human-based. Technology, however, can aid decision-making in terms of the speed with which decisive action can be taken, the breadth of knowledge on which the decision can be based, and the ability to distinguish the forest from the trees.

Part of the problem that analysts of all types encounter is the need to distill waves of information, in a timely fashion, into something that is consumable and usable. Sometimes this may mean that technology needs to speed up the processing. Other times it may mean that technology needs to slow the process down.

Moreover, situations often arise when these analysts find that they can re-apply information they have used before and greatly reduce the overall collection process. What I am suggesting is that technology can apply rules and "learn" from previous situational processing.
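
As a loose illustration of that kind of reuse, the Python sketch below (with hypothetical situation fields) captures a lesson learned from a previous situation as a rule and applies it when a similar situation recurs, shortening the collection step.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Rule:
    """A lesson learned from a previous situation, stored for reuse."""
    triggers: Dict[str, str]     # conditions observed last time
    recommended_action: str      # what, in hindsight, should have been done

class RuleBook:
    def __init__(self) -> None:
        self.rules: List[Rule] = []

    def learn(self, situation: Dict[str, str], action: str) -> None:
        """Capture hindsight from a past situation as a reusable rule."""
        self.rules.append(Rule(triggers=dict(situation), recommended_action=action))

    def suggest(self, situation: Dict[str, str]) -> Optional[str]:
        """If a new situation matches a learned rule, reuse the previously
        validated action instead of starting collection from scratch."""
        for rule in self.rules:
            if all(situation.get(k) == v for k, v in rule.triggers.items()):
                return rule.recommended_action
        return None

# Example (hypothetical fields): reuse a storm lesson when the same conditions recur.
book = RuleBook()
book.learn({"storm_category": "5", "levee_status": "at_risk"}, "evacuate_low_areas")
print(book.suggest({"storm_category": "5", "levee_status": "at_risk", "region": "gulf"}))
```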

For example, in the case of Hurricane Katrina, there was raw information available days in advance -- e.g., weather circumstances and the structural analysis of levees and of water flow patterns. This data was sufficient for an analyst to build a case for taking action in terms of evacuation, protection, public alerts and all of the measures that, in retrospect, should have been pursued. However, the problem was in making the data useful, and then making a decision to use it in a way that would have had an impact.

In most situations, there is a period of time in which the major players can go back and use hindsight effectively to capture lessons learned and apply them to a decision rule set. However, people forget or fail to think about the importance of the information and it gets lost. Below are ways to avoid this pitfall.

The importance of attribution

Attribution and source details can limit the usefulness and timeliness of information. A key issue with Katrina, for example, was weather reporting -- an imprecise science in which many aspects are affected by the source of the information.

In the case of intelligence, defense and civilian agencies, "attribution" details are particularly important. Attribution is a journalistic term that also applies here: it means 'who said it,' 'why they said it' and 'why they meant to say it.' The 'why they meant to say it' part can often be lost. Because information can be manipulated, the attributed source needs to be a strong consideration.

Shortening the loop

The need to shorten the loop between information capture, its analysis and the application of decisive action is an overarching consideration for all analysts.

An important consideration is how technology can derive the data that is relevant to a situation. There are a number of ways it can accomplish this. First, information can be time-stamped at its inception, when it gets used and certainly when it becomes obsolete. Assessing the time benefit of information is critical, as is determining how frequently it is used. If frequency-of-use drops off, that information is probably not as useful as it was before. The time frames between when the information was acted on also become very important. From these signals, the benefit and timeliness of the information -- based on who is using it, when they are using it and what subsequent action is taken -- can be observed and fed into a decision process model.
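
As a minimal sketch of that idea, the Python fragment below attaches inception, use and obsolescence timestamps to a piece of information and decays a frequency-of-use score over time; the class name, fields and half-life weighting are illustrative assumptions, not a prescribed model.

```python
from dataclasses import dataclass, field
from time import time
from typing import List, Optional

@dataclass
class InfoItem:
    """A piece of collected information with its time and usage history."""
    content: str
    created_at: float = field(default_factory=time)      # time-stamped at inception
    used_at: List[float] = field(default_factory=list)   # every time it is acted on
    obsolete_at: Optional[float] = None                   # set when it becomes obsolete

    def record_use(self) -> None:
        self.used_at.append(time())

    def relevance(self, now: Optional[float] = None,
                  half_life: float = 7 * 24 * 3600) -> float:
        """Crude relevance score: frequency of use, decayed by time since last use."""
        now = now or time()
        if self.obsolete_at is not None and now >= self.obsolete_at:
            return 0.0
        if not self.used_at:
            return 0.0
        age_of_last_use = now - max(self.used_at)
        decay = 0.5 ** (age_of_last_use / half_life)      # drops off as use grows stale
        return len(self.used_at) * decay
```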

The role of open source information

An analyst may end up with information culled from people chatting on the web or blogging. Traditionally, a person in search of data would need to look for it directly. Now, they only need to search for data that is highly critical, highly leveraged or specific to the topic, and then go to these open sources for the other information they need. The problem in this scenario is that the reduction and analytic processes become more difficult because there is more information to process.

For example, in the wake of Katrina, several entities discussed placing sensor infrastructures in the dams, bridges and levees -- and that became the problem. With this amount of overlap, they would "over-sensor" the same pieces of infrastructure. They needed to think about leveraging existing information in a targeted way, to produce exactly the analysis needed for appropriate action.

Technology as an enabler

The analysis process for security information needs to start at the time the information is collected. In my company, Cisco, we regularly get firewall hits where hackers "turn the knob on the door" to determine vulnerability to a particular virus or other malicious activity. What we have typically done is warehouse that information and then post-process it. The problem is that the information compounds under the load and becomes difficult to use. The message here is that there is a real need for technology to start processing information at the time it is collected. That information may acquire additional value or fade over time, but it needs to be appropriately processed at the outset. Therefore, there is a need to install technology that can do the analytical work -- categorizing and time-stamping -- as soon as possible once the information is gathered.
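
A hedged illustration of processing at collection time rather than after warehousing: the sketch below (Python, with invented event fields and category markers) time-stamps and categorizes each firewall event as it is ingested, before anything is handed to storage.

```python
from datetime import datetime, timezone

# Illustrative categories and markers; a real deployment would use its own taxonomy.
CATEGORIES = {
    "port_scan": ("SYN", "FIN", "NULL"),
    "web_probe": ("GET /admin", "GET /phpmyadmin"),
}

def categorize(event: dict) -> str:
    """Assign a coarse category to a raw firewall event at ingest time."""
    payload = event.get("payload", "")
    for category, markers in CATEGORIES.items():
        if any(marker in payload for marker in markers):
            return category
    return "uncategorized"

def ingest(event: dict) -> dict:
    """Time-stamp and categorize the event before it is stored or post-processed."""
    event["ingested_at"] = datetime.now(timezone.utc).isoformat()
    event["category"] = categorize(event)
    return event   # only now hand it to the warehouse / analytics store

# Example: a probe against an admin page is tagged the moment it arrives.
print(ingest({"src": "203.0.113.7", "payload": "GET /admin HTTP/1.1"}))
```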

In this process, technology needs to be viewed as an enabler and an expeditor that helps decision-making. Typically, attempts are made to replace with technology the elements of the processes that humans have traditionally used to make decisions. Instead, there is a need to look at those processes cyclically to refine and improve them. Once optimization is complete, there is the need to re-observe and go through the process again. This allows the process to be refined, changed and optimized over time. There will also be times when it's inappropriate to simply replace the process that was there before, regardless of whether it was performed by human or machine.

Marrying applications and network

Today, the application and the network are considered two separate components that run somewhat independently of each other. The user is then faced with an integration process in which one application, whether legacy or designed differently or for a different purpose, has to be bilaterally integrated with another application. The network can carry the information, but it does not perform the specialized work of integration.

This bifurcation is a classic problem, whether it involves an accounts payable system integrating with a payroll system, or a decision-making system and its action-triggers determining that a storm is coming, that the gates should be closed and that an evacuation decision must be made. These applications are typically built independently of one another and have to be put together. Marrying the application and the network recognizes that messaging between these applications must be done -- and this messaging can be standardized.

Therefore, the network can take advantage of the messaging that the application needs to do in the standard format and expedite its routing, both in terms of speed and in terms of work loading. This means that if a particular analytical tool can't handle the information needed for analysis now, the network can redirect the request to other available resources. For example, if a particular computing resource can't handle processing satellite detail of the levee system in New Orleans to look for potential problem areas, the network can move the request to a system that can, and still get back to the same point to make a decision in a timely fashion.
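
To make the routing idea concrete, here is a small sketch under stated assumptions: a standardized message envelope plus a dispatcher that redirects an analysis request to whichever registered resource currently reports capacity. The envelope fields and the has_capacity/process interface are invented for illustration.

```python
import json
from typing import Callable, List

def make_message(task: str, payload: dict) -> str:
    """Wrap an analysis request in a standardized envelope the network can route."""
    return json.dumps({"task": task, "payload": payload, "version": 1})

class Resource:
    """A compute resource that may or may not be able to take the job right now."""
    def __init__(self, name: str, has_capacity: Callable[[], bool],
                 process: Callable[[dict], dict]):
        self.name, self.has_capacity, self.process = name, has_capacity, process

def dispatch(message: str, resources: List[Resource]) -> dict:
    """Send the request to the first resource with capacity; fail if none is free."""
    request = json.loads(message)
    for resource in resources:
        if resource.has_capacity():
            return resource.process(request["payload"])
    raise RuntimeError("no analytical resource currently available")

# Example: the primary imagery analyzer is saturated, so the backup handles the job.
primary = Resource("primary", lambda: False, lambda p: {"by": "primary", **p})
backup = Resource("backup", lambda: True, lambda p: {"by": "backup", **p})
msg = make_message("levee_imagery_analysis", {"region": "New Orleans"})
print(dispatch(msg, [primary, backup]))
```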

This need for application and network infrastructure to collaborate, to address a more secure and more trusted computing environment for information-sharing, is most notable within the Department of Defense (DoD). The DoD is particularly challenged in that it must protect sensitive information from certain individuals and organizations while simultaneously sharing it within a coalition environment. Within this community there are directives that establish compartmentalization of information within the same level of classification. This information-sharing environment requires a policy-based architecture that can enforce rules and privileges by controlling access and protecting data across the entire network.
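
A minimal sketch of such a policy check, assuming invented clearance levels and compartment labels, might look like the following; real directives and enforcement points are far richer than this.

```python
from dataclasses import dataclass, field
from typing import FrozenSet

LEVELS = {"unclassified": 0, "secret": 1, "top_secret": 2}

@dataclass(frozen=True)
class Subject:
    clearance: str
    compartments: FrozenSet[str] = field(default_factory=frozenset)

@dataclass(frozen=True)
class Document:
    classification: str
    compartments: FrozenSet[str] = field(default_factory=frozenset)

def may_access(subject: Subject, doc: Document) -> bool:
    """Allow access only if the clearance dominates the classification AND the
    subject holds every compartment on the document (same level alone is not enough)."""
    level_ok = LEVELS[subject.clearance] >= LEVELS[doc.classification]
    compartments_ok = doc.compartments <= subject.compartments
    return level_ok and compartments_ok

# Example: two analysts at the same classification level, different compartments.
doc = Document("secret", frozenset({"coalition_x"}))
print(may_access(Subject("secret", frozenset({"coalition_x"})), doc))  # True
print(may_access(Subject("secret", frozenset({"coalition_y"})), doc))  # False
```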

Real time, all the time

Responding to real-time information in real time is analogous to addressing numerous "what if" scenarios. Unfortunately, the current approach is that if a question could be asked, the information must be pre-positioned in a way that allows that question to be answered. This gets into the business of trying to predict action, whether it takes the form of a database query system or scanning channels on satellite TV.

The point is quickly reached where attempts are made to provide programming or information in a format and manner that can respond to a particular question in real time. This is the kind of process that needs to be discouraged, since there is also the point where information will be needed in ways that can't be predicted. This creates the need for a process that positions and characterizes information appropriately and allows it to be tapped.

This is not fundamentally different from on-demand entertainment services currently offered by some cable and satellite systems. These services do not necessarily try to predict exactly what the viewer wants to see. Instead, they try to make things available and then provide viewer access on demand. This is how technology can be a benefit in security. It can allow information to be extracted via a data tape, a portal, etc., without having to know the pre-defined question or having characterized it in any particular way.
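
By way of illustration only, the fragment below sketches that on-demand posture in Python: items are characterized with tags when they are positioned, and any later question is expressed as an ad-hoc filter at read time rather than a pre-built answer. The tag names are invented.

```python
from typing import Dict, List

class InfoStore:
    """Characterize items when they are positioned; answer ad-hoc queries later."""
    def __init__(self) -> None:
        self._items: List[Dict[str, str]] = []

    def position(self, content: str, **tags: str) -> None:
        """Store the item with descriptive tags; no question is assumed yet."""
        self._items.append({"content": content, **tags})

    def tap(self, **criteria: str) -> List[str]:
        """Answer a question that was not known in advance, via tag filters."""
        return [item["content"] for item in self._items
                if all(item.get(k) == v for k, v in criteria.items())]

# Example: the question about levee imagery arrives long after the data was stored.
store = InfoStore()
store.position("satellite pass 0412", region="new_orleans", kind="imagery")
store.position("gauge reading 17ft", region="new_orleans", kind="sensor")
print(store.tap(region="new_orleans", kind="imagery"))
```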

Ensuring against vulnerabilities

Once information is extracted and decision-making has begun, a decision history or record builds up over time. This creates a situation in which one might begin to ask questions like: "What if someone comes behind me and sees what I am doing?" For example, within the financial community, an analyst from one company conducting research on a particular market segment might not want competitors knowing about this research. The question becomes how to obfuscate the fact that the research is being done.

If the analyst visits a website, his or her interests can be tracked unless there is protection against spyware. (The purpose of spyware is to determine a user's interests. Something else can then be offered that might be of use to them, but the perpetrators also gain insight into those interests.) Whether for privacy reasons or, at the very least, for competitive reasons, no one wants others to know what they are looking at.

Gregory N. Akers is senior vice president and chief technology officer, Cisco Systems.

PARADIGM SHIFT: A road map

Below is a three-step process for forging "intelligent information exchange" systems that facilitate better and faster decision-making:

1) Develop a risk model

This is a fairly straightforward notion; it is the implementation that is difficult. The availability of information, whether it is open source or at the highest levels of government security, is put at risk when it is exchanged between one party and another. The only information that is not at risk is that which is not exchanged. At some point, when a person decides to share and exchange information, they also make a conscious or unconscious decision that information they had previously protected will now be shared or exchanged with someone who may put it at risk in ways not intended.

Therefore, at each step along the way of an information exchange, a risk quotient should be applied. This quotient can be mathematically based or not. Regardless, it must weigh the potential losses from any security breach (in terms of dollars, lives, etc.) against what it will cost to protect the information. By assessing these risk and protection quotients, equilibrium can be reached. When an exchange passes beyond the point of acceptable risk, acceptable disclosure or acceptable cost to protect, proceeding must be a conscious decision. Technology can be used to derive equilibrium among the various aspects of risk, disclosure and cost to protect.
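
One simple, hypothetical way to express that equilibrium numerically is to compare the expected loss from a breach against the cost of protection and flag the exchange when the ratio crosses a chosen threshold; the weights and threshold below are placeholders, not a prescribed formula.

```python
def risk_quotient(breach_probability: float, loss_if_breached: float,
                  cost_to_protect: float) -> float:
    """Expected loss from sharing divided by the cost of protecting the data.
    Values above 1.0 mean protection is cheaper than the expected loss."""
    expected_loss = breach_probability * loss_if_breached
    return expected_loss / max(cost_to_protect, 1e-9)

def exchange_decision(breach_probability: float, loss_if_breached: float,
                      cost_to_protect: float, threshold: float = 1.0) -> str:
    """Turn the quotient into a conscious go/no-go recommendation."""
    q = risk_quotient(breach_probability, loss_if_breached, cost_to_protect)
    if q > threshold:
        return f"protect before sharing (quotient {q:.2f} exceeds {threshold})"
    return f"share at acceptable risk (quotient {q:.2f})"

# Example (placeholder numbers): a 5% breach chance on a $2M loss vs. $50K of protection.
print(exchange_decision(0.05, 2_000_000, 50_000))
```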

2) Move the analytical process up

It's crucial to push the analytical process as far up toward the gathering process as possible. The dissemination of information flowing from that analysis must be technology-enabled for speedy, experiential decision-making. Consideration must also be given to acquiring and using analytical computing resources that project as far as possible into the future. Available computing and analytical capabilities need to be viewed on a continuum, rather than only for specific tasks.

In this regard, the value of the public-private partnership can't be overstated. Open source information speaks directly to this need. Public-private partnerships already exist for information exchange, but there is also an opportunity to extend this model into the realm of security. The government is the user, and private industry is the technology provider. Companies are the enablers that build systems, create technology and help users do their jobs.

3) Leverage advanced research

Innovation must be valued, nurtured and applied. Whether it derives from academia, the government or corporations, advanced research and development must be driven by the needs of the user. It's up to the government user to identify the truly difficult problems. It can be a current problem or one that is over the horizon and not addressed by technology. Industry then needs to inform government about the relevant shortfalls in private sector technology, and how to remedy them.

Problems must be viewed as part of an overarching research agenda. At the end of the day, no one will get better at doing their job or at helping to create the next type of security measure unless all available resources are utilized in concert.

These three steps aren't the complete answer. They're only a start. They require continual review and refinement. The risk model must be regularly assessed according to current conditions and appropriately enhanced. New and better ways must be found to push the analytical process even further upstream.

The entrenched model for information analysis is no longer appropriate for today's increasingly clever cyberthreat. This model must be reconfigured to quickly provide decision-makers with the information they need. Instead of "gather, store and analyze," we need to "gather, analyze and store."
