The IT landscape has evolved over the past two decades. Where it is headed is anyone’s guess, reports Chuck Miller.
In information security, the scale of what happened two decades ago seems almost quaint compared to the current state of affairs. Twenty years ago, the CERT Coordination Center logged 132 incidents of attacks against internet-connected systems. Nearly a decade later, Microsoft Windows 98 entered the marketplace and, some would argue, marked the beginning of security as a mainstream concern. From then on, hundreds of patches were released in response to problems in Windows.
For some time, however, much of information security was dismissed as FUD (fear, uncertainty and doubt). Recently, that has changed. In fact, at least in some quarters, information security is now viewed as a business enabler, a fundamental part of an organization’s operations. For example, in the past, few organizations had dedicated information security officers. Now, the role is commonplace – even if some of the personnel in these positions don’t operate at the highest organizational levels. Managers have come to realize that, just as organizations need a web presence and email, they also need security.
Yet even as the profession has grown, the outlook is often seen as dismal. “Looking at things right now, I have to tell you that it is not a very pretty picture,” says Paul Ferguson, senior threat researcher at Trend Micro. “We have the rise of a professional, organized criminal element that is operating brazenly in the open, without any fear of retribution. People do not trust the internet because criminals have been allowed to run wild.”
Another recent area of concern is the growth of Web 2.0 technologies, along with social networking. “Attacks on social media have not abated,” says Ryan Barnett, director of application security at Breach Security. “That is likely to continue and grow. It’s a target-rich environment for hackers.”
In the future, websites are likely to be not just targets for attack and data theft, but platforms from which to launch attacks on end-users, Barnett adds.
Also, the security problem stems in part from opportunities for mischief at fundamental levels, which may only increase in scale. “Software complexity will grow,” says Paul Kocher, president and chief scientist at Cryptography Research, defining complexity as a measure of lines of code. “And security will decrease as a consequence.”
There are many factors to explain how we’ve gotten to where we are. One that is repeatedly raised is the lack of international efforts to stem the malware tide. For instance, the majority of financial attacks come from Eastern Europe, according to Ferguson.
In addition, he points to a serious lack of cooperation from various stakeholders in the internet hierarchy. Some domain registrars are willing accomplices, enabling criminals to abuse the domain registration system. Criminals register hundreds of domain names in China every day, Ferguson says, and any abuse complaint sent to a registrar there typically draws a bureaucratic response that might as well vanish into a black hole. It is almost impossible to get such registrars to suspend a domain – and the cybercriminals know this.
“Traditional technologies no longer apply to face the growing complexities of organized crime,” says Anton Zajac, CEO of ESET. “Major epidemics of proof-of-concept worms have been replaced by socially engineered malware launched by large botnets.”
Where do we go from here?
Many security dimensions need improvement. For example, experts say that anti-virus (AV) companies should evolve away from just being AV detection and remediation bureaus. Considering the sheer volume of malware, it’s impractical to rely on anti-virus alone.
“The future lies in developing a distributed layer architecture that consists of a variety of different technologies,” Ferguson says. “Security companies will be defined by the threats, not the other way around.”
In addition, experts say the future will require a new kind of cooperation, particularly at an international level. The Conficker Working Group is probably the best model. The idea is to have people from industry, academia, law enforcement, regional registries and the banking community – a variety of stakeholders to share intelligence. In other words, there must be a much better public and private collaborative effort in which people are sharing information and intelligence on a much broader scale.
“Much of this is occurring now,” Ferguson says. “But it is unsustainable for small organizations on their own – and they do not make money doing it.”
In fact, he says, there are financial disincentives to providing security at some levels, especially compared to the huge incentives that cybercriminals enjoy. The point is that there are some financial barriers to battling cybercrime that security professionals must face up to and correct in the future. This will take work, though.
“Just because things will get tougher does not mean that you should give up,” says Cryptography Research’s Kocher. “The question is, how do we get more productivity using computers and still manage the risks involved?”
On top of these challenges, there is the element of government involvement and oversight. “Sometimes, the government makes things more difficult, not easier,” says Ferguson. “It’s best if government officials are a participant, but are not the main driver.”
That is especially true because mandates are often limited by physical boundaries. After all, the United States does not own the internet, says Ferguson. What is legal in one country may be illegal in another – and there are cultural issues at play that not enough people are aware of.
What can improve?
One area improving at a fundamental level is operating systems – in particular, the ability to compartmentalize pieces of a system so that when one piece fails, just that application goes down instead of all the data being lost. On a longer time scale, there is hope that, rather than every piece of software having the ability to ruin an entire computer, another approach will prevail, Kocher says.
“Right now, one application running on a PC can read all of the data on the computer,” says Kocher. “That is not a good thing because if that application is forced to send data out over the internet or do something it should not, there is nothing to limit it.”
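The compartmentalization Kocher describes can be illustrated with a minimal sketch. The `Sandbox` class below is hypothetical, not any real OS mechanism: it brokers file access so that each application can only open files under its own directory, so a compromised application cannot read the rest of the machine's data.

```python
import os

class Sandbox:
    """Toy illustration of per-application compartmentalization:
    each app may only open files under its own directory, so a
    compromised app cannot read the rest of the system's data.
    All names here are hypothetical, for illustration only."""

    def __init__(self, app_name, root="/tmp/sandboxes"):
        self.base = os.path.join(root, app_name)

    def open(self, path, mode="r"):
        # Resolve the requested path and refuse anything that
        # escapes the sandbox directory (e.g. via "../").
        full = os.path.realpath(os.path.join(self.base, path))
        if not full.startswith(os.path.realpath(self.base) + os.sep):
            raise PermissionError(f"access outside sandbox denied: {path}")
        return open(full, mode)
```

A request for `../../etc/passwd` would raise `PermissionError`, which is the point of Kocher's argument: the application, not the whole computer, is the unit of trust.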
As well, there is a move for better authentication, in which procedures can be implemented per incident, such as when people start to do things that they do not normally do. The system would be able to impose permissions that match more closely users’ normal behavior, Kocher says.
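One way to picture such behavior-matched permissions is a gate that allows actions a user performs routinely but demands step-up verification for anything unusual. The sketch below is purely illustrative; the class, thresholds, and action names are invented for this example, not taken from any product Kocher describes.

```python
class BehaviorGate:
    """Hypothetical sketch of behavior-matched permissions: actions a
    user performs routinely pass through, while unusual ones trigger
    a step-up check such as re-authentication. Illustrative only."""

    def __init__(self, history_size=50):
        self.history = []            # recent (user, action) events
        self.history_size = history_size

    def record(self, user, action):
        self.history.append((user, action))
        self.history = self.history[-self.history_size:]

    def is_routine(self, user, action, threshold=3):
        # An action counts as "routine" once the user has
        # performed it at least `threshold` times recently.
        return self.history.count((user, action)) >= threshold

    def authorize(self, user, action):
        if self.is_routine(user, action):
            self.record(user, action)
            return "allow"
        # Behavior outside the user's normal pattern:
        # require additional verification before proceeding.
        return "step-up"
```

A user who reads reports every day sails through, while a sudden bulk export of customer data would be held for extra verification, matching permissions to normal behavior as Kocher suggests.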
Still, because of the competitiveness of the technology marketplace, companies continue to bring features and products to market before building security into them.
“Security is always an afterthought or an add-on, and that is the wrong way to do security,” says Ferguson.
Kocher agrees, explaining that it is difficult to make sure that a computer is doing what it is supposed to be doing. And this is a problem that will become tougher.
The future of security must not wait on the criminals. “If people do not work together, the picture will not get better,” Ferguson says.
Cloud computing: Security issues
Future technologies are bound to introduce new security problems. One nascent technology, cloud computing, may be fraught with issues, but mitigation efforts are following hard on the heels of its mushrooming growth. IT research firm Gartner recommends that organizations address the following issues before moving data to the cloud.
Privileged user access. It is imperative to track who has specialized access to data and who is responsible for hiring and managing privileged administrators.
Regulatory compliance. Vendors must be open to permitting external audits and/or security certifications.
Data location. The provider must be prepared to give customers control over where their data is stored.
Data segregation. Encryption must be available at all stages and encryption schemes should be designed and tested by experienced professionals.
Recovery. It should be easy to determine what will happen to data in the case of a disaster; complete data restoration must be available in short order.
Investigative support. It should be possible to readily investigate any inappropriate or illegal activity.
Long-term viability. Data must be preserved even if the company goes out of business, and data should be transferable in appropriate format. – Chuck Miller
Timeline: Mandate influences
Often the future of computing security is shaped by legislative mandates. Here are some examples at the federal level:
1999: Gramm-Leach-Bliley Act (GLBA). This law was enacted to protect consumers’ personal financial information held by financial institutions.
2000: GISRA (Government Information Security Reform Act). The law (later superseded) required federal agencies to perform periodic threat-based risk assessments for systems and data.
2000: E-Sign (Electronic Signatures in Global and National Commerce Act). This law gave electronic signatures the same legal standing as handwritten signatures in interstate and foreign commerce.
2001: USA Patriot Act (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act). Established after 9/11, this law expanded law enforcement agencies’ ability to search telephone and email communications records.
2002: E-Government Act of 2002. This included the Federal Information Security Management Act of 2002, which replaced GISRA, to establish a framework for improving security in federal systems.
2003: CAN-SPAM Act (Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003). This law established a national standard for sending commercial email and charged the Federal Trade Commission with enforcing its provisions. – Chuck Miller