Hunting out the rogues

Some security themes keep recurring. Gunter Ollmann warns against one of the most common problems.

In the many security assessments I do, I discover the same mistakes being made. One of the most frequent is the 'rogue host': the machine that has been left attached to the network and, for some reason, forgotten about.

This can happen in a variety of ways. Take for instance the recent case of a retailer that had two versions of a business application running within the environment – the 'live' system and the development system. While the development system was not accessible from the internet, and thus largely secure from malicious attackers, it was also difficult for the QA and test team to verify new versions of the software remotely. So they would drag a spare test workstation into the environment, connect directly to the infrastructure and carry out their testing procedures. Because the testing procedures could take many days, and because there were frequent new releases, it was common practice to leave the workstation connected.

The answer was to build a separate, internal-only development mirror of the 'live' infrastructure, with well-documented intra-departmental processes.

There are many reasons to avoid the rogue host.

For a start, if the workstation exists within a secure environment, it is unlikely to be secured to the same or higher security level as the surrounding server hosts. Almost all malicious attackers will focus on the easy kill – thus the workstation is a prime target.

In addition, the workstation's data is almost always valuable, especially if it is a clone or built to a corporate standard. I have found backups of firewall and router policies containing easily decipherable password hashes; mapped file shares to the 'secure' servers; application usernames and passwords; backups of the custom application software, complete with comments.

To protect themselves, organizations must monitor and test for any new additions to the network. You'd be surprised at just how many rogue hosts turn up that aren't supposed to be there.
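The monitoring described above can be sketched as a simple inventory diff: compare what a discovery sweep actually sees on the wire against the documented asset list, and flag anything unexpected. The addresses, the `find_rogue_hosts` helper and the inventory format here are illustrative assumptions; a real deployment would feed this from an asset database and a discovery tool such as an ARP sweep.

```python
# Sketch: flag hosts seen on the network that are absent from the
# approved inventory. Addresses and data sources are hypothetical.

def find_rogue_hosts(approved, discovered):
    """Return addresses seen on the wire but not in the approved list."""
    return sorted(set(discovered) - set(approved))

approved = {"10.0.0.1", "10.0.0.2", "10.0.0.10"}   # documented servers
discovered = ["10.0.0.1", "10.0.0.2", "10.0.0.10",
              "10.0.0.57"]                          # e.g. from an ARP sweep

for host in find_rogue_hosts(approved, discovered):
    print(f"rogue host detected: {host}")
```

Run regularly, a diff like this catches the forgotten test workstation within one sweep rather than months later during an assessment.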

Gunter Ollmann is manager of X-Force Security Assessment Services EMEA for Internet Security Systems, Inc.

Stopping small problems becoming large liabilities

Niels Heinen presents the top three mistakes he sees companies make every day, with some remedial suggestions.

Configuration mistakes

For example: incorrect installation, lack of admin passwords, poorly configured firewall devices.

Products should be studied properly before being deployed in your infrastructure. Once installed, products should be actively monitored to make sure they function properly, and their log files, especially those of firewalls and IDSs, should be reviewed to detect anomalies.
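The log review above can be as simple as counting repeated denies from a single source. The log format, field layout and alert threshold in this sketch are assumptions for illustration; adapt the parsing to whatever your firewall actually emits.

```python
# Sketch: scan firewall log lines for repeated denied connections from
# one source. Log format and threshold are illustrative assumptions.
from collections import Counter

def noisy_sources(log_lines, threshold=3):
    """Return sources with at least `threshold` denied connections."""
    denies = Counter(
        line.split()[-1]              # assumed last field: source address
        for line in log_lines
        if "DENY" in line
    )
    return {src: n for src, n in denies.items() if n >= threshold}

log = [
    "Jan 10 10:01:02 fw DENY tcp 203.0.113.9",
    "Jan 10 10:01:03 fw ALLOW tcp 10.0.0.5",
    "Jan 10 10:01:04 fw DENY tcp 203.0.113.9",
    "Jan 10 10:01:05 fw DENY tcp 203.0.113.9",
]
print(noisy_sources(log))   # {'203.0.113.9': 3}
```

Even a crude counter like this surfaces misconfigurations and probing that would otherwise sit unread in the logs.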

Lack of regular updates

Companies often lack the time, motivation or resources to diligently install software patches and keep their systems up-to-date. Administrators in some cases don't even keep track of software updates.

Companies must keep their systems up-to-date and diligently install security fixes and patches as soon as possible. They should create policies and procedures to put this in place, provide administrators with the necessary resources, and make them accountable for regular maintenance.
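Keeping track of updates lends itself to automation: compare the versions you are running against the latest versions carrying security fixes. The package names, version numbers and the `outdated` helper below are made up for illustration; in practice the lists would come from your package manager and vendor advisories.

```python
# Sketch: report packages whose installed version lags the latest
# security fix. Names and versions are hypothetical examples.

def parse(version):
    """Turn '1.3.27' into a comparable tuple (1, 3, 27)."""
    return tuple(int(part) for part in version.split("."))

def outdated(installed, latest):
    """Yield (name, have, want) for packages behind the latest fix."""
    for name, have in installed.items():
        want = latest.get(name)
        if want and parse(have) < parse(want):
            yield name, have, want

installed = {"openssh": "3.5.1", "apache": "1.3.27"}
latest    = {"openssh": "3.7.1", "apache": "1.3.27"}

for name, have, want in outdated(installed, latest):
    print(f"{name}: {have} installed, {want} available")
```

A scheduled report like this removes the "didn't know a patch existed" excuse and gives administrators something concrete to be accountable for.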

Insecure web applications

Dynamic websites are often written in powerful programming languages, but with little attention to security. Web applications often implement weak authentication schemes or may disclose sensitive information to a hacker.

Web applications should always be audited before they are placed on production servers, and monitored afterwards to detect malicious activity. The engines that run them should be given a hardened configuration to counter hackers.
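One weak-authentication pattern an audit should catch is storing plaintext passwords or comparing secrets with a plain equality test. A minimal sketch of the stronger alternative, using only Python's standard library (the function names here are my own, not from any framework):

```python
# Sketch: store salted password hashes and compare them in constant
# time, instead of keeping plaintext passwords or using `==` on secrets.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; a fresh random salt is made if none given."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)

salt, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, stored))   # True
print(verify_password("guess", salt, stored))    # False
```

With this scheme a compromised database yields only salted hashes, and the constant-time comparison avoids leaking information through response timing.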

Niels Heinen is security intelligence lab manager with Ubizen.

