Cybersecurity teams spend the majority of their time trying to figure out exactly what is happening on their networks. They would be better served focusing on how often things happen on their networks.

That’s not to discount the value of being able to point at a system and figure out exactly “what is going on” – but an emerging best practice, frequency analysis, reframes the question as how often things happen.

Frequency analysis involves an automated network assessment to determine which user processes are run by the many and which are run by the few. Why is this important? Because adversaries are more likely to compromise an enterprise and explore within by staying low-key, limiting themselves to a select few internal machines.

Sure, there will be massive assaults like the Code Red worm of days past. But, let’s face it: if you’re hit by something that big, you’ve probably already read about it in the news, and can immediately evaluate whether you are, indeed, among its victims.

Frequency analysis attempts to hunt down the attackers who prefer a subtle approach. It all comes down to simple numbers: you run an automated scan of processes and software installations, among other things, and find out which ones are present within a very small percentage of computers – maybe three, five or 10 percent. Then, you ask yourself how many legitimate processes exist within the network on such a small scale. Aren’t most programs deployed enterprise-wide or at least department-wide? Yes, most are (though obviously some exceptions exist). Therefore, it is more likely that the small-scale programs could have been introduced by a threat, thus demanding a closer look.
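The numbers game described above can be sketched in a few lines. This is a minimal illustration, not a real collection tool: the host inventory, process names and the 5 percent cutoff are all hypothetical stand-ins for whatever your endpoint agent actually gathers.

```python
from collections import Counter

# Hypothetical inventory: the set of process names seen on each host.
# 100 hosts all run the standard corporate software...
inventory = {f"host-{i:03d}": {"explorer.exe", "outlook.exe", "chrome.exe"}
             for i in range(100)}
# ...but a handful also run something nobody else has.
for host in ("host-001", "host-042", "host-077"):
    inventory[host].add("evil.exe")

RARE_THRESHOLD = 0.05  # flag anything present on fewer than 5% of hosts

def rare_processes(inventory, threshold=RARE_THRESHOLD):
    """Return {process: prevalence} for processes below the threshold."""
    counts = Counter()
    for processes in inventory.values():
        counts.update(processes)  # each process counted once per host
    total_hosts = len(inventory)
    return {name: count / total_hosts
            for name, count in counts.items()
            if count / total_hosts < threshold}
```

Here `rare_processes(inventory)` flags only `evil.exe` (prevalence 0.03), while the enterprise-wide programs at 100 percent prevalence fall out of view entirely.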

This practice promises to save security teams vast amounts of time. Currently, they spend too many hours checking computers on an individual basis, chasing log files and anti-virus warnings. Think about it: if there are 5,000 employees/users, is any business honestly going to hire hundreds of IT pros to ensure every machine is clean once a week? That would be an absurd allocation of the tech budget – a huge price to pay to ensure network security when the investment could further IT innovation instead.

We realize traditional tools aren’t up to the job anymore. We’ve gone through the same routine over and over. You run one anti-virus product on your network, or you use websites that scan potentially malicious files against 50 different anti-virus products and tell you whether anything is fishy within your systems. But the bad guys are using these products too, so they create a file which none of the products have flagged. Then, the file is introduced into the network, steals data, shuts down operations and otherwise inflicts damage, and the anti-virus companies play “catch up.” They produce a matching signature and declare that their customers are “cured” of the new malware. Then the bad guys come up with something else – rinse and repeat – in an endless arms race.

With frequency analysis, you’re not running checks of 5,000 computers a week in sequence. You’re looking at the process lists (and other data points) for all 5,000 computers simultaneously, then seeing which entries are present in, for example, 4,900 of those machines and which ones are limited to just a few.

What happens if, while doing this, you find a program called “Evil.Exe” running on just three computers? The next step is examining how it interacts with the network, and whether there are any strange, external exchanges involving the program. You want to track down which users have the program on their machines and how it got there. If it’s the head of R&D, an IT administrator and the CEO’s secretary, then you certainly should conduct a thorough assessment of what’s going on.
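The follow-up step – mapping a flagged name back to the machines and people who have it – is equally simple. A minimal sketch, assuming the same kind of per-host inventory as before plus a host-to-owner mapping; all names here are illustrative:

```python
# Hypothetical data: which processes run where, and who owns each host.
inventory = {
    "host-01": {"explorer.exe", "evil.exe"},
    "host-02": {"explorer.exe"},
    "host-03": {"explorer.exe", "evil.exe"},
}
owners = {"host-01": "head of R&D", "host-02": "sales rep", "host-03": "IT admin"}

def who_runs(inventory, owners, process_name):
    """Map a flagged process back to the hosts and users that have it."""
    return {host: owners.get(host, "unknown")
            for host, procs in inventory.items()
            if process_name in procs}
```

Calling `who_runs(inventory, owners, "evil.exe")` tells you at a glance that the program sits with the head of R&D and an IT admin – exactly the kind of pattern that warrants a closer look.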

So let’s say Evil.Exe turns out to be completely legit. Did the security team waste its time? Hardly. Because now you know what Evil.Exe actually is. You understand that Evil really isn’t, well, evil. At least not yet. When it comes up again, you can probably ignore it, especially if it hasn’t spread to other users and its general activity level is pretty much the same. This frees up the team to look for new programs which have entered a select few of the users’ systems.

Frequency analysis doesn’t pretend to be some kind of “secret sauce.” It’s just simple math. Nevertheless, organizations need to move in this direction, to build security practices that are as efficient as they are protective, especially for large networks.

Otherwise, you’re back to buying a long line of anti-virus products which are outdated as soon as a single adversary acquires and runs them to create an unknown piece of malware. And you’re attempting to check hundreds of machines, one-by-one. Wouldn’t you rather cut to the chase? If so, that’s where frequency analysis will help you.