We are our own worst enemy
It is tough being in cybersecurity. Defense is a cost center, and meaningful metrics of success are hard to find. Interest in security is also cyclical: Major breaches stir action, but as time passes, interest and resources wane even though the threat remains. Yet the biggest problem with cybersecurity is ourselves. Before we can succeed, we must all agree to change.

We can start by getting a handle on our language and defining our terms. Just about every adjective applied to malicious activity or code is subjective. There are no widely accepted definitions for what is “advanced,” “difficult,” “sophisticated” or “complex.” Why does security get short shrift? Because it is hard to take people seriously when their words can mean anything and they're so hyperbolic.

Related to our language problem is the desperate need to end the use of war analogies. The stupidity of phrases like “digital Pearl Harbor” doesn't require further elaboration. “Cyber deterrence” would make sense only if there were a meaningful analogy between the lasting impact of nuclear weapons and that of digital ones. “Digital arms control” is such a non-serious idea as to be laughable. Legacy futures make for great newspaper copy and think-tank literature, but proposing solutions for a world that doesn't exist isn't helping the world that actually does.

We desperately need to do more critical thinking. So much cybersecurity analysis is pseudo-scientific, sometimes to the point of being on par with astrology. There is nothing more intellectually lazy than pointing to an IP address as “proof” of a source of evil. It's not that others aren't stealing our ideas and property, but no country has a death grip on every byte that enters or exits the systems within its borders, and no country is so advanced that it has nothing to gain from stealing secrets. Yet in every report about cyberespionage there is a line akin to “all signs point to this being the work of country X” – without any critical analysis. There are 20 “major economies” in the G-20, 31 “high income” member nations of the Organisation for Economic Co-operation and Development (OECD), and 35 “advanced economies” per the International Monetary Fund – all of which could benefit greatly from the intellectual output of American engineers and scientists. But since we're so heavily invested in preparing to fight a conventional war with just two adversaries, that's who we blame.

When presented with the opportunity to discuss cybersecurity problems, we should actively campaign against the use of false authorities. Our world is filled with security celebrities whose Q-scores are disproportionate to the breadth of their actual expertise. When we launch people into space, we seek comments from former astronauts, not glider pilots. Yet no one thinks twice about asking an expert in cryptography what they think about botnets.
One suggestion: When asked about an issue outside of one's area of expertise, offer access to a true expert instead. We need less commentary from the most glib, and more insight from the most knowledgeable.

Finally, and I can't stress this enough, we need to appreciate and promote our history. I have computer security books that were printed in the 1970s. If you didn't know The Cuckoo's Egg [which details a computer hack] took place 25 years ago, you'd think it was documenting events that happened last month. In fact, everything Cliff Stoll did ad hoc – computer network defenses, honeypots, public-private information sharing – is something we're still struggling to get right today. The echoes of history should inform us, not haunt us, if we're to succeed.

Michael Tanji is a former intelligence officer and the CSO at Kyrus.