
20 years after the 9/11 wake-up call on intel sharing, how far have we come?

U.S. President George W. Bush signs the Homeland Security Appropriations Act of 2004 after being introduced by Secretary of Homeland Security Tom Ridge at the Department of Homeland Security, Oct. 1, 2003, in Washington. The $30 billion spending bill was the first ever for the new department. (Photo by Mark Wilson/Getty Images)

In a 2015 interview, I spoke about the events of 9/11 with Tom Ridge, Pennsylvania's governor at the time of the attacks, who went on to become the first secretary of the Department of Homeland Security in 2003. He recalled the sudden shift that occurred seemingly overnight, from a Cold War strategy of "need to know" to a strategy of "need to share."

 "If you wanted to know what the biggest challenge was at the outset, it was securing relevant information in a timely way," he told me six years ago.

And today, 20 years later, that challenge persists.

That's not to say progress wasn't made. The events of that day served as a clear wake-up call to government about the dangers of operating in silos and the need for better coordination. From a cyber perspective, 9/11 influenced what eventually became the National Strategy to Secure Cyberspace, an effort to encourage and enable information sharing by ensuring it would not compromise the security or integrity of data or networks in the process. That, in turn, was followed by many new iterations of cybersecurity policy, the most recent of which we're seeing right now from the Biden administration. DHS was established, transformed many times over the last two decades, and eventually took on a leadership role in securing federal networks and coordinating with the private sector. And certainly, government came to acknowledge the critical role companies play in supporting its security missions.

And yet, much remains similar. We've been caught flat-footed more than once, not necessarily by terrorists but by China, Russia and cybercriminals who have successfully infiltrated networks to threaten national security. These were not tragedies, but they were technological crises. Stan Soloway, former deputy undersecretary of defense for acquisition reform during the Clinton administration, told me for that same 2015 Federal Times piece that "we've made progress and shined a light, but 10 years out you would hope we would've gone a little further." When I caught up with him today, he called it "sobering to realize how many of the challenges we discussed remain unaddressed."

The question he continues to ask is whether the so-called solution for establishing a baseline of cybersecurity across the public and private sectors, namely the Cybersecurity Maturity Model Certification, is ultimately workable, or whether it could instead "become another barrier between government and the broader tech sector." CMMC was, of course, created to unify cyber implementation across the defense industrial base, but it has faced its share of hurdles. The Department of Defense has remained rather mum on the effort for a number of months.

Said Soloway: “When you think of cyber as both an offensive and defensive capability, and all the tech being commercially developed around it, I think the jury’s still out” on whether the existing policies get it right.

But is such a thing, getting it right, even attainable? Today's cyber struggles are not the same as they were 20 years ago, or even 10 years ago. 9/11 taught us hard lessons about the need to securely exchange information, but it did not necessarily predict adversaries infiltrating the supply chain to gain access to public and private sector networks. Nobody could have predicted in 2001 just how reliant government would become on private sector companies for both technology and intelligence. We did not know then the nature of the threats that would target critical infrastructure, in part because we had not yet defined what constituted critical infrastructure in the first place. Ransomware, for that matter, did not exist, at least in its current form; nor did cryptocurrency, its primary enabler.

So, as the saying goes, much has changed, but much remains the same. And therein lies the omnipresent struggle of technology: advancements continue, and for all their staggering potential, they also introduce risk. And, as Soloway told me, "we don't really have a framework to address that; we've only just started talking about it."

"There's never been a time when you can fall further behind faster," he said. "And we're still playing catch-up."

So, would the advancements made in intelligence sharing and security prevent a terrorist attack of similar magnitude today? I hope so. I like to think so, even. And maybe that in itself is all we can ask for.

Jill Aitoro

Jill Aitoro is senior vice president of content strategy for CyberRisk Alliance. She has more than 20 years of experience editing and reporting on technology, business and policy. Prior to joining CRA, she worked at Sightline Media as editor of Defense News and executive editor of the Business-to-Government Group. She previously worked at Washington Business Journal and Nextgov, covering federal technology, contracting and policy, as well as at CMP Media's VARBusiness and CRN, and Penton Media's iSeries News.
