
New SEI CERT chief and first-ever federal CISO: old cybersecurity models have ‘been overcome’

Gregory Touhill, former federal chief information security officer and deputy assistant Homeland Security secretary for cybersecurity operations, seen here at a House Foreign Affairs Committee hearing in 2015 in Washington, DC. Touhill was named director of Carnegie Mellon University’s CERT in April. (Photo by Mark Wilson/Getty Images)

On April 21, Gregory Touhill was named as the new director of the CERT team at the Software Engineering Institute (SEI), a non-profit, federally funded research center at Carnegie Mellon University in Pennsylvania that partners with stakeholders in government, industry and academia to study and improve the cybersecurity ecosystem.

Touhill brings a rich and diverse background to the role, having spent years protecting military computer networks as an Air Force brigadier general and later serving as director of the National Cybersecurity and Communications Integration Center at the Department of Homeland Security. He was then appointed as the first-ever U.S. chief information security officer.

SC Media caught up with Touhill this week to learn how he hopes to make an impact in his new role, what issues and projects he plans to prioritize in his first year and how the old cybersecurity models we’ve relied on no longer work.

What attracted you to this role as CERT Director at SEI and does it allow you to address or tackle some larger cybersecurity issues from a different perspective?

Touhill: The Software Engineering Institute and CERT are world leaders in cybersecurity, and if you go back and look at the history and the lineage of the organizations, I’ve been engaged with [them] since their inception.

SEI was created because the Department of Defense recognized that we needed a federally funded research and development center focused just on software, because we have very software-intensive command and control systems and weapons systems. The department was very prescient in recognizing that industry, the economy – all of that – was becoming increasingly reliant on information technology.

In 1988 we had the Morris Worm, if you remember from the history books. I lived it. So we worked to create what was then called the Computer Emergency Response Team, the very first one.

Now we’re just CERT; we’ve grown beyond computer emergency response. Within SEI, we do three big things, not only for DoD, our principal sponsor, but across government and industry.

One, we work to modernize software development and acquisition, because code is fueling society. Two, we work together as a team within the university and across industry with government to work towards attaining autonomous cyber operations and resilience. Being able to actually build and maintain and operate systems that are resilient to attack. You know, being able to take a punch and then keep on going. And then third, we’re trying to realize computational and algorithmic advantage. That basically means we got better code than anybody else and that’s really important to DoD.

But ultimately, what we’re trying to do is to reduce the risks to national security and national prosperity by hardening and strengthening that cyber ecosystem. It’s a world class organization, I’ve worked with them in all my different roles and jobs in the military and civilian government, up at the White House and in industry. What a great honor to be asked to join this team and to be the new CERT Director. I’m absolutely thrilled.

It does seem like more and more, we’re finding that solving or mitigating big cybersecurity problems really tends to require a lot of cooperation and coordination between industry, the government, law enforcement, allies, academics and researchers. Do you think this role positions you well to play a part in some of that coordination?

I think it’s a strength. If you think about where I’ve been and my contribution to the team, yeah I’m an old guy so I’ve been around the block a few times, but I have developed a rich network that can help amplify the great network that our brilliant team has as well.

So working across the military, across government, across industry and across academia is one of the strengths of the Software Engineering Institute and Carnegie Mellon writ large, and as part of the CERT, we have a brand that’s been around in cybersecurity.

You spoke about some of SEI CERT’s strategic goals earlier. It’s still early days in your tenure, but do you have a sense for what some of your top agenda items will be over the next 6-12 months?

We recognize that we’re going to have to modernize software development and acquisition, and that’s a constant quest. We’ve been trying to do that for years and as new technologies come into play, that modernization and optimization is critically important.

When we look at that second goal of attaining autonomous cyber ops and resilience, that’s really kind of a nod towards some of the things our teams are doing with things like artificial intelligence and machine learning – and even quantum – and supporting our customers in government and the military as well as advising those in industry about how to integrate those new and emerging technologies, looking over the horizon and making sure that we are secure by design.

Trying to maintain computational and algorithmic advantage, we want to make sure that not only are we being secure by design, but we want to make sure that the whole ecosystem is properly addressed. That includes the architectures, the computing platforms, the algorithms and the people and the process as well. Cybersecurity is not just about the technology, it’s about people, process and technology, and I don’t think there’s any better place in the world than the Software Engineering Institute and Carnegie Mellon where we fuse it all together to build and support the strongest system of systems.

We’ve seen the speed and cadence of hacking groups increase substantially over the past two years. I’m curious how you evaluate the cybersecurity industry and IT security teams when it comes to matching their technology and process to that increased pace?

That’s a really interesting question. I don’t know if we have time for a fully fulsome discussion on that, but I think there’s a couple of nuggets I could seed.

First of all, we need to change our game plan, because the traditional cybersecurity tactics, techniques and procedures that we’ve used for many years are no longer working the way we need them to. A great example is perimeter defense. We would build our architectures with that perimeter defense model where we’re going to have a firewall and we’re going to deny everything except for those things that we want to let through.
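The deny-by-default perimeter model Touhill describes can be sketched in a few lines. This is an illustrative sketch, not any real firewall's API; the rule format and field names are assumptions made for the example.

```python
# Illustrative sketch of a deny-by-default perimeter: traffic is rejected
# unless it matches an explicit allow rule. Rule fields are hypothetical.

ALLOW_RULES = [
    {"port": 443, "proto": "tcp"},  # HTTPS
    {"port": 53, "proto": "udp"},   # DNS
]

def is_allowed(packet: dict) -> bool:
    """Return True only if the packet matches an explicit allow rule."""
    for rule in ALLOW_RULES:
        if packet["port"] == rule["port"] and packet["proto"] == rule["proto"]:
            return True
    return False  # default posture: deny everything else
```

As Touhill notes, this model breaks down once mobility and VPN tunnels punch holes through that single perimeter, which is part of the case for zero trust.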

And that’s been overcome. That model has been overcome by things like [smartphones] and mobility and the firewalls are very difficult to configure and maintain. We’ve drilled holes in with VPNs, which are…25-year-old technology. So we’ve got to rethink things, and I think the Department of Defense and Department of Homeland Security and [Federal CISO] Chris DeRusha came out and reaffirmed a zero trust strategy, which I’ve been advocating for for the last five years.

But it’s really important that we deliberately change for the better, not change just because. Yes, we need to implement a zero trust strategy, but we also need to be looking as to what’s next. We have new transmission systems; we have 5G and at a certain point we’ll have 6G, so we need to be looking downrange as new technologies come in. We’re already seeing some practical applications of some nascent quantum computing for communications, but we’re seeing a lot of folks make advances in the amount of qubits and processing power with quantum computing. Similarly, artificial intelligence continues to grow rather quickly and that’s a big issue for things like deepfakes and some other things now that are becoming mainstream.

We need to be very, very proactive in taking measures that are going to better protect our data, our processes, the actual technology that underpins it, the supply chains and ultimately the ability to make informed and trusted decisions.

That’s really where we come in helping to harden that cyber ecosystem, and it’s exciting…right now with the traditional models that some people are continuing to use, offense has the upper hand. As we start shifting and leveraging the new models that we are creating up here and identifying those best practices, we hope to provide defense the upper hand in the short and long term future.

We’ve seen a series of pretty damaging software-based supply chain hacks over the past year. A lot of people tend to point a finger at the way we develop software. SEI CERT develops coding standards for different programming languages to bake better security and resilience into the software development process. Can anything be done there to hold developers to a higher standard?

Our researchers have really been at the forefront of security and secure coding procedures, best practices and software reuse. Carnegie Mellon has put out some great research as well as practical advice to help combat some of the same things that were exploited with the SolarWinds breach.

When it comes to looking forward and where we are right now, we have a lot of folks that aren’t necessarily following best practices that we’ve already identified. Execution has always been an issue in every family and every organization, but we’re going to continue to go out there and identify the state of the art and the best practices, looking over the hill at what’s coming, not just what’s in our windscreen.

I think right now, we serve as a rich source of best practices in secure coding. We can help organizations see what’s in their code; we promote concepts like the software bill of materials…in federal contracts so that we have better visibility into the different components and can look at changes in the code base. I think this is going to be magnified as an issue as we look at supply chain risk management, and we’ve already been working on that for years now. So for organizations who want to learn more about how to better secure their software supply chain, we’ve been in that business and we’re working closely with our partners at DoD, the Department of Homeland Security, and across the federal government and with industry partners as well to identify and promote those secure coding standards.
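The kind of visibility a software bill of materials provides can be illustrated with a small sketch that diffs the component inventories of two builds. This is a hypothetical example, not an SEI tool or a real SBOM format; the component names and versions are made up.

```python
# Hypothetical sketch: given a bill of materials for each build (here just
# component name -> version), changes in the dependency set become visible.

def diff_components(old: dict, new: dict) -> dict:
    """Compare two component inventories and report what changed."""
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "changed": sorted(n for n in set(old) & set(new) if old[n] != new[n]),
    }

# Example: one dependency was silently updated, another appeared.
build_1 = {"liba": "1.0", "libb": "2.3"}
build_2 = {"liba": "1.0", "libb": "2.4", "libc": "0.9"}
# diff_components(build_1, build_2)
# → {'added': ['libc'], 'removed': [], 'changed': ['libb']}
```

An unexpected entry in `changed` or `added` between two builds is exactly the sort of signal that was missing in supply-chain compromises like SolarWinds.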

You also talked about the potential for automation. That’s something we’ve seen a lot of advertising around for technologies like endpoint and extended detection and response platforms. Do you see automation technologies as being one of the ways we solve or mitigate some of these problems?

Ultimately what we try to do in our line of work is make things simpler for the users as well as the operators and by users I define that as the end user. I may be on my mobile phone or on my laptop, but the operators are the ones who have to configure it on a server. Ultimately we want to optimize the system, make sure that the system is trusted, it’s reliable, it’s verifiable and auditable. I’d also add affordable.

We’ve seen steps forward in technology that have been incremental, some have been tremendous leapfrogs forward, and we’re going to continue to see that. But when it comes to a lot of those different capabilities [through automation], one of the big concerns that my friends and I have is the fact that everything is reliant on high-quality data coming in, and that’s really where the security teams come in, as we look at DevSecOps. We want to make sure, "does that operate the way it’s supposed to?" And oh by the way, we want to make sure that there’s no data poisoning, that the data is protected from creation to consumption to disposal, through the whole lifecycle of the data. What our research has shown is that it’s critically important to consider that whole lifecycle not only of the system but the data as well. Particularly with AI and machine learning, there’s a good body of research that reinforces the notion of “garbage in, garbage out” and that presents some very special challenges, particularly with highly integrated, complex systems where you’re taking data from all sorts of different sensors and fusing it all together into some sort of decision support system.
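One common way to protect data from creation to consumption, and to detect tampering such as data poisoning, is to attach a keyed MAC when data is produced and verify it before it is used. This is a generic sketch under assumed names, not a technique attributed to SEI or Touhill; key management and distribution are out of scope here.

```python
import hashlib
import hmac

# Hedged sketch: sign data at creation, verify before consumption.
SECRET_KEY = b"example-key"  # illustrative only; use a managed secret in practice

def sign(data: bytes) -> str:
    """Produce an integrity tag for the data at creation time."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """Constant-time check that the data was not altered since signing."""
    return hmac.compare_digest(sign(data), tag)

tag = sign(b"sensor reading: 42.0")
assert verify(b"sensor reading: 42.0", tag)       # intact data passes
assert not verify(b"sensor reading: 99.9", tag)   # altered data fails
```

In a fused, multi-sensor decision support system of the kind described above, a check like this at each consumption point gives downstream components a basis for trusting their inputs.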

As Scotty [from Star Trek] said, the more complex you make it, the easier it is to break it. What we’re finding is that those folks that are manufacturers, those folks who are in research like we are, we’re looking for those best practices that are going to produce the best results. Automation has been moving forward and will…continue to accelerate the capabilities of national security and national prosperity. So that’s why it’s critically important to have teams like ours to go out and make sure that we’re optimizing our investments, that we’re doing things like DevSecOps correctly and that we’re promoting the best practices out there.

Derek B. Johnson

Derek is a senior editor and reporter at SC Media, where he has spent the past three years providing award-winning coverage of cybersecurity news across the public and private sectors. Prior to that, he was a senior reporter covering cybersecurity policy at Federal Computer Week. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.
