
Orca Security to offer first ChatGPT extension for cloud security

Orca Security is believed to be the first firm to announce an extension that processes security alerts using the ChatGPT artificial intelligence technology. (Photo by Andrea Verdelli/Getty Images)

Israel-based Orca Security announced it was offering a ChatGPT extension to process security alerts and provide users with step-by-step remediation instructions.

In a Friday blog post, Orca Security said customers can use the new GPT extension to remediate findings in a number of ways: via the command line, infrastructure-as-code (such as Terraform or Pulumi), or the console.
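To picture the general pattern, the sketch below shows how a cloud security alert could be passed to OpenAI's chat completions API with a request for remediation steps in a chosen format. This is not Orca's implementation; the alert fields, prompt wording, and model choice are illustrative assumptions.

```python
# Hypothetical sketch: turn a cloud security alert into remediation guidance
# via OpenAI's chat completions API. NOT Orca's actual integration; the alert
# structure and prompt below are illustrative assumptions.
import os
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def remediation_steps(alert: dict, fmt: str = "terraform") -> str:
    """Ask the model for step-by-step remediation in the requested format
    (e.g. 'terraform', 'aws cli commands', or 'console instructions')."""
    prompt = (
        f"A cloud security scanner reported: {alert['title']}.\n"
        f"Affected resource: {alert['resource']}\n"
        f"Details: {alert['details']}\n"
        f"Provide step-by-step remediation as {fmt}."
    )
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [
                {"role": "system",
                 "content": "You are a cloud security remediation assistant."},
                {"role": "user", "content": prompt},
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    # The generated answer is the assistant message in the first choice.
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    alert = {
        "title": "S3 bucket allows public read access",
        "resource": "arn:aws:s3:::example-bucket",
        "details": "Bucket policy grants s3:GetObject to principal '*'.",
    }
    print(remediation_steps(alert, fmt="terraform"))
```

A production integration would presumably feed alert context directly from the vendor's platform and let users choose command-line, infrastructure-as-code, or console output, as the blog post describes.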

Melinda Marks, a senior analyst at TechTarget's Enterprise Strategy Group, said this is the first announcement ESG has seen of a ChatGPT extension or plug-in by a cloud security company. Marks said Orca aims to alert security teams to issues they need to fix to mitigate risk, so it has integrated the security alerts and data from its platform with GPT-3 to generate information on how to remediate those issues.

Marks said the challenge for cloud-native security has been too many alerts and too little time to remediate them: prioritizing alerts and determining the needed remediation actions takes so long that security incidents mount. Orca's move promises to help customers support cloud-native development with faster remediation of security issues, so they can mitigate risk and prevent incidents.

“Vendors are exploring how to use AI and these types of technologies to reduce manual, time-intensive work, which is important when you are trying to best utilize staff and resources when we face a cybersecurity skills gap,” said Marks. “Whereas in the past they may have been reluctant to utilize AI, ML or automation out of fear that it could break something or reduce their control, they are getting more comfortable with it as the technology improves. ChatGPT is useful for SecOps because they are trying to speed their detection and response capabilities, so we can expect to see more of this type of technology used. People and vendors should view ChatGPT as a useful tool to leverage for things like faster research, faster coding, and it can help people focus on higher-value work.”

Frank Dickson, who covers security and trust at IDC, said ChatGPT has certainly excited the imagination of security professionals, allowing them to dream of what's possible.

“Let’s face it: cybersecurity is hard,” Dickson said. “Legacy tools require complex query languages to get value, necessitating a ramp-time for new analysts. The emphasis on the part of security vendors has been on low-code or no-code solutions: make tools easier and enable quick time to value. AI offers the ultimate promise of no-code interfaces to enable even the help desk to analyze security issues and possibly deliver secure outcomes. We have seen cyber miscreants leverage ChatGPT to create better phishing campaigns. We should also use the technology for good.”
