There’s no question that SaaS has had a profound, positive impact on business. But it’s also created a more diffused perimeter and made attack surfaces a moving target to secure.
Our research team recently found that the typical enterprise has more than 12,000 exposed web application interfaces, with approximately 30% containing at least one exploitable or high-risk vulnerability. Despite growing security tech stacks, most organizations lack visibility into and control over their most sensitive assets.
But why are we still suffering from visibility issues?
Attack surfaces today are larger, more dynamic and more complex than ever. According to our research, the typical attack surface fluctuates by 10% every month. Traditional security solutions find this constant change difficult, if not impossible, to monitor continuously.
The result: most organizations have a large volume of exposed web app interfaces that are unmapped and unprotected. Our recent findings show 70% of exposed web app interfaces lacked even basic protections, such as a web application firewall (WAF) or an encrypted (HTTPS) connection. Furthermore, 74% of assets containing personally identifiable information (PII) are exposed to at least one known major exploit, and one in 10 has at least one easily exploitable issue.
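A first pass at the kind of hygiene check described above can be scripted against an asset inventory. The sketch below is illustrative only; the record fields (`url`, `behind_waf`) are assumptions, not a real inventory schema:

```python
from urllib.parse import urlparse

def basic_exposure_flags(asset):
    """Return a list of basic-hygiene gaps for one inventoried web interface.

    `asset` is assumed to be a dict like:
        {"url": "http://portal.example.com", "behind_waf": False}
    """
    flags = []
    if urlparse(asset["url"]).scheme != "https":
        flags.append("no-https")          # unencrypted connection
    if not asset.get("behind_waf", False):
        flags.append("no-waf")            # no web application firewall
    return flags

inventory = [
    {"url": "http://legacy.example.com/login", "behind_waf": False},
    {"url": "https://shop.example.com", "behind_waf": True},
]
report = {a["url"]: basic_exposure_flags(a) for a in inventory}
```

A check like this only covers interfaces you already know about, which is exactly why discovery comes first.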
Attackers know most web applications are vulnerable and under-tested, and they will seek out the path of least resistance to PII.
The web’s complex evolution
Back when legacy point solutions for web application security were invented, companies such as Apple and Google had just one internet-exposed web application. Today, they have thousands of networks and tens of thousands of web applications exposed to the public internet.
Taking a step back, web applications and interfaces are defined as anything that can talk to a browser. They are found throughout internal as well as external attack surfaces. Examples can include:
- Web applications built outside the safe harbor of controlled environments and outside the CI/CD web development process.
- Web interfaces that belong to subsidiaries, joint ventures, acquisitions and partners.
- Routers and DevOps tools, via management web interfaces.
- Internal applications accidentally exposed to the internet.
Shockingly, an enterprise might have 10,000, 50,000 or even more of these web interfaces exposed to the internet. Many contain serious vulnerabilities introduced early in the development process, such as subtle coding errors and compromised open-source components. Others are misconfigured when deployed, or suffer from gradual configuration drift, leaving them exposed to attack.
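Configuration drift of the kind described above can be surfaced by diffing a deployed interface's settings against a hardened baseline. This is a minimal sketch; the setting names and values are illustrative assumptions:

```python
def config_drift(baseline, observed):
    """Return settings whose observed value departs from the hardened baseline,
    mapped to (expected, actual) pairs."""
    return {
        key: (baseline[key], observed.get(key))
        for key in baseline
        if observed.get(key) != baseline[key]
    }

# Hypothetical hardened baseline for a deployed web interface.
baseline = {"tls_min_version": "1.2", "directory_listing": False, "admin_panel_public": False}
# What a periodic scan actually observes after months of drift.
observed = {"tls_min_version": "1.2", "directory_listing": True, "admin_panel_public": False}

drift = config_drift(baseline, observed)
```

Run periodically, a diff like this catches drift before an attacker does.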
All it takes is a single entry point. That’s why it’s critical to test all web interfaces frequently and thoroughly for security gaps. Until recently, that has not been feasible for most organizations.
The crown jewels are often hidden
Detecting all major vulnerabilities in an organization's web applications across a global attack surface is quite a technical feat. Security gaps are often well hidden, in places legacy tools can't search.
For example, standard endpoints sit in a known IP range, but a web application may be tucked away in a subdomain run by a subsidiary, quietly collecting PII from customers.
With operational technology (OT), a web interface might not sit under a URL at all, but at an arbitrary IP address within a network of up to 65,000 hosts. Some applications are buried deep within web pages, or in unconventional locations in non-traditional architectures.
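To make that scale concrete: a single /16 network, the size alluded to above, spans more than 65,000 addresses, each a possible home for an unlisted web interface. Python's standard `ipaddress` module does the arithmetic (the network address here is an arbitrary example):

```python
import ipaddress

# A /16 block: the kind of range an OT web interface can hide in.
net = ipaddress.ip_network("10.20.0.0/16")

total_addresses = net.num_addresses            # every address in the block
usable_hosts = sum(1 for _ in net.hosts())     # excludes network/broadcast addresses
```

Multiply that by thousands of networks and the case for automated discovery makes itself.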
And beware of the weakest link: the smallest flaw in code logic can accidentally create critical exposures such as SQL injection (SQLi), cross-site scripting (XSS) and data exposure. A company can invest $100 million in endpoint security and firewalls, but that won't protect its web applications.
Vulnerabilities can even arise in a small subsidiary overseas, or simply from content that changes dynamically.
I could go on.
Detailed enumeration (assembling all available information about a system) is a way to find the more elusive web interfaces. The problem: legacy tools don't do this, and attackers know it.

Legacy web application security testing has not been up to the challenge
Traditional application security tools and processes (DAST, pen-testing, endpoint security, firewalls) date from the early 2000s and are now outdated. Among their problems, they:
- Don’t scale for today’s large, complex environments.
- Are difficult to configure properly.
- Take too long to test applications. A single web application may take two to four weeks to test. What if there are thousands?
- Can miss subtle, but dangerous flaws in code logic and software components.
- Miss vulnerabilities in web applications hidden across a global attack surface in locations that legacy tools don’t search.
- Drown AppSec teams in alerts. In actual use, only 1% or 2% of alerts from a standard scanner are true positives. DAST tools cannot validate most potential vulnerabilities, and they do not factor in asset value, context or attackability.
Some security teams think "upgrading" or "retrofitting" these traditional tools will solve the problem. Unfortunately, it's not that simple.
Modern web application security testing redefined
Security teams can break down a plan of action into five phases: discover (map), detect, prioritize, attribute and remediate.
- Discover: Map the attack surface to enable the steps that follow.
- Detect: Conduct comprehensive testing to detect vulnerabilities in web applications.
- Prioritize: Make this step fast and accurate; in other words, automated. Automation is required to validate vulnerabilities, cull false alarms and distill the task list down to a handful of the most urgent true positives. Find the "must-fix" issues by checking the value and reachability of the assets to which a vulnerability could lead. Then determine the type of asset: is it a payment mechanism, a database, a knowledge forum or a draft website? Make high-value assets with direct, easy paths to exploitation the top priority for remediation.
- Attribute: Assign each web interface vulnerability to the owner accountable for it.
- Remediate: Remediation relies on prioritization, so you can focus on protecting the most exposed high-value targets. It may entail negotiating with subsidiaries or partners; here, hard data from your discovery, validation and especially attribution efforts is very useful.
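The prioritization step above can be sketched as a simple scoring pass over validated findings. The weights and field names below are illustrative assumptions, not a product formula:

```python
def priority_score(finding):
    """Rank a finding by asset value times ease of exploitation.

    `finding` is assumed to look like:
        {"asset": "payments-api", "asset_value": 3, "attackability": 3, "validated": True}
    asset_value:   1 (draft site) .. 3 (payment system or PII database)
    attackability: 1 (hard to reach) .. 3 (direct, easy exploit path)
    """
    if not finding.get("validated"):
        return 0  # unvalidated alerts stay off the must-fix list
    return finding["asset_value"] * finding["attackability"]

findings = [
    {"asset": "payments-api", "asset_value": 3, "attackability": 3, "validated": True},
    {"asset": "draft-site",   "asset_value": 1, "attackability": 3, "validated": True},
    {"asset": "forum",        "asset_value": 2, "attackability": 1, "validated": False},
]
must_fix = sorted(findings, key=priority_score, reverse=True)
```

Even this toy version captures the point: a high-value asset with an easy exploit path rises to the top, while unvalidated noise drops out entirely.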
To boil it down: attackers are drawn to points of least resistance and high reward. Web applications are tempting targets because they often store sensitive information and are difficult to discover, test and defend.
Legacy tools and partial testing are neither scalable nor effective enough to handle this massive challenge. In fact, the traditional approach to web application security adds both risk and cost. Remediate threats by prioritizing vulnerabilities based on the context, purpose and value of the assets behind each security gap.
Business continuity and exposure management ultimately hinge on complete discovery of all web interfaces. Organizations need complete, 24x7x365 visibility into web interfaces, with accurate testing and prioritization of vulnerabilities and correct ownership attribution to speed up remediation. Only then can organizations truly protect their attack surfaces.
Rob Gurzeev, chief executive officer, CyCognito