Our aim is to engage with vendors as closely as possible to the way an actual customer would. If a free trial is offered, we take it. If it is necessary to engage with sales first and request an account, we use the contact options provided on the website and wait for a reply, even if we already have contacts at the vendor.
However, while we engage like a prospective customer would, we make no attempt to hide our identity or intentions at any point. We use real names and identify ourselves as employees of CyberRisk Alliance. We are clear from the very beginning that we intend to perform product reviews and publish the results publicly. No compensation is requested or accepted for any of our reviews.
CyberRisk Alliance monetizes product reviews by licensing them for redistribution after they have been published (commonly known as “reprint rights”). We recognize that positive reviews are more likely to sell reprints. However, we believe enough vendors are interested in an honest, independent, and unbiased review that we don’t have to worry about making everyone happy. With that said, our reviews will be as polite and fair as we can make them.
We try to establish testing methodologies and share them with vendors before testing begins. However, it isn’t always possible to share methodologies in advance for new product categories: we first need to spend time with the full range of products to understand the bounds of the category and how to measure performance. On the topic of performance, our reviews intentionally highlight product features and the customer experience over technical performance. We believe that technical performance, while important, shouldn’t be the focus at the expense of other product attributes.
Finally, vendors are given an opportunity to review drafts before publication. The purpose of this is to ensure the content of our reviews is factually correct, fair and doesn’t include any information protected under NDA. We are clear to vendors that this is not an opportunity to insert marketing copy or rewrite our reviews. Any attempt to do so is ignored.
Attack Surface Management testing
Most attack surface management products require very little input to start the process. We provided each ASM vendor with seven domain names and asked that they create the account and kick off discovery exactly as they would for any other customer. If the vendor normally initiates the process, we had them do it. If they held a POC kickoff briefing, we attended it. Where hands-off free trials were available, we handled as much as possible ourselves, engaging support only when needed to address an issue.
As a basis of comparison, we used the community edition of Maltego and a few other common OSINT tools to create a baseline for these seven domains, much as an offensive security consultant might do during an OSINT assessment. We spent approximately two hours manually gathering OSINT data with these tools.
Due to this being a new category we knew little about, we focused our time on understanding the market, how to categorize it and exploring each product’s set of features. While we see opportunities for some performance testing, the results would be difficult to compare in a meaningful way. This is due to the lack of feature parity across vendors, which is unsurprising, given the relatively young age of the market.
Instead, our reviews will contain less testing and more explanation of how these products work and how they compare with one another. We do have some testing metrics that are generally universal, and we try to apply them to all products we review. You can read more about those below.
For all product tests, it is necessary to define a tangible “value” in order to derive some of the metrics we use to evaluate products. Ideally (for us), value would be defined the same way for each product within a particular category. However, many products have unique features and key differentiators that may result in a definition of “value” that differs from their competitors’.
The value of ASM products is derived from a variety of sources, due to the variety of use cases:
1. Provide a comprehensive inventory of publicly accessible assets
2. Evaluate the risk represented by these assets, noting issues that should be addressed
3. Prioritize any issues discovered
4. Continuously monitor these assets, reporting any changes or new assets discovered
5. Perform items 1-4 with as little input from operators as possible (put another way, value can be measured as analyst time saved)
Time-to-value is a metric that describes the amount of time it generally takes to get a product from zero to fully deployed and producing value. The clock for this metric begins when the vendor provides access to the product (e.g. an account to a SaaS product or license key + software download).
Labor-to-value is a metric that expresses the effort necessary to keep the product at a level of performance where it is providing value consistently.
True Cost is a metric that expresses the total cost of a product, including capital expenditures, operational expenditures, and labor costs. It is effectively product cost + initial deployment cost + maintenance costs, where the following labor cost assumptions are used. We’ve listed salaries along with the actual cost of the employee to the employer, based on the US Small Business Administration’s most conservative estimate (1.4x of salary). We calculate hourly rates by dividing the actual cost of the employee by 2,080 hours (52 weeks multiplied by 40-hour work weeks).
- Junior Security Analyst Salary: $50k USD ($70k) – $33.65/hr
- Security Analyst Salary: $75k USD ($105k) – $50.48/hr
- Senior Security Analyst Salary: $100k USD ($140k) – $67.31/hr
As an example, a 1-hour meeting with two senior security analysts and two junior security analysts costs their employer $201.92.
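The labor-rate arithmetic above can be sketched in a few lines of Python. This is an illustrative sketch only; the role names and function names are our own, and the constants (the SBA’s 1.4x employer-cost multiplier and a 2,080-hour work year) come from the assumptions stated above.

```python
# Sketch of the True Cost labor-rate arithmetic described above.
# Assumes a 1.4x employer-cost multiplier (SBA's most conservative
# estimate) and a 2,080-hour work year (52 weeks x 40 hours).

EMPLOYER_COST_MULTIPLIER = 1.4
HOURS_PER_YEAR = 52 * 40  # 2,080 hours

# Base salaries from the list above (role names are our own labels)
SALARIES = {
    "junior_analyst": 50_000,
    "analyst": 75_000,
    "senior_analyst": 100_000,
}

def hourly_rate(salary: float) -> float:
    """True hourly cost to the employer for a given base salary."""
    return round(salary * EMPLOYER_COST_MULTIPLIER / HOURS_PER_YEAR, 2)

def meeting_cost(attendees: dict, hours: float = 1.0) -> float:
    """Total employer cost of a meeting, given a role -> headcount map."""
    per_hour = sum(hourly_rate(SALARIES[role]) * count
                   for role, count in attendees.items())
    return round(per_hour * hours, 2)

# The example from the text: a 1-hour meeting with two senior
# and two junior security analysts.
print(meeting_cost({"senior_analyst": 2, "junior_analyst": 2}))  # 201.92
```

Running this reproduces the hourly rates listed above ($33.65, $50.48, $67.31) and the $201.92 meeting cost from the example.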
Other metrics considered
- Account setup process
- UI/UX navigation
- Time to discover asset information (some products require a day or two, while others return results in real-time from an existing database)
- Accuracy of results
- Usefulness and quality of reporting and dashboards
- Integration options
- API functionality
Below is a list of product reviews conducted under this category. We recommend reading through the Attack Surface Management Overview in its entirety before digging into individual reviews, but knowing which offerings were evaluated may offer helpful perspective as you do so.