Price: On-premises or SaaS availability – details from Veriflow
What it does: Assures network correctness, based on mathematical verification of the entire network state using advanced algorithms and a predictive model of network behavior.
What we liked: Unique approach to continuous security verification.
The bottom line: This may be a new application of formal verification, but the technique itself is tried and proven. That makes this one well worth your inspection.
Every now and again we see a very unusual and innovative approach to solving a particular security problem. Occasionally one of those approaches takes us back to studies at university that we thought were pretty cool theoretically, but for which we couldn't see a practical application in securing the enterprise. This is only the second one of those we've seen in recent years – the first used graph theory – and this one uses formal verification.
Formal verification is not particularly new. It has long been a staple of rigorous programming techniques. However, this is the first time we've seen it applied in this manner. Fundamentally, formal verification creates a model of the code and compares it to a reference model that the engineers have created to define the policies the final code must implement. It is used to verify proper operation and to find coding errors. In this product from Veriflow it does all of that and a bit more.
The starting point is the assumption that the network can be treated as a piece of program code. That enables the creation of a policy-directed model. The tool performs continuous formal verification and compares its findings against the benchmark model. By noting variations from that benchmark, Veriflow can predict problems before they occur. By pinpointing vulnerabilities and attempted malicious activity based upon network state changes, Veriflow alerts on issues before there is any impact to the enterprise.
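The idea of checking a network state against a policy model can be illustrated with a toy sketch. Everything below – the predicate names, the device state fields, the policies – is invented for illustration; Veriflow's actual data model and algorithms are proprietary and far more sophisticated.

```python
# Toy sketch: treat network state as "code" and verify it against an
# intent (policy) model. All names and fields here are hypothetical.

def verify(network_state, policies):
    """Return (device, policy) pairs that violate the intent model."""
    violations = []
    for device, state in network_state.items():
        for name, predicate in policies.items():
            if not predicate(state):
                violations.append((device, name))
    return violations

# A toy intent model: management interfaces must not face the internet,
# and telnet must be disabled everywhere.
policies = {
    "no_public_mgmt": lambda s: not s.get("mgmt_public", False),
    "telnet_disabled": lambda s: not s.get("telnet", False),
}

network_state = {
    "core-rtr-1": {"mgmt_public": False, "telnet": False},
    "edge-fw-2":  {"mgmt_public": True,  "telnet": False},
}

print(verify(network_state, policies))  # [('edge-fw-2', 'no_public_mgmt')]
```

The point is the shape of the check, not the check itself: every device state is tested against every policy, and any deviation from the benchmark surfaces as an alert before traffic is affected.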
There are three basic components of the Veriflow system. First, the collector – a virtual machine – captures all enterprise device data and normalizes it based upon the device vendors it finds on the network. It passes those data to the verification engine, where the engine's algorithms analyze them to create the baseline model. Once the mathematical model is created, the verification engine continuously processes network data against it, looking for anomalous state changes. These it reports to the third component, the visualization dashboards, for analyst action.
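The collector's normalization step is worth pausing on: vendor-specific records have to be mapped into one common schema before the verification engine can reason over them. Here is a minimal sketch of that idea; the vendor field names and schema are assumptions of ours, not Veriflow's format.

```python
# Hypothetical sketch of the collector's normalization step: vendor-specific
# route records are mapped into one common schema before verification.

def normalize(vendor, record):
    """Map a vendor-specific route record into a common schema."""
    if vendor == "cisco":
        return {"prefix": record["network"], "next_hop": record["nh"]}
    if vendor == "juniper":
        return {"prefix": record["destination"], "next_hop": record["via"]}
    raise ValueError(f"unknown vendor: {vendor}")

raw = [
    ("cisco",   {"network": "10.0.0.0/8", "nh": "10.1.1.1"}),
    ("juniper", {"destination": "10.0.0.0/8", "via": "10.1.1.2"}),
]
baseline = [normalize(vendor, record) for vendor, record in raw]
print(baseline[0])  # {'prefix': '10.0.0.0/8', 'next_hop': '10.1.1.1'}
```

Once everything speaks the same schema, the engine can build one mathematical model of the whole network regardless of how many vendors are on it.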
One question we had: what happens if the baseline is taken across a system that already has flaws, or an intruder already inside? First, the system is set up to support a broad collection of business objectives – an approach referred to as “intent-based networking.” No network traffic is intercepted; only configuration is analyzed. The configuration equates to the code in a program and is, therefore, subject to formal verification. If there is a misconfiguration in a device, the system recognizes it as violating the intent of the device. For example, is a routing table consistent with the enterprise's intent? Has there been some change – possibly by an intruder – that violates that intent? The result is that huge networks can be analyzed very quickly without resorting to packet capture and analysis.
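The routing table example can be made concrete with a small sketch: an intent says which next hops are legitimate for a prefix, and the check flags any table entry that disagrees – without inspecting a single packet. The addresses, the intent format and the check are illustrative assumptions, not Veriflow's implementation.

```python
# Illustrative only: checking a routing table against an "intent",
# configuration-level analysis with no packet capture.
import ipaddress

def routes_violating_intent(routes, intent):
    """Return route entries whose next hop is not allowed by the intent."""
    violations = []
    for prefix, allowed_next_hops in intent.items():
        net = ipaddress.ip_network(prefix)
        for route_prefix, next_hop in routes:
            if ipaddress.ip_network(route_prefix) == net and \
               next_hop not in allowed_next_hops:
                violations.append((route_prefix, next_hop))
    return violations

routes = [("10.2.0.0/16", "192.0.2.99")]            # possibly changed by an intruder
intent = {"10.2.0.0/16": {"192.0.2.1", "192.0.2.2"}}
print(routes_violating_intent(routes, intent))      # [('10.2.0.0/16', '192.0.2.99')]
```

A flagged entry like this is exactly the kind of state change – a route silently redirected to an unapproved next hop – that an intent-based check can catch from configuration alone.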
We expected the user experience to be very complicated for such a sophisticated tool. It's not. Clearly, the developers have spent a lot of time and effort making the underlying power of Veriflow accessible to analysts who have no need to understand the math behind formal verification. We started with a very high-level picture of the network as seen by Veriflow and started drilling down. The deeper we went the more we learned.
There is a very useful function called the intent dashboard, where we could ask questions and get answers. For example, on a large, widely distributed network we might want to see whether the data flows between sites were what we intended them to be. How does data get from Site A to our ISP? The answer is a broad picture that we can drill into, down to the Path Inspector, which looks like traceroute on steroids.
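Under the hood, a question like "how does data get from Site A to our ISP?" is a reachability query over the network model. A minimal way to picture that is a breadth-first search over a topology graph; the topology below is hand-built for illustration and says nothing about how Veriflow actually computes paths.

```python
# Sketch of a reachability query: find one hop-by-hop path between two
# points in an (invented) topology using breadth-first search.
from collections import deque

def find_path(topology, src, dst):
    """Return one shortest hop-by-hop path from src to dst, or None."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for neighbor in topology.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None

topology = {
    "site-a":  ["core-1"],
    "core-1":  ["core-2", "edge-fw"],
    "core-2":  ["edge-fw"],
    "edge-fw": ["isp"],
}
print(find_path(topology, "site-a", "isp"))
# ['site-a', 'core-1', 'edge-fw', 'isp']
```

The dashboard's drill-down, as we experienced it, works the same way conceptually: start from the broad picture, then follow the computed path hop by hop.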
Overall, we really liked this product. It added a dimension of geeky cool to both the science and its application. With PreFlight Verification you can do “what-if” analysis. Dynamic Diff lets you compare two snapshots across time, and hybrid infrastructures are treated as a single enterprise, requiring no separate analysis of cloud, virtual and physical data centers.
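The idea behind a snapshot comparison like Dynamic Diff can be sketched in a few lines: take two point-in-time device states and report what appeared, disappeared, or changed between them. The snapshot format here is our assumption, purely for illustration.

```python
# Hypothetical sketch of a snapshot diff: compare two point-in-time
# views of device configuration and classify the differences.

def snapshot_diff(before, after):
    """Classify differences between two {device: config} snapshots."""
    added   = {k: after[k] for k in after.keys() - before.keys()}
    removed = {k: before[k] for k in before.keys() - after.keys()}
    changed = {k: (before[k], after[k])
               for k in before.keys() & after.keys() if before[k] != after[k]}
    return {"added": added, "removed": removed, "changed": changed}

monday = {"edge-fw": "acl v12", "core-1": "ospf area 0"}
friday = {"edge-fw": "acl v13", "core-2": "ospf area 0"}
print(snapshot_diff(monday, friday))
```

Here the diff would show edge-fw's ACL changed, core-1 gone, and core-2 new – the sort of time-lapse view that makes an unexpected change jump out at an analyst.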
The website is relatively complete – certainly there is a lot of information there – but we would like to see a support portal available to the outside world. That said, there are lots of white papers and other materials to help you understand the company's approach and intent-based networking.