Digital forensics is just one of many fields that have posed the question, “Is there a tool that provides universal functionality?” There are multipurpose tools that attempt to do just that, but there will always be situations that cannot be handled by a single tool.
The analysis of petabytes of data in real time and the examination of mobile devices are just two examples of unforeseen developments to which digital forensics has had to adapt. Though a single tool that applied itself to any situation would no doubt be convenient, it could not possibly account for these kinds of unforeseen advancements. Furthermore, multipurpose products can burden users with many tools they may never have cause to use, since different fields of digital forensics place a vast range of requirements on the tools in play.
For instance, law enforcement places an emphasis on tools being able to produce consistent results rapidly without altering the evidence. Network administrators require tools that allow them to detect possible indicators of compromise on their networks and analyze their systems accordingly, without compromising the utility or integrity of their production environment.
This demonstrates two truths about digital forensic tools: first, that all professionals require solutions that consistently provide actionable data and, second, that no single tool can provide all of the required functionality. So, until such an offering exists, forensic toolkits will consist of a multitude of tools that professionals find applicable to their respective fields.
This review assists professionals in developing these toolkits by providing a look at the strengths and weaknesses of a number of essential products. A select few have become industry standards, such as those that offer disk analysis and network surveillance. The remaining products are specialized tools for mobile devices, data collection, peer-to-peer analysis, live system analysis, advanced searching, analysis/visualization of large data sets and remote forensics.
These products were reviewed on factors ranging from ease of use and performance to support features and documentation. As usual, the products in this Group Test were surveyed independently of one another and were judged solely on individual merit: whether they delivered the functionality they advertised.
Not all the products reviewed may be appropriate for your organization, but some may make great additions to your workbench. The key is to select tools based on your needs and to form criteria you can use to evaluate the products.
For example, your organization may be interested in analyzing live systems. This would logically motivate you toward certain products, such as network surveillance and network-based live system analysis. Identifying the categories of tools required to complete your objective is the first step in the selection process, but it is important to take into account more than just the general purpose of the tool.
Approach each product purchase with a clear understanding of your needs and the functionality that the product offers. This will allow you to find a tool that fulfills your needs and provides the level of performance you require. It is also important to find a tool that doesn't burden you with unneeded functionality yet doesn't limit capabilities that may prove useful later.
The tools in this review vary greatly, each covering its own area of digital forensics. They are all advertised to perform their respective tasks efficiently, and they have lived up to this billing. They do not promise to complete all possible forensic tasks, but instead propose to assist users in completing a focused task as optimally as possible.
Because the tools in this group vary so widely, you will note that there is more than a single SC Lab Approved and Recommended marking. Of all of the product groups we consider over the course of the year, forensic tools are, unquestionably, the most diverse.
Nicholas Logan is a graduating senior in the Computer Security Information Assurance department of Norwich University.
The following students at Norwich University performed the testing for this article: Jacob Berry, Tomasz Bochenek, Amanda Brown, Kevin Durgin, Jacob Evans, Robert Gallagher, Anthony Genco, Anthony King, Daniel Krasnokucki, Vincent Lally, Eric Patterson, Lisa Phillips, Brian Von Hone and Adam Wolfe. All testing was done in the digital forensic laboratory in the Norwich University Center for Advanced Computing and Digital Forensics under the supervision of Peter Stephenson, director of the NUCAC-DF.