Forensics

Digital forensics never stops evolving. The continuing proliferation of electronic devices demands constant research, and that R&D often yields new products built for specific purposes. The forensic tool market has grown quickly to meet investigators' needs. This year's forensics Group Test is largely composed of specialized products, with a notable exception or two.

Some investigators may not like the idea of maintaining a specialized tool for each problem that needs solving. More tools mean more time in court explaining how each one works, which is hardly anyone's idea of fun. Acquiring several different solutions can also strain an organization's resources.

However, these specialized tools have their benefits. Because each is designed for a single purpose, many offer more robust options than an all-in-one solution. They also tend to be efficient at what they do, requiring fewer computing resources than some larger suites, which makes them well suited for field use.

Along with a couple of all-in-one solutions, we tested specialized tools for analyzing mobile devices, quickly collecting volatile data, analyzing the use of peer-to-peer software, real-time network surveillance and reconstruction, and network-based drive mapping. Many of these products were likely created to answer an express need of some segment of the forensics community. They complement the "traditional" forensic tool kit well and can often speed up an investigation.
Of course, an examiner should always understand what's happening "under the hood" of the tools they use. Testing products before deploying them in a production environment is always recommended.

How we tested
Our testing varied greatly given the wide range of tools under review. The larger tools were tested in our standard test bed, and several of the specialized tools fit there as well with minor modifications. For example, we generated live traffic for the network reconstruction tool, and we installed various peer-to-peer (P2P) applications on some of our machines to exercise the P2P detection tool. A sketch of the kind of traffic-generation script involved appears below.
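The following is a minimal sketch, in Python, of the sort of script that can generate live HTTP traffic for a network reconstruction tool to capture. The URLs, request counts, and timing here are illustrative assumptions, not our lab's actual procedure or any vendor's API.

```python
# Minimal sketch: generate live HTTP traffic for a capture/reconstruction
# tool to record. URLs and timing are illustrative assumptions.
import time
import urllib.request

# Hypothetical target pages; any reachable hosts would serve.
TEST_URLS = [
    "http://example.com/",
    "http://example.org/",
]

def generate_traffic(rounds: int = 10, delay: float = 2.0) -> None:
    """Issue simple GET requests in a loop so the capture tool
    sees complete, realistic sessions to reconstruct."""
    for _ in range(rounds):
        for url in TEST_URLS:
            try:
                with urllib.request.urlopen(url, timeout=5) as resp:
                    resp.read()  # pull the body so full flows hit the wire
            except OSError:
                pass  # an unreachable host shouldn't abort the test run
        time.sleep(delay)

if __name__ == "__main__":
    generate_traffic()
```

In practice, a mix of protocols (DNS, SMTP, file transfers) gives a reconstruction tool more to work with than HTTP alone.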

As always, the tools in this Group Test weren't pitted against each other. Instead, each was rated on how well it carries out the functionality its manufacturer advertises.

Buying a digital forensic solution
For some organizations, the decision of which tool to purchase next is driven by whichever problem surfaced most recently. If an urgent case would benefit from acquiring a certain tool, then that's the tool the organization buys.

More likely, though, you will need to go through a lengthy purchasing process. If so, a cost-benefit analysis will help determine which product should come next. If your organization mainly deals with policy violations, for example, a network reconstruction tool or P2P analysis tool may save time and money in the future.

Of course, this all assumes you already have a general forensics tool. If you don't, or you're in the market for a new one, you have to weigh your need for a specific solution against the ability to carry out several different tasks with one tool. Most of the general solutions are quite mature at this point: they have had years to integrate new functionality, get court-tested, and become very stable.

The bottom line is that acquiring digital forensic tools requires some thought. Whether you should go with a one-size-fits-all, general-purpose tool or add some specialized products will depend on your situation.

Besides this writer, the testing/reviewing team from Norwich University this year consisted of (in alphabetical order): Boulat Chainourov, Cory Cunningham, Cameron Davis, Eric Knopf, Gary Leavenworth, Katherine Ly, George Maxfield, David Nicklas, Chris Pashley, Chris Swanson, Nick Talcott, Travis Tyler, Gianpaolo Wible, Emily Wivell, and Kevin Zittritsch.

All of these students, now graduated, performed the testing of the products assigned to them and produced the reviews in the next few pages. All testing was performed in the digital forensics laboratory at the Norwich University Advanced Computing Center (NUACC) under the supervision of Dr. Peter Stephenson, technology editor for SC Magazine and director of the NUACC.


Editor's note: An earlier version of this posting inadvertently left off the names of two reviewers: Chris Pashley and Kevin Zittritsch. Our apologies for the oversight.
