
Open File Solutions: Optimization for ‘Now’

It’s no secret that in today’s information age, data integrity and redundancy are critical success factors for businesses.

But against the backdrop of the September 11 anniversary and today's 24x7x365 environment, IT decision makers are now forced to review their backup strategies and disaster recovery plans with a keener eye than ever before.

With global networks maintaining ever-increasing amounts of data in a new universal time zone of 'now,' IT administrators can no longer schedule backups in so-called 'after hours,' because those hours no longer exist. For most companies, there is no such thing as a clear backup window. Backups must be done while systems are up and running.

In short, enterprises today must adopt recovery solutions that offer maximum reliability while also supporting full productivity and efficiency. But to achieve both of these goals, companies must look for tools that address an age-old dilemma - preventing the loss of open file data during scheduled backups.

Disaster recovery plans are only as good as the strategies that support them. Without an open file solution, organizations risk losing significant amounts of data. In the end, companies must identify software that can reliably back up data while files are in use and while internal and external transactions are still going on.

It's All About the Bottom Line

The quantity of mission-critical information that organizations are storing, managing and maintaining on computers is growing at an exponential rate. According to the Hurwitz Group, by the year 2003, the average large company will store more than 150 terabytes of data - the equivalent of the storage capacity of approximately 240,000 compact discs.

Yet many companies' backup strategies are missing one fundamental piece: the ability to ensure a full system backup without skipped, corrupt or unsynchronized files. The impact of this shortfall is enormous, with annual costs due to lost data estimated at more than $11.8 billion, according to a report published by Pepperdine University.

Open files have historically been a network administrator's worst nightmare. Traditionally, during a backup, files that are open or in use get skipped. Most programs will try to access them again at the end of the backup, but if the files are still open, they will be ignored, and critical data will not be protected. The alternative - forcing open files closed or asking users to log out so the system can be shut down - is no longer viable.
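To make that failure mode concrete, the following is a minimal, self-contained simulation of a skip-and-retry backup pass. The file names, lock set and backup_pass helper are hypothetical, offered only to illustrate how a file held open for the entire run is never captured; they do not reflect any particular backup product.

```python
# Simulate the traditional skip-and-retry backup pass described above.
def backup_pass(files, locked, backed_up):
    """Copy every file that is not locked; return the names skipped."""
    skipped = []
    for name, data in files.items():
        if name in locked:
            skipped.append(name)        # open/in-use file is skipped
        else:
            backed_up[name] = data
    return skipped

files = {"orders.db": "v1", "report.doc": "v1"}
locked = {"orders.db"}                   # held open for the entire run
backed_up = {}

skipped = backup_pass(files, locked, backed_up)
# One retry pass at the end of the run; still-open files stay unprotected.
still_open = backup_pass({f: files[f] for f in skipped}, locked, backed_up)
print(sorted(backed_up))  # ['report.doc'] - the open database was never captured
print(still_open)         # ['orders.db']
```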

For example, consider a company whose customer relationship management (CRM) system is linked to its accounting database. As transactions occur within the CRM database - such as new customer information being added - modifications are automatically written to the accounting database. In this way, data from one system is always present and current in the other.

If these integrated databases are backed up while files are open and transactions are still going on, the backup may capture a transaction that occurred in the CRM database but, because of timing issues, it may be forced to skip the related open or locked file on the accounting database side. If there is a subsequent system failure and the files need to be restored, they will no longer match up.
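A toy sketch of that timing problem, under the assumption that the backup captures the two files at different moments; the databases and transaction counter here are made up purely for illustration:

```python
# The CRM file is copied first; a linked transaction then updates both
# databases before the (previously locked) accounting file is captured,
# so the restored pair cannot match.
crm = {"last_txn": 100}
accounting = {"last_txn": 100}

backup = {"crm": dict(crm)}           # CRM file captured at transaction 100

# A linked transaction lands mid-backup and updates both databases.
crm["last_txn"] = 101
accounting["last_txn"] = 101

# The accounting file was open/locked earlier and is only captured now.
backup["accounting"] = dict(accounting)

# After a restore, the two databases disagree about the last transaction.
print(backup["crm"]["last_txn"], backup["accounting"]["last_txn"])  # 100 101
```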

The bottom line: most organizations have a costly disaster recovery infrastructure in place to protect their critical data, but all too often these strategies fall short.

As a result, companies need an open file solution that consistently captures open and in-use files without interrupting applications or continuous system operation.

Generic Open File Solutions

There are two conventional ways to deal with the open file dilemma. One is to purchase an agent that works with a specific application. The other is to install a generic open file utility that provides a 'window' into the data in open files for the backup software being used.

Dedicated agents are available for a handful of database and email applications. Typically, they are designed by backup application developers specifically to support open file backup of a single application's data with their own backup program. Because of this direct integration, they provide powerful capabilities, such as object-level restores.

However, these agents are not cheap, and because each one works with only a single application, they are a limited and expensive way to cover an entire environment - hardly cost-effective for most IT budgets. A recent survey by the Disaster Recovery Journal found that 38 percent of respondents cited funding as the biggest challenge in planning disaster recovery efforts.

A better approach is a generic open file tool that is application agnostic - giving backup software access to open files across the board, regardless of the originating application.

But Buyer Beware

In choosing a generic open file solution, there are several factors to weigh before spending precious budget dollars. First, IT decision makers need to make sure the generic open file tool functions regardless of the backup package being used - especially if an enterprise frequently changes versions or types of backup software, or if its primary backup package fails.

For example, consider a company that normally uses third-party software for backups. If the company has deployed a generic open file tool that works only with that particular package, it would be unable to support other emergency backup options, such as the standard Windows NT backup utility. With a generic open file tool that works with all programs, the company could conduct a reliable backup even if its third-party software suddenly failed.

Second, IT decision makers must make sure the generic open file solution is capable of providing system-wide synchronization. System-wide synchronization dynamically creates a pre-write cache for data that changes within any open file on the system once the backup has commenced. Only changed data is stored in the pre-write cache, not entire files. When the backup software reaches a portion of a file that has changed, the open file tool substitutes the original (pre-write) data from the cache to complete the request. As a result, the backed-up data on tape looks exactly as it did when the backup process began, which ensures the most complete and reliable backup and restore.
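As a sketch of how such a pre-write cache might behave - assuming block-level copy-on-write, with illustrative names rather than any vendor's actual implementation - consider the following:

```python
# A simplified pre-write cache: applications write to the live volume,
# while backup reads see a consistent point-in-time view.
class PreWriteCache:
    def __init__(self, volume):
        self.volume = volume   # live blocks: block number -> data
        self.cache = {}        # original contents of blocks changed mid-backup

    def write(self, block, data):
        """Application write: save the original block once, then overwrite."""
        if block not in self.cache:
            self.cache[block] = self.volume[block]
        self.volume[block] = data

    def backup_read(self, block):
        """Backup read: prefer the pre-write copy, so the backup sees every
        block exactly as it was when the backup began."""
        return self.cache.get(block, self.volume[block])

vol = PreWriteCache({0: "hdr", 1: "rec-A", 2: "rec-B"})
vol.write(1, "rec-A2")                          # a change lands mid-backup
print([vol.backup_read(b) for b in (0, 1, 2)])  # ['hdr', 'rec-A', 'rec-B']
print(vol.volume[1])                            # 'rec-A2' - live data moves on
```

The design point the sketch tries to capture is that applications keep writing to the live volume while the backup reads a consistent point-in-time view, and only the blocks that actually change are cached, not entire files.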

Many open file solutions instead rely on a file-by-file or volume-by-volume method, or even require synchronization to be performed manually. Such software fails to live up to its promise of increased productivity and instead becomes another obstacle to true 24x7 reliability.

Think of it in Business Terms

For IT networks, there is, in effect, no designation of time except for "now." For corporations and their mobile professionals alike, continuous, round-the-clock access to reliable data is not just key, but imperative. To meet this requirement, companies need to optimize their systems to back up files that are in use by their workforces. The most cost-effective and timely way to do so is to integrate a generic open file solution.

When choosing a generic open file solution, corporations need to think of it as choosing an IT partner - because that's exactly what it is. Businesses that fail to incorporate a reliable, flexible open file capability into their backup strategies and disaster recovery plans will find themselves trailing competitors that do.

April Nelson is the product manager for St. Bernard Software's Open File Manager. St. Bernard Software (www.stbernard.com) is a global provider of security solutions that protect against data loss, system threats and Internet abuse.

 
