As more regulators scrutinize the business practices of financial services companies, IT security pros must advance their data processes and safeguards, reports Illena Armstrong.
A fickle economic climate, which in recent weeks has shown signs of tipping toward another downturn, is compelling many information security executives to fret over their budgets even now. For those working in the highly volatile financial services industry, though, that's not the worst of the problems with which they're wrestling.
Information security leaders in that space, which has seen many of its longstanding players come under criminal investigation by various agencies for alleged fraudulent practices this year, are experiencing intensified scrutiny by regulators. The Securities and Exchange Commission (SEC) and other federal investigators have reportedly ramped up efforts to determine whether some banks committed fraud by misleading investors in the sale of risky mortgage-backed derivatives and other securities – practices principally blamed for spurring on the financial crisis. As federal officials continue to inspect the likes of Goldman Sachs and Morgan Stanley for alleged wrongdoing, many industry experts believe a rising onslaught of official queries, audits and more investigations will become standard practice in the financial services arena.
After all, these ongoing official efforts to uncover possible crimes by Wall Street banks could reveal more than just fraudulent activities. Ultimately, investigations may confirm what some reportedly already believe to be the misspending of U.S. government bailout monies by other firms, like AIG. Allegedly, the insurance and financial services giant, using portions of its $182 billion government bailout, may have made excessive counterparty payments to some banks, including those in question, as a result of collateral trades based on now infamous shaky mortgage-backed securities. Already, then, the unwelcome norm of increased federal scrutiny in the financial sector is playing out in various ways, but the most troublesome for the majority of players in the space seems to be an increased number of legal and regulatory requests for huge amounts of information and confidential data, according to information security leaders who recently attended SC Magazine's Financial Services Roundtable held in New York.
“The number of requests that we got from regulators the first four months of this year – the number of correspondence we received from regulators – has been more than we received in all of 2007,” said a financial-investment-firm practitioner at the event, who wished to remain anonymous. As a result, regulatory audits have spiked, with the data required to fuel them having grown exponentially, he said.
“The problem we're running into is that [the regulators] don't often know what they're asking for, so they ask for just ridiculous amounts of data. When you're doing close to a million trades a day and they ask for trades for a month, [that's] a lot of data,” he added. “I would say the regulators are under increased pressure to show that they're doing something. I don't necessarily know if they know what to do, but they have to do something.”
Some of these demands from regulators or auditors for inordinate amounts of information may arise because of the complexity of the data and systems upon which financial institutions often rely, Warren Axelrod, former information security officer for Bank of America's U.S. Trust division, said at SC's Roundtable. Myriad complications can arise from the countless financial dealings and transactions in which firms engage, so regulators seem to often depend on blanket requests for data in hopes they catch everything. “We've gotten ourselves into such a complex environment that it is virtually impossible to understand all the potential compromises or changes or occurrences,” said Axelrod, who is also the lead on the Financial Services Technology Consortium's Software Assurance Initiative. “Auditors probably understand less than the IT and infosec folks. However, they are under the gun because these things happened under their watch and they missed them. So I think a lot of what auditors tend to do if they're put in an embarrassing position [is] jack up the effort and show visibility.”
A major problem with even more intense regulatory pressure and increased requests for information from government officials is being able to actually provide the data sought. For many companies, this can be challenging on a couple of levels, explained Axelrod. First is the problem of instrumentation, and second is the issue of monitoring and reporting.
“It is OK to monitor and report, but you have to create the data. And in a lot of cases it just doesn't exist because the applications don't report it – the logs are not generated. Frequently, you don't know what's happening within an application because the application [just] isn't generating data,” he said. Calling out this year's so-called “flash crash,” which saw the Dow Jones Industrial Average plummet about 600 points on May 6, only to recover the losses within minutes, Axelrod noted that while Congress and various other federal agencies are still keen to understand the trigger for the unexpected plunge, they may never find any solid answers. It will likely be impossible to collect all the pieces necessary to find the cause of what has become known as the biggest one-day intraday point decline in Dow Jones history. In fact, the data required to discover the crash's root cause may simply be nonexistent. Further, because of limitations in data monitoring and reporting technologies, a tool's management dashboard won't be able to reveal the source of the drop. So far, despite a continued investigation, no specific reason for the plunge has been found, though reportedly some investigators believe a convergence of problems associated with computer-automated trades, or an error by human traders, could be the cause.
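Axelrod's point about instrumentation can be illustrated with a short sketch: if an application emits a structured audit record at the moment each transaction occurs, the data regulators later request at least exists to be monitored and reported on. The Python below is a hypothetical illustration, not any firm's actual system; the field names, trade structure and log file name are assumptions made for the example.

```python
import json
import logging
from datetime import datetime, timezone

# A dedicated audit logger that writes one JSON record per line.
# In practice this stream would feed a log-management or SIEM pipeline.
audit_log = logging.getLogger("audit")
handler = logging.FileHandler("trades_audit.log")
handler.setFormatter(logging.Formatter("%(message)s"))
audit_log.addHandler(handler)
audit_log.setLevel(logging.INFO)

def record_trade(trade_id, symbol, quantity, price, trader):
    """Emit a structured audit record at the point the trade executes.

    If the application never generates this record, no amount of
    downstream monitoring or reporting can reconstruct it later.
    """
    record = {
        "event": "trade_executed",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "trade_id": trade_id,
        "symbol": symbol,
        "quantity": quantity,
        "price": price,
        "trader": trader,
    }
    audit_log.info(json.dumps(record))
    return record

record_trade("T-1001", "XYZ", 500, 27.35, "desk-7")
```

The design choice here is to log at the source, in a machine-parsable format, rather than hoping a monitoring tool can infer events after the fact.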
To amass the right data to meet regulatory or auditing demands, there are really two sides requiring resolution, said Axelrod: “The creation of the data and the monitoring and the reporting. And I don't think we're doing a good job in creating it.” This is because many information security executives and their bosses have failed to figure out the kinds of data they'll actually need to address potential issues and, ultimately, drive necessary business decisions.
From a vendor perspective, no magic solution to tackle these specific problems exists, Rick Caccia, VP of products/product strategy for ArcSight, sponsor of the Financial Roundtable, told the group. However, despite the dearth of solutions, financial services companies are working out these information security-related problems and have progressed beyond many other industries, he added. Although many IT security executives in these organizations may have sought out products with an eye to finding a silver-bullet fix even two to three years ago, they're well beyond that now. Axelrod agreed, though he believes there is still much to be done. “You have to try to anticipate the kinds of issues you're going to be looking into. What we do generally is [say,] ‘This is the information. How can we analyze it and generate reports that are interesting?' [But] that information isn't necessarily the important information.”
Working out the problem
This is where policy and practice, with support from technologies, come into play, said another information security leader, who works for a large financial services firm and asked to remain anonymous. As she also continues to experience a bump in regulatory audits and information requests from government watchdogs, she and her team are taking measures to help guide the process. “One of the projects I'm working on right now to try [to] combat it from the internal side is putting [together] our own assessments to kind of frame out what our environment is, where we know we have some of the vulnerabilities and, then, actually help lead some of the questions [from regulators] and channel some of those data requests into something more meaningful and manageable,” she said at the Roundtable. “Certainly, we're not there yet, but that's an intent because it is coming down this way. The faucet's open and we need to help manage that both for the [economic environment and] our own company.”
Underlying these processes of data protection, monitoring and analysis is the concept of business intelligence, she explained. It is the long-standing practice of processing raw data into information that can ultimately be used to make decisions in support of core enterprise goals.
Key to becoming “more intelligent about how we're responding, where we're pulling the information from, and looking at [the process] from an end-to-end base” is the data and how it is monitored and analyzed, she said. She and her information security team are focusing on processing the data to gain business intelligence so that they can better address overall information security needs and manage intensified regulatory scrutiny, both internally and with their clients, for a range of activities, such as transaction processing. “We've never really approached it that way, so right now we're still doing some of the platform architecture to kind of document it. And then [we'll] start to talk about where we can change it up,” she said. “We've done a lot of process re-engineering over the years, but never with this end result in mind.”
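The business intelligence idea the executives describe – turning raw records into something a regulator's question can actually be answered with – can be sketched simply: rather than handing over a month of raw trades, roll them up into per-day summaries first, and drill into detail only on request. The records, field names and aggregation below are hypothetical illustrations under that assumption, not any firm's real reporting pipeline.

```python
from collections import defaultdict

# Hypothetical raw audit records, one per trade (in practice, millions per day).
raw_trades = [
    {"date": "2010-05-06", "symbol": "XYZ", "quantity": 500, "price": 27.35},
    {"date": "2010-05-06", "symbol": "XYZ", "quantity": 200, "price": 27.10},
    {"date": "2010-05-07", "symbol": "ABC", "quantity": 100, "price": 11.80},
]

def summarize_by_day(trades):
    """Roll raw trade records up into per-day trade counts and notional value.

    A summary-first response keeps a regulator's broad data request
    meaningful and manageable, with raw detail available on demand.
    """
    summary = defaultdict(lambda: {"trades": 0, "notional": 0.0})
    for t in trades:
        day = summary[t["date"]]
        day["trades"] += 1
        day["notional"] += t["quantity"] * t["price"]
    return dict(summary)

for day, stats in sorted(summarize_by_day(raw_trades).items()):
    print(day, stats["trades"], round(stats["notional"], 2))
```

The same pattern scales to other groupings (per desk, per counterparty, per instrument) once the raw records exist in a consistent structure.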
As organizations in the financial sector and beyond experience added regulatory pressure and, at the same time, seek out ways to cut costs when conducting business – such as enlisting cloud computing services – data security, integrity, analysis and monitoring will be paramount. Integral to these activities will be implementing the right policies, best practices and supportive tools. For another information security professional attending SC Magazine's Financial Services Roundtable, who works for a global financial services firm and asked that her comments remain anonymous, much of this depends on streamlining processes. “It is a combination of having the strategy in place and a ‘Six Sigma' attitude about controlling the efficiency of the data and the way the data is flowing,” she said at the Roundtable, referring to the business management strategy, originally developed by Motorola in the 1980s and still in use today, that aims to strengthen business and other processes by minimizing variability and identifying and removing the causes of errors.
“So, coming back to the complexity: Instead of creating a super-complex kind of streamlining, [it is a matter of] mapping how the data is going in and what is done to the data as it gets out and who's touching it [on both sides]. I think probably we're going to see more and more of that,” she added, noting that she and her team are pursuing this now. As for the solutions to support such efforts to streamline the monitoring and analysis of data, they're out there, said another security pro at the Roundtable, who works for a diversified asset management firm and wished to remain anonymous. But, really, what's most important to maintaining the security of and access to data is “common sense.”
ArcSight's Caccia agreed, noting that too many organizations turn to vendors in hopes that products will provide a quick fix. For the financial services market, that's changing primarily because of longstanding, stringent compliance mandates and the critical information upon which business successes or failures are based.
“In other industries, they're further behind and expecting [to] just spend some money – buy this or that product – and make the auditors go away. It is going to bite them further down the road,” he explained. “I think the key factor is how core is information security to the business. In financial services, it is the business.”