
Privacy matters

Some would say the Fourth Amendment started to fade on the morning of Sept. 12, 2001, when lawmakers, and to some extent the American public, gave national security implicit permission to usurp privacy. And it has continued its vanishing act ever since with every unwarranted government data request leveled at a private company, unauthorized stingray device put in play and dubious cell phone search executed.

Since the fall of 2001, when Congress hastily passed the USA PATRIOT Act, and particularly after Edward Snowden lifted the veil on the extent of the National Security Agency's PRISM program (which exposed massive government surveillance of American citizens), privacy has become a much-talked-about moving target, difficult to define and even more difficult to protect for those in private industry and government charged with its care.

“The interesting thing is how things have changed in the last two years,” says Hilary Wandall, chief privacy officer (CPO) at pharmaceutical company Merck, who notes that her business “doesn't have the same obligation under the Constitution” as other companies, like Microsoft or Google.

Wandall, who has served in a privacy capacity for 15 years, says that early on, privacy was driven by regulatory compliance, which became more sharply defined as the Health Insurance Portability and Accountability Act (HIPAA) was further refined. But now, Merck, like other multinationals, must contend with a host of privacy issues, such as “how to move data across borders” and comply with European Safe Harbor laws, Wandall says. “We're very concerned about how to legally move data overseas, trying to globalize databases.”

That is particularly important because Merck, like all pharmaceutical companies, must have a mechanism for doctors and patients to report adverse reactions to drugs. And, that information needs to be collected and shared without revealing certain private details.

In addition to intrusive government requests and stricter policies overseas, privacy is a puzzle that has become more complicated as an avalanche of data – generated by everything from corporate servers to wearable devices, like Fitbits and smart watches – gains both mass and momentum. There is a certain opaqueness to the way that information is shared as well, in part because people, companies and organizations are more interconnected through technology than ever before.

When data leaves a computer or personal device, its path is not straightforward and linear. Rather, a web of interconnectedness among people, companies and machines means information is shared seamlessly (most of the time) among entities, with the implicit, if not explicit, permission of consumers and businesses that want – and have become accustomed to – the benefits of that fluid information flow.

And, it is fair to expect that data volume and the speed at which it travels will pick up as the Internet of Things (IoT) comes to bear.

While information flows behind the scenes, the individualized user experience has spawned as many definitions and concepts of privacy as there are inhabitants of the planet. Technology may be the great equalizer, but it doesn't bridge the privacy divide between generations or cultures. In fact, it likely widens the gulf. Just ask a Millennial who has been weaned on social media, or a Baby Boomer – who remembers when clouds were just those awesome fluffy shapes in the sky formed by water droplets – to define privacy. The differences in their answers are staggering.

But if digitization has grown seemingly faster than the speed of light, our legal system has not. Digitization and tech advances have outpaced our ability to manage and protect data. Privacy officers, law enforcement and government are working within a set of laws that by the most generous definition are antiquated, with progress, if any, hampered by a chronically stalled Congress.

And then there are the cyber thieves, whose bold intrusions into the systems of companies like JPMorgan Chase, Sony, Target and Anthem that most would (rightly) figure had invested gazillions of dollars in security measures, have exposed the private information of tens of millions.

“The number one challenge to privacy is the wantonness of [stealing] personal data being carried out by the millions by cybercriminals,” says Larry Clinton, president of the Internet Security Alliance (ISA), contending that the constant stream of attacks is a more serious issue than the NSA's surveillance activity.

While privacy advocates might disagree about where to put the emphasis, security threats and privacy issues share a symbiotic relationship. The security breaches that started happening in the 21st century “changed the paradigm for companies managing data,” says Merck's Wandall, especially in the last couple of years, when “much greater connectivity, devices like smart phones, wearables and all of these apps” emerged and created “a complex ecosystem.” All that has changed the way we have to view privacy, she explains.

The great leap forward

If digitization is the hare, then the rest of society is the proverbial tortoise, moving so slowly that it might take time-lapse photography to detect progress.

“A first grader can easily access things from all over the world,” says Clinton, noting that the lure of easy information flow “is so seductive that we haven't thought through how to manage the downside.” But, he says, “We are so busy eating this delicious dessert that we're unhealthy.”

J. Trevor Hughes, president and CEO of the International Association of Privacy Professionals, concurs. “Technology advances are exceeding our ability to manage,” he says. “There is an unprecedented gap between our ability to develop standards and the bleeding edge of technology.” With the rise of social media, he adds, “we don't know what the social norm is.”

The private sector, government and consumers have all benefited from the ease with which information flows, but are just starting to see consequences. “People expect everything to be connected, but at the same time don't recognize that hundreds of organizations are in their ecosystems,” says Wandall. “They don't necessarily want to understand the complexity and they don't understand the risk.” That is, until something happens. “Then they begin to realize that this is a very complex world and bad things can happen.”

IBM has said that 2.5 exabytes of data was produced each day in 2012, while the BBC reported that 75 percent of it was unstructured. That is a lot of data, moving fast, with little or no management in place to handle it.

“All the digital footprints we're leaving tell a story,” says Malcolm Harkins, vice president, Intel Security Group, and chief security and privacy officer at Intel.

And, adds Matt Prewitt, partner, chairman of cybersecurity and data privacy practice, Schiff Hardin, “There is a proliferation of information online that vastly exceeds capability of law enforcement to handle.”

And it's only going to grow, move faster and leave more digital trails as the IoT takes hold and products – like refrigerators, toasters and cars – all contribute their “voices” to the chorus of data on the internet, leaving companies grappling with how to respect consumer privacy.

“How do we know where data is?” asks Wandall. “What are people's expectations of how we use data?”

Wandall says that consumers often authorize the initial use of their data, but “they don't expect the secondary and tertiary uses.” The expectation is that the information will go no further than that initial authorization.

Harkins points to a Harris poll in which “we found 60-65 percent have no idea who has access to their data and a little over half wouldn't be able to show if it was to society's benefit.”

That struggle is reflected in the law itself. How data can be shared is “something not addressed very well with current legislation,” says Wandall.

What's behind you? It's the law.

Don't look now, but you're being followed by the law. While those words are rarely comforting, they're doubly insidious when it comes to privacy, applying both to law enforcement's questionable tracking practices and to privacy legislation itself, which has seriously lagged behind the great leaps in data generation and flow that tech advances have wrought.

Current guidelines for privacy are mostly drawn from the Electronic Communications Privacy Act of 1986 (ECPA), a forward-thinking law for its time but now outdated enough to be considered if not irrelevant, then an obstacle.

“The prevalence of email and the low cost of electronic data storage have made what were once robust protections insufficient to ensure that citizens' Fourth Amendment rights are adequately protected,” Sen. Mike Lee, R-Utah, said in February when he, Sen. Patrick Leahy, D-Vt., and two other lawmakers offered up an updated version of the act for Congressional consideration. Still, we've clung to the old act as if it still packed a wallop. Think aging Boomer who recounts the glory days of high school as if they have the same meaning today that they had 30 years ago.

“Privacy statutes are stale,” says Donna Wilson, co-chair of the Privacy & Data Security and Financial Services Litigation groups at Manatt, Phelps & Phillips. “The ECPA is being used in ways no one would have imagined,” she says.

Congress last year had the opportunity to update the flagging legislation but took a pass, despite the reform bill “being co-sponsored by a majority of the House,” points out Nathan Freed Wessler, staff attorney with the ACLU's Speech, Privacy and Technology Project.

Now, the legislation has resurfaced in the bill sponsored by Senators Lee, Leahy and two others that would “fix” a troubling loophole regarding access to stored email in the original act, says Wessler, but the House can't get it to the floor for a vote. Last year's rebuff and this year's stall have both come as a disappointment to privacy advocates and industry, both of whom are seeking serious privacy guidelines.

Our experts: Privacy matters

Larry Clinton, president, Internet Security Alliance

Jim Halpert, partner, co-chair, U.S. Cybersecurity and Global Data Protection practices, DLA Piper

Malcolm Harkins, vice president, Intel Security Group, chief security and privacy officer, Intel; author, Managing Risk and Information Security: Protect to Enable

J. Trevor Hughes, president and CEO, International Association of Privacy Professionals

Brendon Lynch, CPO, Microsoft

Nuala O'Connor, president and CEO, Center for Democracy & Technology

Matt Prewitt, partner, chairman of cybersecurity and data privacy practice, Schiff Hardin

Hilary Wandall, CPO, Merck

Nathan Freed Wessler, staff attorney with the ACLU's Speech, Privacy, and Technology Project

Donna Wilson, co-chair of the Privacy & Data Security and Financial Services Litigation groups, Manatt, Phelps & Phillips

The ECPA is hardly the only privacy-related legislation before Congress and hardly the only one to stall out or get mired in debate. The hotly debated Cybersecurity Information Sharing Act (CISA), which took a step forward in March by getting the nod from the Senate Intelligence Committee in a 14-1 vote, has been skewered by privacy advocates. Sen. Ron Wyden, D-Ore., the committee's lone nay vote, said at the time that “if information-sharing legislation does not include adequate privacy protections then that's not a cybersecurity bill – it's a surveillance bill by another name.”

Further, an initial draft of the Consumer Privacy Bill of Rights Act (CPBR) floated by the White House took hits from tech professionals, privacy advocates and government officials, with critics essentially claiming it didn't go far enough in protecting consumer privacy.

And, lawmakers spent months last year considering national data breach notification bills – with a good bit of time eaten up simply trying to define what constituted a breach. One bill languished in committee and failed to make it before Congress for a vote before the term ended in December. Part of the problem centers on lawmakers' now-renowned unfamiliarity with technology, a posture that they have either purposefully or unintentionally cultivated.

“One of the struggles is you've got a situation where you have people in Congress and judges who are not technologically literate,” says Wilson. “After the Hillary Clinton [email] situation, you have people coming out in Congress proudly declaring themselves as Luddites. Having them frame legislation is funny and a little scary,” she adds.

In addition to being antiquated, current laws are uneven and unclear. “That inconsistency makes it a challenge,” says Brendon Lynch, CPO, Microsoft. “You're not sure what law applies to what data.”

And what might get Congress to move and bring them up to speed on today's technological realities? “They're going to have to have a Bork moment,” says Wilson, referring to former President Ronald Reagan's failed Supreme Court nominee, Robert Bork, whose video rental history was leaked during his confirmation hearings, spurring the creation of the Video Privacy Protection Act of 1988. “Then they'll have more desire to be educated – once information about a high-level person is leaked or gets stolen or accessed. Just look at how fast the VPPA got enacted when Congress realized people could get their video history.”

That moment may come sooner than anticipated in light of Edward Snowden's traipse through NSA systems, and recent revelations that hackers have occupied systems at both the State Department and White House for months, at the very least accessing information about President Obama's schedule. With the political climate so hot, some operatives, or their followers, might be open to exposing private data on their adversaries.

Or maybe Congress simply needs a nudge from the executive branch, which lately has come in the form of Executive Orders and presidential declarations meant to move forward legislation that protects data and the privacy of U.S. citizens. In a flurry of pen strokes, Obama has called for national breach legislation, an information-sharing bill and the CPBR.

All have indeed met with some criticism. The national data breach efforts particularly have been assailed for the notification burden on business. “I understand you have to draw some line in the sand in the timing mechanism [for breach notification], but you have to have an escape hatch” to allow for exceptions, says Wilson. Plus, critics point to vagaries, potential loopholes and enforceability. Information-sharing and CPBR initiatives, likewise, have been flagged as too vague, lacking in the specifics that would make them useful and doable. 

Federal Trade Commission Chairwoman Edith Ramirez, whose agency has been the de facto “beat cop” of privacy enforcement, recently characterized such proposed actions as the CPBR as a good starting point for establishing privacy guidelines and drawing boundaries. In March, she told an audience of privacy professionals at the IAPP Summit in Washington that she views the bill as “a discussion document” meant to stimulate the creation of a more robust law.

That Congress get legislation right is more critical than ever, not only to better protect the sensitive information of consumers and businesses, but also to curb cybercrime. The onslaught of data and the lag in public policy have created an attractive little gap for criminals to slide into and do their dirty work. Uninhibited by regulation or legalities, with the same tools available to them as to the security pros, plus incentive, criminals have the time and impetus to create new attacks on privacy.

Criminals are outstripping the good guys, just as Bonnie and Clyde once stayed ahead of law enforcement by using automobiles to run across state lines, says Clinton. Today's cybercriminals are able to similarly avoid the long arm of the law by hiding out in places like Albania and Russia as they're stealing information from servers in California. “Law enforcement has slow cars,” he says.

Fourth Amendment

Until recently, it seemed that this country had skipped merrily from the Third Amendment, which restricts the quartering of soldiers in private homes, straight to the Fifth, which any viewer of Law & Order knows has become synonymous with keeping your mouth shut when a zealous prosecutor asks incriminating questions. The Fourth, it seemed, had beat a hasty retreat as “national security” concerns repeatedly trumped it, justifying everything from government spying to law enforcement's unwarranted digital “searches.”

“We're certainly concerned from the government standpoint,” says Nuala O'Connor, president and CEO of the Center for Democracy & Technology, who notes that “the threat of terrorism is real, but you just can't go around surveilling everything.”

But that is changing. As details emerge about government spying and law enforcement's increasing reliance on tools like Stingray devices – which collect data not only on a potential suspect but also on anyone in the vicinity – the Fourth Amendment has started to come out of seclusion, though maybe not quickly enough to curb violations or to satisfy the privacy advocates who, in a flurry of motions, amicus briefs and public statements, have made it clear the amendment's protective authority is in jeopardy.

“In the U.S. we tended to view corporations as more abusive of privacy than government,” says Larry Ponemon, president and founder of the Ponemon Institute. “I think that's shifted with WikiLeaks and Snowden; there's a belief that government has stepped over a line.”

The ACLU's Wessler wrote recently in a blog post that documents about Stingray use in Florida, obtained (after much stalling) under that state's Sunshine Laws, “paint a detailed picture of police using an invasive technology – one that can follow you inside your house – in many hundreds of cases and almost entirely in secret.” Those documents showed that law enforcement agencies often did not have warrants for that type of data collection. And New York sheriff records revealed persistent Stingray use without court orders between May 1, 2010 and October 3, 2014.

That law enforcement and government toe a difficult line between ensuring public safety and protecting privacy, sometimes crossing over it, is nothing new. But technology extends their reach and makes accessing and gathering large volumes of data easier.

“Digitization created more opportunities that [the NSA community] had never had before and the tendency was to use it,” says ISA's Clinton. That was a “mistake” that he believes government has recognized and taken steps to correct – with a lot of encouragement from privacy advocates and the public, who, despite growing up on James Bond, seemed shocked that the U.S. government would be spying even on its allies. “Spooks have been spying on each other forever,” notes Clinton.

The bigger shock came from the revelation that the government had been collecting data on its own citizens. The Snowden revelations not only showed the extent to which the government has been putting pressure on private sector companies, like AT&T, Microsoft and Google, to produce information on their customers, often without warrants, but also, some say, hinted at how readily corporate America has shared information with the government in the past.

At the IAPP Summit, a number of speakers pointed out that there used to be advantages for corporations in sharing their data with government, but that's no longer true. What is seen as complicity has damaged companies' reputations and eroded customer trust. It likely will hurt business, with some positing that customers looking for cloud services might turn to vendors outside of the U.S., in countries where privacy protections are more concrete and strict.

Whether due to economic impetus or an ethical commitment to privacy protection, or both, companies are beginning to publicly fight back and take a stand against these requests.

Microsoft is currently locked in a court battle over the U.S. government's effort to use a search warrant to obtain the content of customer emails stored on a server in Dublin, Ireland; the related non-content data was stored in the U.S. Microsoft refused, but a federal magistrate ordered the company in April to turn over the emails. The tech giant then filed an appeal in a federal district court, but the judge sided with the government.

Microsoft appealed that decision to the Second Circuit. Along the way, the software giant has been held in contempt of court and challenged court findings, but it has also drawn support from privacy organizations and business rivals, like Apple, AT&T and Cisco, which can probably imagine being in the same legal boat. The Electronic Frontier Foundation (EFF) filed a brief contending that turning copies of the email over to the government would constitute a “Fourth Amendment seizure,” since the user would lose “the ability to decide who gets the messages and for what purposes.”

Microsoft, to date, shows no sign of backing down.

What to do?

Given the vastness of data collected, the interconnectedness in society and the lags in laws, are the Fourth Amendment and other privacy conundrums even solvable?

“We're not going to be able to cure the disease,” says ISA's Clinton. “What we're really trying to do is build an immune system for ourselves.” While Clinton notes that in the last few years the U.S. has been moving toward greater privacy protections, he says the progress has not been fast enough for his taste.

The antibodies that will strengthen corporate privacy must come from all quarters – legislative, regulatory, judicial, technological and corporate initiatives, as well as, to a large extent, consumers themselves. The path to remediation must include “a comprehensive program to harden infrastructure,” Clinton says. That's largely owing to the fact that the internet “is designed to be open, inherently insecure and becoming less secure because of the Internet of Things,” he says. Owners and operators of systems, too, must be educated in core hygiene and encouraged to adopt “rudimentary best standards and practices,” he adds.

Progress in protecting the privacy of the public and corporations likely depends, first and foremost, though, on a shift in perspective. Companies, even those without constitutional obligations, must begin to view themselves as data companies. “Almost every company is becoming a tech company or has a reliance on it,” says Harkins.

As proof, he points to a cement company that is putting sensors in cement so that they can be used to gather information on traffic, infrastructure erosion and the like.  

And, Wandall notes that Merck doesn't hold the kind of data typically requested by government. It has to share information with government on adverse drug reactions and under fair employment requirements. The company realizes, too, that it could run into Fourth Amendment issues if, say, “employees are surveilled,” she says.

In addition, companies need to shape their own policies around protecting privacy and construct strong privacy organizations within their own walls.

“You have to start looking at privacy through an ethical lens, using information only the way consumers want it to be used [trying to] maximize the benefit while minimizing the risk,” says Wandall, who advocates establishing “sustainable benefits of use that are respectful.”

Microsoft remains “focused on customer trust and customer trust is at center” of its privacy policies, explains Lynch, who says the company “has invested heavily in privacy for many years.” 

He notes, “The best way to implement privacy is by design,” and that's how Microsoft has built its privacy organization – currently more than 200-strong. Lynch says privacy at the company includes a set of policies and procedures. “We have privacy officers embedded and ensure as we implement new products and processes that we keep up with standards,” he says.

It would behoove companies, too, for security and privacy teams to work together more closely. Security, a more IT-oriented discipline, “can be tone-deaf to privacy,” says Harkins. “Security and privacy need to think of themselves more as magnetic bonds” that create a stronger force when they work together.

Microsoft also advocates “transparency, control and security around the gathering and use of data.”

Merck, says Wandall, also tries to be transparent by providing comprehensive privacy notices to its stakeholders, explaining what information it shares.

Similarly, companies like Google, AT&T, Twitter and, more recently, Snapchat, are keeping their own customers informed via revised – and some would say clearer – privacy policies.

In addition, to more specifically address the Fourth Amendment quandary, companies are releasing transparency reports that list government requests for data – and the numbers are significant. For instance, Facebook's transparency report showed that governments around the world made more than 34,000 requests for data in the first half of 2014, an increase of around 24 percent over the last half of 2013. And Twitter reported a roughly 40 percent jump over the same period.

While the reports offer some insight into government requests, privacy advocates and the companies themselves believe that they should be allowed to provide more information on “digital searches” to make the reports more useful. But the U.S. government, which only recently began allowing companies to more concretely report these numbers, still restricts how much they are allowed to share. Companies can only report national security letters (NSLs) and FISA order requests in ranges of 1,000 if the requests are revealed separately – a company that received, say, 1,200 NSLs could disclose only that it received between 1,000 and 1,999. If combined under the general national security request term, they can only be reported in increments of 250. Still, baby steps are better than no steps forward at all.

Make laws, not politics

But private industry needs the help of legislators, who must move quickly and decisively to draw clear boundaries. Passing an information-sharing bill should be a no-brainer.

“Sophisticated players are sharing info already – it's a good practice,” says Jim Halpert, partner, DLA Piper, and co-chair of the law firm's U.S. Cybersecurity and Global Data Protection practices.

Halpert, who helped draft the “NACD Cyber-Risk Oversight Handbook,” cautions lawmakers not to create legislation that will make that process more complicated, but says, “Overall, though, as the legislative process moves forward it is likely to meet middle ground that works.”

Noting the apparent goodwill in Congress around information sharing, he predicts, “If there's one thing that's going to get done, it is going to be this.” Some privacy “hold-ups” have centered on fears that the government might use information shared in some other context.

Likewise, the much-maligned CPBR might gain traction if interested parties take the first draft as a first step. “With additional work, it could be useful,” says Wandall. “It's cross-sectoral [and] having legislation being considered that is cross-sectoral is beneficial.”

Unlike breach notification bills, “which do not put the focus on privacy,” the bill of rights has the potential to address concerns of companies like Merck that “have to manage privacy requirements all over the world,” she says.

But, Wandall believes that the CPBR might have a more difficult time getting a thumbs up from Congress. “I'm not convinced it's going to move,” she says, noting that the “less complex” data breach notification initiative “is more likely to move.”

Here comes the judge

It is more than a little ironic that the Supreme Court, upon whose bench sits a panel of jurists whose average age hovers around 70, has made some decidedly modern, and surprising, rulings that expand privacy protections. Congress, where the generational gap (or, perhaps, simply mindset) has hampered both lawmakers' grasp of tech issues and the legislative progress, should take note. 

Though, in all fairness, the Supreme Court isn't under the same strictures as lawmakers, says Wilson at Manatt, Phelps & Phillips, because the Fourth Amendment is not trapped by a static piece of legislation. “The justices have the benefit of the common law and constitutional law and it evolves,” she explains. “It's much more susceptible to evolution.”

The nation's highest court is not alone in its efforts to bolster privacy protections. While Congress has been mired in inertia, state legislators have stepped up and taken decisive action, Wessler says. “Over a dozen states have passed bills on location tracking.”

That “real momentum from Maine to Montana” that the ACLU's Wessler sees creates a glimmer of hope that the Fourth Amendment might reassert itself and come out of the shadow cast by national security justifications and corporate interest.

 

[sidebar 1]

The Supremes: Cell phone justice

“It has taken a very long time for courts to update how we understand the Fourth Amendment in the digital age,” says Nathan Freed Wessler, staff attorney with the ACLU's Speech, Privacy, and Technology Project. The Supreme Court has recently entertained a handful of cases – among them Riley vs. California, U.S. vs. Wurie, and Jewel vs. NSA – that have put Fourth Amendment issues squarely in the crosshairs.

Both Riley and Wurie deal with issues around police searches of cell phones, while Jewel challenges the NSA's authority to tap into the internet backbone at an AT&T facility in San Francisco. In a brief filed in Jewel last October, the EFF accused the government of trying to go around the Fourth Amendment, noting that the government “contends that if one of its purposes for the copying and searching the communications is foreign intelligence, then the circumvention is complete, and the internet has for all practical purposes become a Fourth-Amendment-free zone.”

Last June, the justices came to a unanimous decision on Riley, which stems from the 2009 arrest of a San Diego man, David Riley, who was convicted of attempted murder after police searched his smartphone. According to a 38-page court document, the justices agreed that “modern cell phones are not just another technical convenience,” but a device giving law enforcement access to far more information than other physical evidence carried on a person, such as a wallet or purse. “One of the most notable distinguishing features of modern cell phones is their immense storage capacity,” the ruling says. “Before cell phones, a search of a person was limited by physical realities and tended as a general matter to constitute only a narrow intrusion on privacy.”

Nuala O'Connor, president and CEO of the Center for Democracy & Technology, believes that “personal boundaries extend to your online persona,” and expresses optimism about the Court's ruling. “They've consistently gotten it right,” she explains, with the Riley ruling saying that “a cell phone is like carrying [your] footlocker around with you.”

Calling the Riley decision the “intersection of new technology and old law,” O'Connor says the Court “applied them in a new context.”


[sidebar 2]

Privacy: In the driver's seat

Companies are beginning to train employees to recognize and evaluate government data requests as part of their privacy policies and awareness training. 

“Pursuant to our internal policies, we have procedures to review subpoenas,” says Hilary Wandall, CPO, Merck. “We only share that information we have to reply to requests.” To date, the company “has not run into any issues,” she says.

Merck also “tries to be transparent” about the monitoring it does of workers' computers to ensure that “employees don't break compliance” requirements.

Brendon Lynch, CPO at Microsoft, says his company informs its enterprise customers when it receives a data request. “We've never had to turn over our data to the government,” he says, contending that “it's not our place to be in the middle.”

Microsoft has put considerable time and resources into building a robust privacy organization that is often lauded as the gold standard. And, the company has boldly stepped out a number of times as a privacy advocate – and not only in its very high-profile court battle with the U.S. government. Two years ago, it made “Do Not Track” a default setting on its browsers (though recently the company changed that in an effort, it says, to give users more control over privacy choices).

“The first step you take, especially with the enterprise, is it's your data,” says Lynch.

That's a sentiment echoed by Kent Walker, general counsel at Google, who told IAPP Summit attendees that at Google, “users are in the driver's seat for setting privacy,” as well as visibility and other controls.

To what extent users will be in control of their data, though, remains to be seen, since companies like Google and Facebook make considerable money trading on information about their customers and their preferences and habits.
