The world of cyber crime and cyber law is buzzing this week with revelations about diagnostic software installed on some smartphones. Why? Because some deployments of the software, made by a company called Carrier IQ, may be handling personal data without the express consent of the data owners, namely the smartphone users. Some headlines have branded the company as “Phone ‘Rootkit’ Firm Carrier IQ,” and at least one senator has demanded answers.
For its part, Carrier IQ has demonstrated both the right and the wrong way to respond to a privacy incident. The right way? Promptly provide the public with clear and honest answers when privacy concerns are raised. The wrong way? Serve a perceived critic with a cease-and-desist order. That was Carrier IQ’s first reaction, from which the company wisely and, to its credit, very quickly, retreated. Of course, this change of heart could have been prompted by the intervention of the Electronic Frontier Foundation (EFF) on behalf of the perceived critic, Trevor Eckhart.
And it could be said that Mr. Eckhart himself went too far when he used the term “rootkit” in reference to Carrier IQ’s software. In my opinion, the software, at least as it appears on the HTC phone we have been using for research at ESET, fails one aspect of the definition of rootkit one finds on Wikipedia: “software that enables continued privileged access to a computer while actively hiding its presence from administrators by subverting standard operating system functionality or other applications.” The software I’ve seen is not “actively hiding its presence,” although you could argue it is subverting standard operating system functionality because you can’t turn it off or uninstall it.
Definitions aside, this software is causing a lot of concerns. The extent to which those concerns are justified is hard to determine as we don’t yet know how many phones have been collecting how much data for what period of time, or what exactly has been done with the data. While Carrier IQ is now being more forthcoming, it is still not clear whether the company fully grasps that the purpose for which data is acquired is not, in terms of privacy and security, relevant to the uses that could potentially be made of such data.
For example, if a company asks for one’s date of birth and mother’s maiden name, one probably wants to know why it needs that data. One should also look for some assurance that the company understands how such data could be abused, perhaps by an employee who has not been properly vetted. I’m not sure I would be satisfied if the company’s response was: “Trust us, nobody here would think of abusing your personal data for immoral or illegal purposes; besides, we don’t do anything with the data, we just give it to this other company.”
Consider the Google Street View privacy breach, which cost the company hundreds of thousands of Euros in fines earlier this year. The truth may well be that Google never intended its camera cars to capture personal data, but they did, without notice to, or the consent of, the data owners. Personally, I’m fairly confident Google did nothing nefarious with the data. Collecting it without notice and consent was the “transgression” in that case. I use the term transgression rather than crime, but transgressions can add up to crimes based on frequency and scale, and at that point ignorance is no defense. After all, the principles of Fair Information Practice have been around since the Nixon era. If you keep violating them you will face consequences.
That brings us to Facebook, which this week accepted the judgment of the Federal Trade Commission (FTC) that the social network had “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.” Facebook was handed a 20-year “sentence,” which means that the world’s largest social media company is now required “within 180 days, and every two years after that for the next 20 years, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the FTC order, and to ensure that the privacy of consumers’ information is protected.”
Does that language sound familiar? It’s very similar to the recently finalized FTC settlement with Google in which the world’s largest internet company “agreed to settle Federal Trade Commission charges that it used deceptive tactics and violated its own privacy promises to consumers.” This occurred after it launched its social network, Google Buzz, in 2010.
What Google and Facebook did may not be cybercrimes, but I sure feel better knowing there is a “cybercop” on this beat, namely the FTC. Will the FTC investigate the concerns raised by Carrier IQ’s diagnostic software? I’m guessing it will. What will it find? I really don’t know, and I’m not sure who the focus of such an investigation will be – Carrier IQ, the phone makers or the phone service providers.
But here are two relevant findings in the FTC Facebook case that are worth bearing in mind: “Facebook represented that third-party apps that users installed would have access only to user information that they needed to operate. In fact, the apps could access nearly all of users’ personal data – data the apps didn’t need.”
“Facebook told users they could restrict sharing of data to limited audiences…[but] did not prevent their information from being shared with third-party applications their friends used.”
While companies continue to struggle to get their heads, and business plans, around the 40-year-old notion of Fair Information Practice, the real cybercriminals – the guys spending money on malware development to steal money from you, your employer and your government – are probably grinning from the sidelines. The resources for prosecuting misdeeds in cyberspace are already stretched thin, as my colleague Cameron Camp recently documented, so you might say the last thing we need is legitimate companies making mistakes that require legal remedies, mistakes like collecting a whole bunch of personal data without notice to, or the express consent of, the data owners.