The European “right to be forgotten” is an important development for both privacy and information security advocates. Its roots go back to 1995, when the EU Data Protection Directive laid the foundation, and set regulations, for how EU citizens’ personal information must be protected and handled by “controllers of personal data” (i.e., companies and internet search engines).
Laws have been updated steadily through the years to address digital sprawl. In the spring of 2014, the European Court of Justice formally established the “right to be forgotten,” ruling that search engines collecting, processing, or listing personal data about an EU citizen must comply with the law, regardless of where the company or its servers are located.
In Canada, a similar regulation, the “Personal Information Protection and Electronic Documents Act” (PIPEDA) “establishes a privacy right with respect to personal information,” and “imposes similar obligations [to the EU’s directive] on organizations that collect, use, or disclose personal information.” PIPEDA does not explicitly include a “right to be forgotten,” but in some respects it goes further; Canada’s law extends to the privacy and protection of personal information for business use within the private sector, not just search providers. A “breach of security safeguards,” including the loss of, unauthorized access to, or unauthorized disclosure of personal information, can result in fines of up to $100,000. For larger companies, a fine of that size isn’t particularly steep, though that kind of settlement could cripple a smaller organization, one with, theoretically, fewer resources to secure customer data in the first place.
Buy it, use it, break it, fix it
While both laws focus on privacy and data owner (vs. data controller) rights, security is a big part of the equation. If personal information is leaked or stolen, it impinges on citizens’ rights, and companies, therefore, are held liable if found negligent.
Trash it, change it, mail, upgrade it
Security professionals might be quick to reply that the aforementioned risks are the reason security teams must be included in discussions about technology use. Yes, that is one conclusion. Another conclusion is that companies are being allowed to collect and use too much data for too long. The more data collected, and the longer it’s stored, the bigger the information security problem. Companies have an obligation to protect the data they collect, even if that data is outdated. Handing some (or all) of that data off to another party does not absolve a company of that responsibility. And frankly, companies whose entire business model is buying and then publishing online directories of personal information should be outlawed. Even if the information is inaccurate or outdated, the more of it that is available, the more potential for it to be used or misused. How many social engineers leverage what is easily found on the surface web—to say nothing of the deep or dark web—that exposes citizens to harm? And the U.S. doesn’t even have laws like those in the EU or Canada. We just let it happen.
Regulation, laws, and audits aren’t the answer to every security problem. The security industry has certainly watched as compliance mandates cost companies bundles of money yet failed to protect systems and data. Compliance doesn’t equal security, and even the fines for non-compliance with the EU and Canadian directives aren’t a huge deterrent. When mega breaches hit companies and hefty cleanup costs are added, even that doesn’t seem to be enough incentive for companies to put all of the right protections in place. Security vendors develop great products to detect intrusions and breaches, and yet the best IPSs and firewalls aren’t robust enough to handle the massive amounts of data and growing attack surfaces of even modest-sized organizations.
Charge it, point it, zoom it, press it
So maybe it’s time to start looking at the data—not encryption or segregation, but the data itself. How much is too much data? Do companies have a right to store data indefinitely? Who will know (or care) if excessive data storage is happening when that company is never brought to court? Do organizations have a moral obligation to limit what they collect, and more importantly, what they share or sell? Let’s say, hypothetically, a company couldn’t care less about its customers: how does collecting, storing, and processing massive amounts of data increase the company’s threat landscape, and is that enough to change behavior?
Further, security awareness needs to extend outside the four walls of the business world so consumers can be better informed. Yes, many security practitioners have been beating that drum for a while (and it’s gotten somewhat better), but consumers need to fully understand what is really happening with their data. The “security is complicated” excuse isn’t valid anymore. Security is everyone’s business. And if consumers start voting with their wallets, making decisions based on how well businesses protect—and can prove they protect—customer data, we’ll start to see real change in the industry.
Hard? Yes. Worth it? Definitely.