Jonathan raised his eyebrows as he stuffed his phone into his pocket. “I just talked to Cameron,” he said. “She told me there’s a confirmed case of COVID at the hospital where she works.” Five minutes later, Fritz, another Kolide employee sitting to my left, rolled his chair away from his desk, spun towards me, and said, “I just read a report there’s a confirmed case at a clinic in Somerville.” Everyone in the office heard him. They looked at him and then looked at me. It was undeniable: the pandemic was here, and as the CEO, I had to make a decision.
“I’ve heard enough, we’re working from home,” I said. “Let’s shelter-in-place for at least a few weeks, and then we will reassess.” The date was March 10th, 2020. Over the next 90 days, countless businesses arrived at the same decision and directed millions of eligible employees to work from home. For many, it was a rough start, fraught with productivity issues, gaps in essential processes, and woefully inadequate capacity on VPNs, mail servers, and chat services.
Now, on the first anniversary of our shift to remote work, most of the pressing issues have been addressed. Companies have purchased, integrated, and secured a tight ecosystem of products and services to facilitate all of their previous in-person interactions. And employees have mostly thrived, despite the magnitude of the change.
But now, with the practical problems solved, ethical issues dominate. As the CEO of a burgeoning endpoint cybersecurity startup, I’ve had unique visibility into how COVID-19 and the mass migration to remote work have reframed perspectives and re-ignited debates about the balance between privacy and security. Last March, on my second day of working from home, I turned on my computer and was greeted by the following email:
“We’re investigating how we can go about implementing a work-from-home policy in light of COVID-19. Our workspace is monitored via CCTV cameras so that we can always look back at how data was accessed. Is there a way that we can use Kolide to constantly record all screen actions, from boot-up to shut down, on our company laptops?”
Then all at once, dozens of similar questions from IT and security practitioners poured in. A week or so later, another deluge of emails, but this time from concerned employees who had Kolide’s software installed on their devices. They wanted to hear that our software couldn’t be used to spy on them.
It’s easy to see both sides. Organizations that made promises to customers and auditors needed to completely reinvent their device management and cybersecurity strategy to support a world where most of their workforce never leaves home. On the flip side, their employees had to grapple with the reality that their corporate devices could surveil their activities in their own homes.
I recently spoke with an HR manager who was incensed to discover that new hires were receiving company-branded swag that included an item to cover the webcam on their company-issued laptops. In the past, these items were given as a security measure, letting employees cover their webcam on the slight chance their computer was hacked. Now, post COVID-19, the same items took on an insidious new implication for some new hires: cover the webcam so the IT team can’t spy on you.
If it seems shocking that a simple webcam cover could be so misconstrued by a new hire, it shouldn’t be. Since the pandemic started, people have been regularly bombarded with human interest stories in the media highlighting smug-looking business leaders all too proud to share how they’ve maintained employee productivity through creepy monitoring software. Unlike typical device management apps, this software tracks workers’ keystrokes and mouse movements, takes screenshots, and reports their entire web browsing history.
These extreme cases are not typical, but when employers do not clearly define the ethical and privacy parameters that govern how the IT and security team should interact with devices, employees have no choice but to fill in the gaps and assume the worst.
While I have taken this moment to steer my own company and the industry toward Honest Security, a framework we’ve developed that defines practical rules of engagement between the IT team and end-users in this new world, other vendors have taken the opposite approach. They have capitulated to their customers’ surveillance-based demands to capitalize on the increased appetite for device surveillance and management software.
I saw this demand as well. As recently as a few weeks ago, I had an interaction with a customer who was incensed that he could no longer use our service to silently survey the Wi-Fi access points near an employee's laptop. While this capability sounds innocuous, the data can be used to pinpoint a laptop's geographic coordinates with terrifying accuracy, often down to the exact room of the house the laptop sits in. When I pointed out the privacy concern, he replied that he was disappointed that Kolide could decide what he could and could not do when he was paying for the service. He maintained it was up to him how he handled the information we provided and the information he queried. Finally, he said that if Kolide continued to tell him what to do, he would not continue with the service.
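To make the concern concrete, here is a minimal sketch, in Python with fabricated scan data, of how a handful of passively collected access-point identifiers (BSSIDs) and signal strengths could be shaped into the request body accepted by a public geolocation service such as Google’s Geolocation API. The function name and the scan values are hypothetical, and no request is actually sent; the point is simply how little radio data is needed to ask “where is this laptop?”

```python
import json

def build_geolocation_payload(access_points):
    """Shape (BSSID, signal strength in dBm) scan results into the
    JSON body a Wi-Fi geolocation service would accept."""
    return {
        "considerIp": False,  # locate purely from radio data, not IP
        "wifiAccessPoints": [
            {"macAddress": bssid, "signalStrength": rssi}
            for bssid, rssi in access_points
        ],
    }

# Fabricated scan results from a hypothetical laptop:
scan = [("01:23:45:67:89:ab", -45), ("cd:ef:01:23:45:67", -71)]
payload = build_geolocation_payload(scan)
print(json.dumps(payload, indent=2))
```

Two visible access points are often enough for a rough fix; a typical apartment building exposes dozens, which is what makes silent collection of this data so sensitive.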
While it’s easy to understand this person’s frustration, remote work has altered the lens we look through when judging the appropriateness of how employers can use the capabilities of work devices toward their own ends. Most would not question an employer's ability to monitor radio signals in an office setting, but now with the office a distant memory, those same actions feel invasive and unnecessary.
Future ethical considerations
For every detractor, we see a crowd of IT and security professionals eagerly seeking solutions that let them meet their audit requirements and keep the data on devices safe while also respecting employee privacy and productivity. These are practitioners who recognize the obvious: If administrators continue to surveil laptops oppressively, end-users will circumvent these measures using personal devices. Once enough employees do that, it’s only a matter of time before an unmanaged laptop with critical data winds up lost, and what would have been a minor incident becomes a headline.
So the drumbeat for more privacy grows louder every day. Vendors like Apple have made it clear they intend to bet the farm on end-user privacy initiatives. Users of these devices are already seeing the impact of these efforts in the form of visual prompts that ask them to actively consent to sharing private data, and this approach has increasingly become the norm. While enterprises that supply employees with laptops can work around these restrictions today, it’s easy to imagine a future where shifts in public opinion, new privacy legislation, and technological innovations come together to impose those exact consent requirements on unsuspecting employers.
Companies that want to stay on the right side of these coming seismic shifts need to prepare now. At a minimum, they should examine the capabilities of the software they use to manage and secure devices, and the precise data those tools can obtain. That simple exercise could reveal a glut of unnecessary data collection, or intrusive features that the security team can disable. From there, organizations should consider how well-informed their employees are about the ethical lines the IT and security teams may have already informally established. Even writing these lines down and distributing them can lead to meaningful conversations that clearly define appropriate and respectful guidelines.
So if there’s one point security and IT professionals and business leaders can agree on, it’s that having imperfect processes and rules is far better than the alternative: waiting for the next incident and simply winging it.
Jason Meller, founder and CEO, Kolide