The National Cyber Security Alliance (NCSA) and Nasdaq recently hosted their third Cybersecurity Summit. Held in New York in June 2017, the event focused on the inherent challenges of humans in the digital world. While it is clear that simplification and automation are key tools for addressing those challenges, the discussion identified key areas of human impact, both adverse and innovative.
Topics tackled at the NCSA/Nasdaq event included:
- The growth of consumer technology, reliance on that technology and a debate about the future level of cyber awareness we will all need
- Redefining “work in cybersecurity” to ensure our enterprises and governments are properly staffed to operate in the digital economy
- Identifying pockets of behavior linked to human error, and the tools we can leverage to minimize the security risk of human error
As one of the participants on the Minimizing Human Error panel, I had the privilege of joining a lively discussion about the innate risks of human behavior and a variety of mitigation processes. I raised this important point: in a connected world, our families are becoming a whole lot larger! And as our third-party ecosystems expand, so too does our risk related to humans and their inevitable errors. To minimize that particular security risk, we all need to use the full complement of tools available to us.
Those tools define a clear path to mitigation and, moreover, to innovation. For example, we must take the time to identify common behaviors and their impact on cybersecurity. We are all familiar with the simple response to demands for highly complex and lengthy passwords – a sea of sticky notes with saved passwords, so we can “remember.” Yet that natural response inevitably increases the risk of unauthorized access. Now, compound that behavior across the entire population of our digital ecosystems. Recognizing the human reality of this security memory burden, NIST's recent draft SP 800-63-3 proposes a solution to the problem: the use of highly personal passphrases that are the memorized secrets of each user. For a summary of NIST's passphrase guideline criteria, see Greg Masters' recent SC Magazine article “Shift in Password Strategy from NIST”.
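To make the shift concrete, here is a minimal sketch, in the spirit of the draft guidance, of what a memorized-secret check looks like when length and blocklist screening replace composition rules. The function name and the tiny blocklist are illustrative assumptions, not part of the NIST text; a real deployment would screen against a large corpus of breached passwords.

```python
# Illustrative sketch: accept long, personal passphrases; reject only
# candidates that are too short, too long, or on a known-bad list.
# COMMON_PASSWORDS is a tiny stand-in for a real breached-password corpus.
COMMON_PASSWORDS = {"password", "12345678", "qwertyuiop", "letmein1"}

def acceptable_memorized_secret(secret: str,
                                min_len: int = 8,
                                max_len: int = 64) -> bool:
    """Return True if the candidate passes the basic length and blocklist checks."""
    if not (min_len <= len(secret) <= max_len):
        return False
    # No composition rules: a memorable passphrase is fine as-is.
    if secret.lower() in COMMON_PASSWORDS:
        return False
    return True

# A memorable passphrase passes; a short "complex" password does not.
print(acceptable_memorized_secret("correct horse battery staple"))  # True
print(acceptable_memorized_secret("P@ss1"))                         # False
```

Notice what is absent: no mandatory digits, symbols, or mixed case. That is the point of the guidance – the burden shifts from the user's memory to the verifier's screening.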
Working with our third-party ecosystem partners, we can leverage a comprehensive architecture and automation to accommodate some aspects of human error. Yet we are still left with the reality of the human factor. It is up to us as security advocates and practitioners to build humanity in!
What might building humanity into our security architectures, process, devices and applications look like? Consider the following as a foundation:
- Use a security development lifecycle (SDL) mindset and apply it to all physical and operational security, both hardware and software.
- As part of the development and deployment of security processes, evaluate the likely human “workarounds” or mistakes. Then build to accommodate them.
- Offer flexibility in security implementation – a rigid architecture will not necessarily result in a higher percentage of the security goals being achieved. Stating the final goal, rather than prescribing a specific method for getting there, often has a greater degree of success.
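As a sketch of the "build to accommodate" idea above, consider failed logins. A rigid three-strikes lockout invites the very workarounds it should prevent (help-desk resets, shared accounts); a progressive delay forgives honest typos while still slowing an attacker. The function below is a hypothetical illustration of that design choice, not a policy from the article.

```python
# Illustrative sketch: progressive backoff on failed logins instead of a
# hard lockout. Early mistakes cost almost nothing; sustained guessing
# gets expensive, capped so a legitimate user is never locked out forever.
def backoff_seconds(failed_attempts: int,
                    base: float = 0.5,
                    cap: float = 30.0) -> float:
    """Delay before the next login attempt: doubles per failure, up to a cap."""
    if failed_attempts <= 0:
        return 0.0
    return min(cap, base * (2 ** (failed_attempts - 1)))

# A single typo costs half a second; eight straight failures hit the cap.
print(backoff_seconds(1))  # 0.5
print(backoff_seconds(3))  # 2.0
print(backoff_seconds(8))  # 30.0
```

The goal (slow down guessing) is stated; the mechanism stays flexible, which is exactly the spirit of the third bullet.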
Let me offer an example of the need to build humanity in. Last year a fatality occurred due to a combined technology/human failure of Tesla's Autopilot. The technology failed because it could not recognize a white tractor-trailer against the sunlit background and did not engage the brakes. The human failed because, despite being able to see the vehicle in front of him, he had grown accustomed to relying on Autopilot. The reality: the handoff between machine and human did not adequately address innate human behavior. If the car drives itself, we do not pay attention and, in fact, do not even bother to hold onto the steering wheel.
To quote a cliché, “to err is human.” That's why I firmly believe that human-machine teaming is the answer: the combination of human and artificial intelligence is more powerful than either alone, and together they can make our cyber world a safer place.