RSA Conference: The need for human factors research
Ninety percent of computer users don't read the agreements when they sign up for an online service. Most users still use “password” as their password. If there's a “little lock” on the website, it must be safe, right? A number of scenarios demonstrating these points were captured in a series of videos presented during an RSA Conference panel, “The Psychology of Security.” The two panelists for the session were Jakob Nielsen, principal of the Nielsen Norman Group and former research VP at Apple Computer, and Bruce Schneier, renowned author and Chief Security Technology Officer at BT.
The reality is that most computer users – even those in the workplace – judge the websites they visit and the applications they use on uninformed, surface-level assessments. Trust comes down to an overall feeling: if a site “feels” professional and secure, then to the user it must be professional and secure.
Computer users lack the knowledge they need to make good decisions and are forced to rely on assumptions instead. Their perception of what is safe, secure, and trustworthy becomes reality in their minds – and they want to move past the roadblock. The site feels safe, and they really want to buy that cool new gadget, so they use the site and buy the item regardless of the consequences. The reality, however, is that they often get duped.
“It is nearly impossible for computer users to make good judgment calls,” Schneier said during the discussion. In the physical world, people could gauge the size and credibility of a company by looking at its buildings: if they looked big, the company was probably trustworthy and likely to be around for a while, he said. “This method of judgment fails to translate directly to the internet, as there are plenty of websites that appear to be large and stable but simply do a good job at creating a feel-good presence.”
Because users make their computing decisions based on feeling, the economic incentive is to make the user feel secure. One way to accomplish this is to provide a genuinely secure environment and make sure the user notices. Another is to make the user merely think the environment is secure and hope they don't notice otherwise. This is where messages such as “your anti-virus scan completed successfully” and “the latest security update was just applied to your system” come in handy for security vendors. Conversely, this is where a prompt such as “do you want this application to access the internet?” fails: it asks the user for a judgment they are not equipped to make.
“The users don't have the knowledge to make these decisions,” said user advocate Nielsen. “Moreover, they don't care to have the knowledge to make them.” Education is lacking – and because there is little desire to learn, education alone will almost always fail. Password rules are widely known and well understood, yet most people still use “password” as their password – a prime example of where education alone does not work, he said.
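The password example is also where the panel's remedy – vendors enforcing good decisions instead of teaching them – is easiest to picture. A minimal sketch of that idea, with an illustrative blacklist and length rule not taken from the panel, might reject the weak choice at signup rather than warn about it:

```python
# Sketch of enforcing password rules at signup instead of relying on
# user education. The blacklist and the length threshold are
# illustrative assumptions, not anything the panelists specified.

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def is_acceptable(password: str) -> bool:
    """Reject passwords that are too short or on a common-password list."""
    if len(password) < 8:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True

print(is_acceptable("password"))      # False: the classic weak choice is blocked
print(is_acceptable("tr0ub4dor-xk"))  # True: long enough and not blacklisted
```

The point of the sketch is that the user never gets to make the bad decision: the system, acting on the user's behalf, simply refuses it.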
“Users are not rational in the way we would like them to be,” Schneier added. “They make rational decisions, but the decisions that are rational to them may actually appear irrational to most security vendors.”
This means that security vendors can no longer rely on users to make some of the tougher decisions expected of them, he added. One could argue that most, if not all, of the security decisions currently presented to the user should not be presented at all: they should be made by the vendors, on the user's behalf and in the user's best interest. Making these decisions well – and predicting how users will interact with the system as a result – requires an in-depth knowledge of how users think.
To gain this in-depth knowledge, human factors research must play a much larger role in software development – specifically security software development, Schneier said. Without this research and the resulting developments, they both agreed, we will continue to rely on assumptions, false perceptions, and the unfortunate reality of uninformed, irrational-looking decision-making.