Source: Dark Reading
20 Sep 2019
A new report explores how attackers identify psychological vulnerabilities to effectively manipulate targets.
“People make mistakes” is a common and relatable phrase, but it’s also a malicious one in the hands of cybercriminals, more of whom are exploiting simple human errors to launch successful attacks.
The Information Security Forum (ISF) explored the topic in “Human-Centered Security: Addressing Psychological Vulnerabilities,” a new report published today. Human vulnerabilities, whether triggered by work pressure or an attacker, can expose a company to cybercrime. As more organizations fear “accidental insiders,” addressing these vulnerabilities becomes critical.
In its report, ISF cites a stat from FireEye, which last year reported 10% of attacks contained malware such as viruses, ransomware, and spyware. Ninety percent of incidents were more targeted — for example, impersonation scams, spear-phishing attacks, and CEO fraud.
“What was clear for me is that if we are going to really try to address some of the more emerging threats that are targeting individuals, then we need to understand some of the ways in which users behave and why they behave,” says ISF managing director Steve Durbin. He points to a “total shift” in the way employees can be managed to optimize security. After all, he says, most don’t turn up for work each day with the intent to cause harm to the company.
The brain has to process a great deal of information before arriving at a decision, yet people have only limited time to make a choice with the data available. This is why the mind seeks cognitive shortcuts, or “heuristics,” to alleviate the burden of decision-making. Heuristics help people solve problems and learn new things more efficiently, but they can also lead to cognitive biases that contribute to poor judgment or mistakes in decision-making.
So long as companies don’t understand the implications of cognitive biases, researchers say, they will continue to pose a significant security risk. ISF’s report lists 12 biases, all of which can have different effects on security. One example is “bounded rationality,” or the tendency for someone to make a “good enough” decision based on the amount of time they have to make it.
Bounded rationality can prove dangerous during a cyberattack, when tensions run high and an analyst may make a “good enough” decision based on the data and tools at their disposal.
Another bias commonly seen in the workplace is “decision fatigue,” or a decrease in mental resources after a series of repetitive choices. At the end of a long day, employees tend to lean toward easier decisions, which may not be the best decisions. “The attacker knows by conducting the attack in late afternoon, it’ll provoke poor decision-making,” Durbin explains.