Despite decades of cybersecurity training on phishing and social engineering, attacks continue to be highly successful. This is because these attacks exploit human vulnerabilities and organizational weaknesses rather than technology weaknesses, which can more easily be addressed with tooling.
This article explores the reasons behind this persistent vulnerability, focusing on the human factors at play and the changes organizations could implement to mitigate these threats. To address the problem effectively, it is crucial to first understand the underlying human factors that continue to contribute to the success of these attacks and the psychological triggers attackers exploit.
Human Factors Driving Phishing and Social Engineering Success
Cognitive Bias
Cognitive biases are inherent human tendencies that lead to systematic deviations from rational judgment and decision-making. Attackers keep finding new ways to exploit these biases, manipulating individuals into taking actions they would not otherwise take, such as clicking on a malicious link or divulging sensitive information. The three primary triggers in this category act on our sense of urgency, authority, and trust.
Urgency
Attackers frequently exploit the human tendency to react impulsively under pressure. By creating a sense of urgency, they attempt to bypass rational thought and induce victims to make hasty, ill-considered decisions.
This can be achieved through various tactics, including setting artificial deadlines and claiming that immediate action is required to avoid negative outcomes. Attackers may also fabricate emergencies, inventing scenarios that evoke fear or anxiety and prompt victims to act quickly without verifying the situation.
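To make this concrete, the sketch below is a minimal, illustrative example of how such urgency cues might be surfaced in an email subject or body. The phrase list, threshold, and function names are hypothetical and far simpler than the signals production filtering tools rely on.

```python
# Minimal sketch of flagging urgency cues in a message.
# The phrase list and threshold are hypothetical, for illustration only.
URGENCY_PHRASES = [
    "immediate action required",
    "within 24 hours",
    "account will be suspended",
    "final notice",
    "act now",
    "urgent",
]

def urgency_score(text: str) -> int:
    """Count how many urgency phrases appear in the message text."""
    lowered = text.lower()
    return sum(1 for phrase in URGENCY_PHRASES if phrase in lowered)

def looks_urgent(text: str, threshold: int = 2) -> bool:
    """Flag messages that stack multiple urgency cues."""
    return urgency_score(text) >= threshold

sample = ("URGENT: Immediate action required - your account will be "
          "suspended within 24 hours unless you verify your details.")
print(urgency_score(sample), looks_urgent(sample))  # 4 True
```

Even this toy example shows why stacked urgency cues are such a reliable tell: attackers rarely rely on a single pressure phrase.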
Authority
Social engineers also leverage authority, or the appearance of it, to manipulate their victims. This tactic exploits the human tendency to obey authority figures; in almost all cultures, people are raised to respect hierarchy in government, business, and even the family. Attackers may impersonate individuals, such as company executives or government officials, or forge emails, websites, or documents to appear legitimate.
They may also use authoritative language, official-sounding jargon, or a sense of fear to intimidate, confuse, and pressure victims into compliance. Additionally, attackers may exploit social or cultural norms in which people tend to defer to authority figures.
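As a simple illustration of how executive impersonation often surfaces in practice, the sketch below compares an email’s display name against the sender’s domain to catch display-name spoofing. The executive list and trusted domain are hypothetical, and real systems lean on much broader signals (DMARC results, sender history, and so on).

```python
from email.utils import parseaddr

# Hypothetical data: names attackers commonly impersonate and the
# organization's legitimate sending domain.
KNOWN_EXECUTIVES = {"jane doe", "john smith"}
TRUSTED_DOMAIN = "example.com"

def flags_executive_spoof(from_header: str) -> bool:
    """Return True when the display name matches a known executive but the
    address does not come from the trusted domain (display-name spoofing)."""
    display_name, address = parseaddr(from_header)
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    return display_name.strip().lower() in KNOWN_EXECUTIVES and domain != TRUSTED_DOMAIN

print(flags_executive_spoof('"Jane Doe" <jane.doe@freemail.test>'))  # True
print(flags_executive_spoof('"Jane Doe" <jane.doe@example.com>'))    # False
```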
Trust
Attackers frequently exploit the human tendency to trust others, especially those they have a relationship with. One way they do this is by taking advantage of existing relationships; they may impersonate colleagues, friends, or family members to gain trust.
Another tactic is to build rapport and a sense of familiarity over time, which attackers may combine with social proof, such as testimonials or endorsements from others, to create a false sense of legitimacy and trustworthiness. Finally, attackers may feign empathy, offer assistance, or establish a personal connection to build trust with their victims before making their requests.
Emotional Manipulation
Emotional manipulation involves using someone’s feelings to control their behavior or decisions. In the context of phishing and social engineering, attackers exploit emotions like fear, greed, or curiosity to trick individuals into actions they wouldn’t normally take, such as clicking a malicious link or revealing sensitive information.
Fear
Attackers frequently use fear to manipulate their victims. The fear of exposure, a common tactic, involves threatening to reveal sensitive, embarrassing, or incriminating information about the victim unless their demands are met. This information could be personal, financial, or professional, and the threat of its exposure can cause significant distress and anxiety.
The use of fear can be particularly effective in social engineering attacks, where the attacker manipulates the victim’s emotions to gain trust and access to sensitive information or systems. By exploiting the victim’s fear, the attacker can pressure them into giving up information that is then used to bypass security measures and achieve the attacker’s objectives.
Greed
Attackers often leverage human greed, a strong motivator, to manipulate their victims. They may tempt individuals with promises of significant financial gain, exclusive opportunities, or other enticing rewards in exchange for actions such as divulging sensitive information, granting unauthorized system access, or transferring funds under false pretenses.
Attackers may also target those experiencing financial difficulties, exploiting their desperation and vulnerability to coerce them into compromising their personal or professional integrity. The deceptive allure of “get-rich-quick” schemes, fraudulent investment opportunities, discounts, and prizes preys on individuals’ greed.
Curiosity
Attackers frequently exploit human curiosity to manipulate their victims. This tactic often involves crafting emails or messages with captivating or shocking subject lines to entice the recipient to open and engage with the content.
These messages may contain sensational news stories, provocative images, or alarming warnings to pique the recipient’s interest. Once the recipient opens the message, they may be exposed to malicious links, attachments, or social engineering tactics designed to steal sensitive information or install malware.
Information Overload
The influx of data and information in the modern workplace has reached a point where employees feel overwhelmed and struggle to keep up. Remember the early days of the technology revolution, which promised less work and more productivity because the computer was going to free up our busy lives.
The constant stream of emails, notifications, messages, and updates, both professional and personal, creates a sense of information overload, making it difficult for employees to focus, prioritize tasks, and make effective decisions.
Fatigue
The demanding nature of work and personal responsibilities, mixed with a constant stream of security alerts and warnings, can exhaust employees. This fatigue can result in significant lapses in judgement: employees may begin to dismiss genuine threats or inadvertently click on malicious links due to exhaustion and lack of focus.
Distraction
Modern work and life are filled with distractions. Employees often juggle multiple tasks and face constant interruptions, which can negatively impact their ability to maintain cognitive focus.
These distractions increase the likelihood of errors, and such mistakes can have serious consequences, potentially leading to data breaches, malware infections, or other security incidents.
Normalization
The occurrence of frequent, albeit minor, security incidents can lead to a dangerous sense of complacency within an organization. Employees might become desensitized to security threats, viewing them as routine and not warranting serious attention.
This normalization can result in a lax attitude that makes the organization more susceptible to attacks: when employees fail to take security threats seriously, they are less likely to follow procedures diligently, creating opportunities for cybercriminals to exploit.
Change Tech or Change Thought
The techniques mentioned above — cognitive bias, emotional manipulation, and information overload — are not new in the realm of cyberattacks. They have been exploited for years to deceive and manipulate individuals, with some cybercriminal organizations even employing professionals in the field to assist in attack design.
The question that arises, then, is why these issues persist despite more than two decades of concerted effort to combat them. We have implemented policy enforcement, conducted awareness training, fostered a culture of reporting suspicious activity, and even implemented consequence management, yet the problem remains pervasive across industries.
Are these human factor triggers simply so ingrained in our cognitive processes that they are impossible to completely eradicate? Are we destined to live with the constant threat of social engineering attacks and susceptibility to phishing, much like we have learned to coexist with the persistent threat of computer viruses for more than 30 years?
Still The Weakest Link
It’s a sobering thought that despite technological advancements and increased awareness, the human element continues to be possibly the weakest link in cybersecurity. The reality is that attackers are constantly evolving their tactics, finding new ways to exploit our inherent human vulnerabilities. As we become more aware of one type of attack, they simply shift their focus to another, preying on our fears, our trust, and our desire for convenience.
We will likely continue attempting to eliminate human susceptibility to these triggers through focused security awareness training and by developing more robust and adaptive defense mechanisms, such as advanced threat detection using artificial intelligence. This may actually create a situation in which ongoing education and training focused on building resilience to social engineering tactics is scaled back as we become more dependent on tools and systems.
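For a sense of what even the most rudimentary form of such automated detection looks like, here is a minimal sketch of a bag-of-words text classifier trained on a tiny, made-up set of messages. It assumes scikit-learn is available and is nowhere near the sophistication of the AI-based detection referenced above; it is included purely to illustrate the tooling-centric direction, not to endorse it as a substitute for human resilience.

```python
# Minimal illustrative phishing-text classifier; assumes scikit-learn is installed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set purely for illustration.
messages = [
    "Urgent: verify your account within 24 hours or it will be suspended",
    "Congratulations, you have won a prize, click here to claim it",
    "Please review the attached invoice immediately and wire payment",
    "Lunch meeting moved to 1pm, see you in the usual room",
    "Here are the slides from yesterday's project review",
    "Reminder: team offsite agenda attached, no action needed",
]
labels = ["phish", "phish", "phish", "legit", "legit", "legit"]

# TF-IDF features feeding a naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["Your mailbox is full, act now to avoid suspension"]))
```

The very simplicity of the sketch underlines the point of this section: classifiers learn yesterday’s lures, while attackers keep writing tomorrow’s.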
An Alternative Approach
Organizations have historically invested significant resources in addressing security concerns largely from the technological and procedural standpoint. Think about the annual ritual of confirming that you know the company’s policies and abide by the acceptable use policy.
However, if the root of these issues lies in human behavior, then the solution may not be found in stricter procedures and protocols, but rather in enhancing human-centric skills. This could include training employees to recognize and mitigate their inherent biases, fostering stronger and more resilient team members who are less susceptible to emotional manipulation, and implementing strategies to manage workload and stress more effectively.
While many Human Resources departments may argue that they already incorporate these elements into their programs, that view does not recognize the distinction between Human Capital Development and Organizational Risk Management. The former focuses on enhancing individual skills and performance for career advancement, while the latter prioritizes identifying and mitigating risks that could jeopardize the organization’s overall security and well-being.
Therefore, to truly address the human factor, organizations may need to shift their approach from a purely technical and procedural perspective to one that recognizes and prioritizes the human element. This involves not only training and drilling employees on security protocols but also equipping them with the skills and knowledge to identify and manage risks arising from their own inherent biases, emotional vulnerabilities, and workload pressures.
By integrating these human-centric elements into their risk management strategies, organizations can create a more secure and resilient environment that is less susceptible to attacks exploiting human factors.
The challenge of human factors in cybersecurity will remain a complex and multifaceted one. We are human, after all. It requires a holistic approach that addresses both the technological and the human aspects of the problem, and while we may never completely eliminate the threat of social engineering and phishing attacks, we can certainly mitigate their impact by recognizing and working on our internal triggers.