As we move to a more digitally enabled future, information will continue to grow at an exponential rate. This information will be required in order for us to make better decisions through predictive analytics, Machine Learning and Artificial Intelligence.

To misquote Spider-Man’s Uncle Ben: “With great power comes great responsibility”. (I say “misquote” because in Amazing Fantasy #15, the original version of the phrase appears in a narrative caption in the comic’s last panel. It was not spoken dialogue, but narration: “AND A LEAN, SILENT FIGURE SLOWLY FADES INTO THE GATHERING DARKNESS, AWARE AT LAST THAT IN THIS WORLD, WITH GREAT POWER THERE MUST ALSO COME – GREAT RESPONSIBILITY!”)

As responsible businesses with the ability to wield great power, we will be required to protect that information, not only because we need to comply with regulation, but because it is the right thing to do.


Cybercriminals use phishing and social engineering to defeat data and system security by exploiting human behaviour and psychology to manipulate individuals into unknowingly providing confidential information.

Approximately 95% of cyber attacks and incidents involve preventable human error and behavioural weaknesses.



In his bestselling book Thinking, Fast and Slow, Daniel Kahneman, the world-famous psychologist and winner of the Nobel Memorial Prize in Economic Sciences for his work on behavioural economics, describes two modes of thinking: our System 1 brain and our System 2 brain.

The average adult makes about 35,000 remotely conscious decisions a day. It would be impossible for humans to be analytical about each of these decisions. We would never get anything done. We would be in a constant state of deliberation.

It is for this reason that the human mind evolved shortcuts. Kahneman’s research suggests that our System 1 brain evolved to make faster decisions. These decisions are automatic and intuitive, and the brain uses mental shortcuts and patterns to reach them more rapidly. They tend to be emotionally based and require little effort. In fact, about 95% of our decisions are made using our System 1 brain.

Our System 2 brain is quite the opposite. This mode of thinking is much more analytical and deliberative. It consumes more energy and time, and is used for complex problem solving and deep thinking. It is a slow but reliable type of thinking.

Below are some examples of the two systems in action.


Are the two red lines the same length?

Even if you are familiar with this illusion and know that the lines are the same length, they are still perceived to be different. Measuring the two lines is the only way to confirm that they are, in fact, identical: System 1 delivers the perception automatically, and knowing better does not switch it off.


System 2 thinking requires more analytical thinking. Let’s look at this example:

In a lake, there is a patch of lily pads.

Every day, the patch doubles in size.

If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake?


If you guessed forty-seven, you would be correct. The patch doubles on the last day, so it covers half the lake on day 47. Our System 1 brain tricks us into thinking that the right answer is 24, as this is half of 48. The answer is not difficult to get if we use our analytical System 2 brain and think it through; however, we are often too busy to be analytical about these things, and jump to the quickest answer.
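The doubling logic can be checked with a few lines of Python. The function below simply runs the doubling backwards from the day the lake is fully covered:

```python
def days_to_cover(fraction, total_days=48):
    """Return the day on which the doubling patch first covers
    `fraction` of the lake, given full coverage on day `total_days`."""
    coverage = 1.0  # fraction of the lake covered on the final day
    day = total_days
    while coverage > fraction:
        coverage /= 2  # undo one daily doubling
        day -= 1
    return day

print(days_to_cover(0.5))   # 47 -> one doubling before full coverage
print(days_to_cover(0.25))  # 46
```

One step backwards, not 24 days, is all the System 2 work the puzzle requires.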


Because thinking is hard, our System 1 brain creates mental shortcuts, and uses cognitive biases as the basis for those shortcuts. Buster Benson grouped 175 of these biases into categories, highlighting when we use each of them.

In summary, we create mental shortcuts more often when:

  • There is too much information | So we only notice big changes, some level of weirdness, or repetition, and we look for information that confirms our existing beliefs.
  • Not enough meaning | When there is not enough meaning, humans tend to fill in the gaps with patterns and generalities. We also tend to give others the benefit of the doubt.
  • Not enough time | When we are pressed for time, as we often are, we assume that we are right, believe that we can do it, and look for the easiest option to achieve something.
  • Not enough memory | As humans, we try to save space by editing memories down, generalising information, and keeping examples handy instead of specifics.

Some examples of these cognitive biases are presented below:

Loss aversion bias

This bias suggests that the pain of giving up an object is greater than the utility associated with acquiring it. The effect is exacerbated if the object is scarce.

You are offered a gamble on the toss of a coin. If the coin shows tails, you lose R10,000. If the coin shows heads, you win R15,000.

Is this gamble attractive?

Would you accept it?

While the odds favour you and, rationally, you should take the gamble, for most people the fear of losing R10,000 is more painful than the hope of gaining R15,000 is attractive.
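A small calculation makes the conflict explicit. The expected value of the gamble is positive, yet a loss-averse decision maker who weights losses more heavily than gains experiences it as negative. The loss-aversion coefficient of roughly 2.25 comes from Tversky and Kahneman’s prospect theory estimates; the exact value here is purely illustrative:

```python
# Expected value of the coin toss: lose R10,000 on tails, win R15,000 on heads.
ev = 0.5 * (-10_000) + 0.5 * 15_000
print(ev)  # 2500.0 -> rationally attractive

LAMBDA = 2.25  # illustrative loss-aversion coefficient (Tversky & Kahneman)

def subjective_value(amount, loss_aversion=LAMBDA):
    """Losses loom larger than gains: scale losses by the coefficient."""
    return amount if amount >= 0 else loss_aversion * amount

felt_value = 0.5 * subjective_value(-10_000) + 0.5 * subjective_value(15_000)
print(felt_value)  # -3750.0 -> the gamble *feels* like a losing proposition
```

The same gamble that is objectively worth R2,500 feels like a R3,750 loss, which is why most people decline it.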

Authority bias

This bias is the tendency to attribute greater accuracy to the opinion of someone that is an authority figure and be more influenced by that opinion.

This is because we are taught to respect authority from a young age: respect our parents, respect the police, respect our teachers.

The Milgram experiment is a great example of authority bias in action.

The experiment uses a person of authority, whom we will call the “experimenter”, dressed in a white lab coat to reinforce that authority. The experimenter asks another person, whom we will call the “subject”, to act as a teacher and to test the “learner”, an accomplice actor in the other room, on a series of questions. Each time the learner gets an answer wrong, the subject must administer a shock, and the learner pretends to feel pain. With each incorrect answer the “fake” shock grows larger, and the learner pretends to feel greater pain.

In the experiment, because of the authority figure present, subjects continued to administer increasingly powerful shocks irrespective of the pain.

In Milgram’s original study, around 65% of subjects obeyed the authority of the experimenter to the very end, irrespective of the pain being inflicted on the “fake” learner. Despite the anguish presented, these subjects continued to administer the most painful shocks.


Availability heuristic

The availability heuristic describes the human tendency to rely on information that is most readily available, or that comes to mind most easily and quickly, when making decisions about the future.

An example of this is where investors may judge the quality of an investment based on information that was recently in the news, ignoring other relevant facts (Tversky & Kahneman, 1974).

Imagine this scenario: you are sitting in the airport, waiting to board your aeroplane. You have a newspaper in hand and turn to the second page. There is a headline that reads “Indonesian airline’s Boeing 737 crashes into the Java Sea. All 62 people on board have died”. Your anxiety and heart rate start to increase, and you become more nervous about flying. While statistically your chances of the plane crashing have not changed, the information at hand in that moment biases your judgement about flying.

The availability heuristic can lead to sub-optimal decision-making, because memories that are easily recalled are not sufficient: they do not provide enough depth to allow us to figure out how likely things are to happen again in the future.

It also drives more short-term decision-making, where we do not consider the systemic nature of a problem but instead look for a quick resolution, without considering the broader effects.


Threat actors, who are the perpetrators of these malicious activities, thrive on individuals’ fears, uncertainty, chaos, opportunity and greed. 

According to the World Economic Forum, working from home has exposed us all to more cybercrime.

Covid-19, increased unemployment, fear of losing our jobs, lack of bonuses and reduced salaries have made people more desperate and more susceptible to “get rich quick” scams.

Additionally, remote working has blurred the line between corporate communications infrastructure and personal devices.

According to a report from cloud technology firm Datto, ransomware is still the number one threat; such attacks have increased both in number and in sophistication.

Cybercriminals leverage information that is easily available. High publicity promotions such as Black Friday sales, global events, politics and of course pandemics are amongst the most common strategies employed.  

In 2020, the World Health Organization reported a 500% increase in phishing attacks.

These threat actors rely on techniques that catch the eye of their intended targets and prey upon their hopes and anxieties.

The phishing attacks targeted key human behavioural traits such as:

  • Human desire to help others (our altruistic selves)
  • Authority bias
  • Pro-social behaviour
  • Manipulation of trust
  • Urgency to donate (time scarcity bias)

A widely shared video of Cristiano Ronaldo scoring a goal in the dark elegantly shows how practice can improve our unconscious competence.

The conscious competence learning model (Chapman, 2015) can be used to explain the challenges.

Let us use the example of learning to drive a car.

Unconscious incompetence

The first stage of the learning cycle is where the learner is completely unaware of their incompetence. Before we even get behind a steering wheel, we saw our parents driving as kids and saw other cars on the road, and it seemed incredibly easy. The thought process is: I can do that. It’s not difficult at all. They do it with such ease.

Conscious incompetence

The next stage of the learning cycle is where you get behind the wheel of a car for the first time. You hold down the brake, press in the clutch, and as you release, the car jerks and stalls. You do this a few times, and you realise that it is not as easy as it looks. You are now consciously incompetent: you know that driving a car may not be as easy as it appears, and that you may not be as good at it as you thought you would be.

Conscious competence

After getting someone to be your driving instructor and persevering, you are now able to drive a little better. While you can use a vehicle, you need to consciously be aware of your surroundings; you need to look around a lot more in order to anticipate; you are conscious of when and how much to push the accelerator, and when you need to brake. You are conscious of the passengers’ heads jerking when you depress and release the clutch. While you can drive, you are actively driving, making many conscious decisions each time you get on the road.

Unconscious competence

The final stage is where you have been driving for a while: you get into the car and instinctively know what to do. You continue thinking while driving, but the thinking happens intuitively and subconsciously. Decisions continue to be made, but your concentration can widen. You are more aware of the broader surroundings and risks, rather than being focused purely on the technical aspects of driving.

What does this all mean?

Research shows that most people believe that they won’t be victims of a cyber attack (“it will never happen to me”). When it comes to cyber security, most people fall within the unconsciously incompetent stage. Users are mistakenly informed and think that they can spot the signs of a phishing attack. In fact, the KnowBe4 Cyber-security Threats in Africa report indicates that around 64% of people believe that they would recognise a security incident if they saw one, yet they don’t know what ransomware is.

The report also highlights that 53% of Africans surveyed think that trusting emails from people they know is good enough when it comes to protection from malicious activity.


Common phishing examples map directly onto these biases:

  • Classic phishing emails using the availability heuristic
  • Phishing emails using loss aversion and authority bias
  • Infected attachments
  • Credential phishing

Credential phishing also elicits a confirmation bias: the authentic look and feel of a phishing site or attachment signals to the user that it is in fact a legitimate website.


The best protection from these threats is built around a solid security awareness programme that helps build a better human firewall as a last line of defence. While traditional security awareness programmes are good, they have challenges:

  • The outcomes of information security awareness programmes are difficult to track
  • Security awareness often does not modify behaviour
  • Security is often led by technical teams, who can struggle to articulate the importance of protecting information for the end user
  • The cost of security awareness, if done internally, can be demanding
  • Programmes are often focused on company processes and policies instead of outcomes

How can we overcome some of these challenges using behavioral economics for good?

KnowBe4 is the world’s largest integrated Security Awareness Training and Simulated Phishing platform.

Simulated phishing attacks significantly decrease employees’ vulnerability to these types of attacks.

The salience of clicking on a high-risk email, combined with immediate feedback, provides positive reinforcement.

Repeated simulated phishing tests significantly lower eventual click rates, thus proactively reducing risk and improving cyber resilience (Gordan et al., 2019).

The tool effectively removes the sting from a phishing attack. When a user clicks on a simulated phishing attack, there is no risk to the business. Instead, she is presented with a message informing her that she has clicked on a simulated phishing test, along with awareness of the red flags to look out for the next time an email of a similar nature is received.

Users are also able to track their personal risk scores, based on their activity.

These scores are also rolled up into department and company-wide scores, where security administrators are able to track changes in behaviour over time.
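The roll-up idea can be sketched in a few lines. This is a hypothetical illustration only, not KnowBe4’s actual scoring algorithm: the names, data, and the simple “percentage of simulated phishes clicked” metric are all invented for the example.

```python
from statistics import mean

# Hypothetical data: 1 = clicked a simulated phish, 0 = ignored/reported it.
employees = {
    "finance":   {"thandi": [1, 1, 0], "james": [0, 0, 0]},
    "marketing": {"aisha": [1, 0, 0], "pieter": [1, 1, 1]},
}

def risk_score(clicks):
    """Personal risk score: percentage of simulated phishes clicked."""
    return 100 * mean(clicks)

def department_scores(org):
    """Average the personal scores within each department."""
    return {dept: mean(risk_score(c) for c in staff.values())
            for dept, staff in org.items()}

def company_score(org):
    """Average every individual score across the whole company."""
    scores = [risk_score(c) for staff in org.values() for c in staff.values()]
    return mean(scores)

print(department_scores(employees))
print(round(company_score(employees), 1))  # 50.0
```

Tracking these aggregates over successive simulated campaigns is what lets an administrator see whether behaviour is actually changing.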

KnowBe4 offers a free, no-obligation phishing test for up to 100 employees.

For more information on behavioural economics or information security, or to find out more about how you can use KnowBe4 to improve your employees’ awareness of information security, please feel free to contact us:


+27 (82) 649-9509


+27 (82) 416-9457
