
The importance of educating Security Personnel and Intelligence Analysts about biases

Updated: Apr 11, 2023

Bias is a topic that many industries like to avoid, and the security and intelligence industries are no exception. However, there is a profound need to discuss biases in the security industry and when educating security practitioners and intelligence analysts. When the subject has been raised in the past, most commentators cannot seem to agree; in fact, they will often argue against the existence of biases or question why there is a need to discuss them in the first place. In this article, we would like to address the topic of bias: what bias is, who has biases, whether biases are wrong, and what types of biases there are. Then, we will highlight WHY it is important for security professionals and intelligence analysts to be able to identify and address their biases, and, finally, we will share HOW one can identify his/her biases.

Now, before we start, there is one thing on which we can all agree: as a security professional, you make assessments not only about incidents or places but also about people. Keep this in mind as we proceed; we will come back to it.

What is bias?

To answer this, we will use the definition from the American Psychological Association:

1. partiality: an inclination or predisposition for or against something. See also prejudice.

2. any tendency or preference, such as a response bias or test bias.

3. systematic error arising during sampling, data collection, or data analysis. See biased estimator; biased sampling.

4. any deviation of a measured or calculated quantity from its actual (true) value, such that the measurement or calculation is unrepresentative of the item of interest. —biased adj.
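Definitions 3 and 4 refer to statistical bias: a systematic gap between an estimate and the true value. As a minimal illustration (a hypothetical Python sketch of our own, not part of the APA definition), the snippet below compares a naive variance estimator that divides by n with the corrected one that divides by n − 1. Averaged over many samples, the naive version consistently undershoots the true value, which is exactly the kind of systematic error the definition describes.

```python
import random

random.seed(42)

TRUE_MEAN = 0.0
TRUE_VARIANCE = 4.0          # population standard deviation is 2.0

def biased_variance(sample):
    """Naive estimator: divide by n. Systematically underestimates the variance."""
    m = sum(sample) / len(sample)
    return sum((x - m) ** 2 for x in sample) / len(sample)

def unbiased_variance(sample):
    """Bessel-corrected estimator: divide by n - 1."""
    m = sum(sample) / len(sample)
    return sum((x - m) ** 2 for x in sample) / (len(sample) - 1)

# Draw many small samples and average what each estimator reports.
n_samples, sample_size = 20_000, 5
biased_avg = unbiased_avg = 0.0
for _ in range(n_samples):
    sample = [random.gauss(TRUE_MEAN, TRUE_VARIANCE ** 0.5) for _ in range(sample_size)]
    biased_avg += biased_variance(sample) / n_samples
    unbiased_avg += unbiased_variance(sample) / n_samples

print(f"true variance:       {TRUE_VARIANCE:.2f}")
print(f"biased estimator:    {biased_avg:.2f}   # systematic error, as in definitions 3 and 4")
print(f"unbiased estimator:  {unbiased_avg:.2f}")
```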

There are a few key words in the definition: "predisposition," "against," "tendency," and "preference." Keep those words in mind when thinking about how they affect the threat assessment of a security professional. While you do that, think of a scenario in which a security guard has to assess, either by observation or by interviews, any visitors in the area for which he/she is responsible. That security guard believes that women are less likely to commit a crime (a bias) and, during his/her threat assessment, misses the subtle indicators that a woman is, in fact, a terrorist. You think perhaps this couldn't occur? It has actually happened. In July 2017 in Mosul, a female suicide bomber, holding her child in her arms, managed to walk past security guards and detonate her bomb.


The security guards, instead of being observant and watching her hands (in which she was holding the detonator), saw only a mother with her child. Many people see women as weak and incapable of committing acts of terror, especially a woman carrying her own child. This is not the only incident in which a miscalculation of the threat, or of the threat actors, has been catastrophic.

Do all people have biases?

Before we answer that, ask yourself: "Are there people, things, or ideas you like better than others? Are there places or events where you feel more comfortable than others?" We are sure your answer to these questions is "yes," and that is because all humans have biases. Some biases are passed to us through evolution and some are learned through socialization and/or direct experience. One must understand that biases serve a purpose. Simply put, the human brain tends to categorize information, people, events, and experiences during the learning and development process; it connects new information and people to past experiences and, once that is done, responds to them in the same way it does to other things in the same category. So, by putting people with similar traits into a specific category, one comes to believe that everyone in that category must be the same. Biases are not limited to race, gender, ethnicity, religion, or social or political groups; many other characteristics may be subject to one's biases, such as physical appearance, sexual orientation, educational level, profession, etc.

Are biases wrong and racist?

When the topic is discussed, most people tend to dismiss their biases, believing that even acknowledging them will label them as racists. The first mistake when talking about biases is made when someone judges someone else as good or bad based on his/her biases.

According to Matt Grawitch, PhD, "Biases make decision-making easier by giving us a starting point, an initial prediction, or a 'leaning of the mind' regarding which choice to make. We anchor our original judgment in the biased conclusion and then adjust it based on supplemental information."

Having biases is not necessarily bad, wrong, or racist. In fact, as discussed above, biases ease the decision-making process and help the human brain categorize new information. We could say that since biases help us simplify information processing, they essentially function as rules of thumb that help us make sense of what is happening around us and make faster decisions.

However, biases can become bad and even dangerous when they lead us to treat or judge someone unfairly, or when the accuracy of the decision is of the utmost importance, as in a behavioural or threat assessment. What shifts a bias from acceptable to "bad" is an individual allowing it to influence the decision-making process in a way that affects someone else negatively, either through unfairness or through a miscalculation of the threat level.

Not being able to recognize and address our biases can lead to neglecting or discounting information that would be valuable for our job functions. Information that we process and use to make decisions can directly affect a risk/threat and vulnerability assessment, an interview with a suspect, the analysis of intelligence and data, or the use of link analysis in putting together an intelligence report. In these situations, biases can become a systematic thinking error that can cloud our judgment, and, as a result, impact our decisions, thus rendering our final product limited or even useless.

What types of biases do people have?

People can have conscious biases (biased attitudes toward specific ideologies, events, groups of people, etc. that we are aware of) or unconscious biases (biases we are not aware of, cannot control, find difficult to access, and that can quite often influence our actions more than conscious biases do).

In one of her articles, Kendra Cherry notes that "some of our cognitive biases are related to memory. The way you remember an event may be biased for a number of reasons, and that, in turn, can lead to biased thinking and decision-making. Other cognitive biases might be related to problems with attention. Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them."

If you are aware of a biased attitude, it is more likely that you will be able to consciously address it during your decision-making process. The unconscious biases, however, are the most "dangerous" ones, since it often takes specific training and self-study to identify that you have them. Here, Carly Hallman lists 50 types of unconscious biases. Have a look and see how one or more of them can affect your decision-making process.

  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.

  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.

  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.

  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.

  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.

  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)

  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.

  8. False Consensus: We believe more people agree with us than is actually the case.

  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.

  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.

  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.

  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.

  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.

  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.

  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.

  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.

  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.

  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.

  19. Automation Bias: We rely on automated systems, sometimes trusting too much in the automated correction of actually correct decisions.

  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.

  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.

  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.

  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.

  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.

  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.

  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.

  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.

  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.

  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.

  30. Gambler’s Fallacy: We think future possibilities are affected by past events.

  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.

  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.

  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.

  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.

  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.

  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.

  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.

  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.

  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.

  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.

  41. IKEA Effect: We place higher value on things we partially created ourselves.

  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.

  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.

  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.

  45. False Memory: We mistake imagination for real memories.

  46. Cryptomnesia: We mistake real memories for imagination.

  47. Clustering Illusion: We find patterns and “clusters” in random data (see the simulation sketch after this list).

  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.

  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.

  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than in ourselves.
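Several of these biases show up whenever analysts look for patterns in data. As a minimal, hypothetical Python sketch (our own illustration, not taken from Hallman's list), the snippet below simulates 1,000 fair coin flips: long streaks appear naturally (the clustering illusion, #47), yet the chance of heads after a run of five heads remains roughly 50 percent (the gambler's fallacy, #30).

```python
import random

random.seed(7)
flips = [random.choice("HT") for _ in range(1000)]   # 1,000 fair coin flips

# Longest run of identical outcomes -- the "clusters" we tend to over-interpret.
longest = current = 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

# How often is the next flip heads after a run of five heads in a row?
after_streak = [flips[i] for i in range(5, len(flips)) if flips[i - 5:i] == ["H"] * 5]
p_heads = after_streak.count("H") / len(after_streak) if after_streak else float("nan")

print(f"longest streak in 1,000 random flips: {longest}")    # typically close to 10
print(f"P(heads | five heads in a row):       {p_heads:.2f}") # ~0.5 in expectation
```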

WHY must security professionals and intelligence analysts undergo bias training?

As a security professional or intelligence analyst, now that you have seen what biases are and how significantly they can affect us, do you see how important it is to recognize and address them during the decision-making process? Do you see how biases can affect your risk and threat assessments, your information gathering and analysis, and your behavioural assessment while you are conducting a first interview with a visitor, a suspicious person, etc.?

We will give you an example. During the Manchester Arena attack investigation, one of the security guards stated that he felt something was "off" about the attacker but was uncertain how to approach him and ask questions (the first interview of a suspect) because he was afraid he was going to be labelled a "racist."

Being trained to recognize and address your biases will not only help you make better decisions but will also give you peace of mind and confidence, knowing that you are approaching and properly interviewing a person whose presence seems unjustified and/or suspicious. You will be able to gather more information and assess the risk without feeling that you are merely racially profiling that person. You will also build more awareness of the areas in which you hold biases, and that awareness will lead to more choices. More choices will lead to a more "open mind" and allow you to seek further information before you make a decision.

On why biases, and training on them, matter to the security industry, we must mention Richard Gasaway, PhD, creator of the Center for the Advancement of Situational Awareness and Decision Making, who has highlighted the fact that "Confirmation bias is particularly challenging to situational awareness because it can prohibit the uptake of critical clues and cues that can foretell impending doom."

Now that we have discussed the many aspects of biases, what they are, and how they can affect your decision-making process, do you want to test yourself and find out what biases you have? You can use one of the many online tests available, such as the Implicit Association Test (IAT) created by Harvard.

This will help you identify the biases you or your staff may hold that can affect risk and threat assessments as well as intelligence gathering and analysis. It will also shed light on your staff's performance and how they interact with others, helping ensure that their decisions are based on the information provided and not just on their own personal biases.

If you are an individual interested in receiving training on biases, or you represent an organization looking to train your employees on this much-needed and important topic, please reach out to us.

Chris Grow

AUS Global Special Services Travel Team

Managing Partner LeMareschal LLC

Denida Zinxhiria Grow

Founder & CEO

Managing Partner LeMareschal LLC
