COMBATING BIAS IN CRIME SCENE INVESTIGATIONS

By Jason Jones
This article was originally published in the Georgia State Division of the International Association for Identification's Second Quarter 2025 newsletter and is republished here with permission.

Crime scene investigation is an ever-advancing endeavor that is becoming increasingly critical in our criminal justice system. Over the decades, investigators have evolved from people who “shot some pictures” and “put some stuff in a bag” to analytical machines capable of breaking down complex crime scenes into their most basic building blocks and making sense of what occurred inside the scene. But crime scene investigators are human; we are not infallible, and we make mistakes. Some of the things that limit us are our experience, our training, and even the integrity of the scene itself. Sometimes things happen at the scene that are not related to the crime we are investigating, and we have no way of knowing it. Recently, I assisted a law enforcement agency with a bloodstain pattern analysis, identifying two impact patterns inside the scene, only to later find out one of the patterns was caused by EMS picking up a decedent. To be fair, the defendant in this case admitted to beating the decedent, so perhaps I saw that impact pattern and attributed it to the crime because I was expecting to see it there.

This brings me to my point: bias, and how it can affect the conclusions we draw. I, like many of you reading this, have sat through implicit bias training programs, and sometimes it’s disheartening to be told “you suffer from bias.” I’ve been a cop, or involved in public safety/law enforcement training, for a little over 25 years, and I still recall the sting of being accused of being biased. Me, biased? How insulting. I see everything openly, honestly, and objectively, and I am totally free of any outside influence. How dare they accuse me of such. Here is where reality sets in . . . we have bias. All of us, even me, even you. It creeps into our minds as we go through life, sets up shop, and influences us.

Bias takes many forms and comes in a variety of flavors. Two biases that plague us as investigators, for example, are confirmation bias and desirability bias. Confirmation bias occurs when we see the things we expect to see while excluding other relevant data. Perhaps my knowledge that I was looking at a blunt force/beating case influenced me to see an impact pattern and to exclude the possibility that some other event created that pattern on the wall. Once I saw relevant evidence to the contrary, I did change my opinion as to the pattern’s origin.

Another bias we have to worry about is desirability bias. I remember when I started policing in the late 90s, I was told, “Women never shoot themselves to commit suicide.” And while men do use a firearm to commit suicide at a far greater rate, women do commit suicide using firearms (Miller et al., 2022). So if, for example, I believe women don’t kill themselves with firearms, and I investigate a woman’s death by firearm, I may begin to look for evidence that supports an assault. Best-case scenario, I end up with an unsolved homicide; worst-case scenario, I arrest someone for a murder they did not commit. I think we can all agree either of these outcomes is terrible.

Now, I know what some of you may be thinking: “I am way too smart to be influenced by this bias stuff.” Get ready for some bad news . . . that is the “I’m not biased” bias, or blind spot bias, and highly intelligent people may be more susceptible to it than other people (Pronin et al., 2002). It sounds insane, but being able to process information faster (in other words, being smart) may actually lead you to fall for stereotypes or for your own bias.

Don’t worry, it gets worse! Many may think, “I know how to combat bias: it’s through training!” Allow me to introduce you to the Dunning-Kruger effect, a cognitive bias that manifests as a belief that you are more competent at a task than you actually are. Generally, this happens after you learn a new skill. As psychologist and author Adam Grant describes it, when you learn a new skill, you sit firmly atop the peak of Mount Stupid, believing you have mastered it, only to later find out you don’t know all that you don’t know (2021, p. 43). Our confidence is high, but our competence is low.

Here’s an example. I used to watch a television show on the History Channel called “Forged in Fire,” a knife-making competition where contestants would build a knife to certain specifications and then test it, hoping it didn’t break. After watching about 10 seasons, I fancied myself an expert knife maker. I had all the knowledge but had never actually made a knife. In 2020, we moved to Tennessee, and I was lucky enough to find a forge in Pigeon Forge that allowed you to make knives. I took my wife and daughter, and we went to create our own knives. I opted to make a knife out of a railroad spike. I quickly discovered it is much easier to watch someone hit hot steel with a hammer on TV than it is to be the one swinging the hammer. After about five minutes my arms felt like lead and my hands were numb from the impact.
Once I was done, I had what looked like a slightly deformed railroad spike. This was my humbling event, the one that knocked me off Mount Stupid and brought me firmly back to reality. I had the broad knowledge of knife making (you heat metal until it’s malleable and hit it with a hammer), but I lacked the deeper knowledge of where to hit it and how to make it take on the form of a blade. This same phenomenon happens once people take their first Bloodstain Pattern Analysis course. They have just learned a great amount of information and now know much more than they did before about bloodstains. The issue is, we don’t know the things we don’t know yet. If at this point in the article you are doubting every decision you ever made, then GOOD!

So where do we find balance, and how do we overcome bias? Step one is to recognize and acknowledge that bias is real and that we are subject to its influence. We have to accept that no one is fully bias-free. Naturally, we are more inclined to see ourselves as objective and to overlook the subtle ways bias influences our decisions. Go to the bias training your agency offers, listen to and critically examine what people have to say about bias, then be self-reflective about your own ideas. Begin by understanding that we can never be too certain of what we are uncertain of. In other words, while we work cases and look for the best explanation for what we see or what we believe, we have to keep an open mind. We should never be so convinced of our conclusion that we refuse to accept any other possible explanation than the one we see before us. If we are presented with new data, we must be prepared to change our minds. British socialist and playwright George Bernard Shaw said, “Progress is impossible without change; and those who cannot change their minds cannot change anything.” Next, adopt the scientific approach to data analysis: if you have a hypothesis, test it.
Any initial bloodstain pattern analysis course should contain blocks of instruction on the scientific method. Have your work peer-reviewed; I believe in the blind peer review process. Essentially, I share with my peer reviewer the work I was given and allow them to form their own opinions without input from me, because I don’t want to bias the reviewer. Another step in reducing bias’s influence on your work is to consider that your opinions may be wrong. In other words, play devil’s advocate: take your work, try to tear it apart, and find where your opinions are weakly supported and why. Finally, slow down in your decision-making. In general, when we are forced to make on-the-spot decisions, we don’t make the best decisions. That is why the high-pressure sales tactics employed at automotive dealerships are so successful. If someone is rushing you to give them answers on a case, they probably don’t want the truth; they just want you to confirm what they believe. Sound familiar? It sounds a lot like confirmation bias, doesn’t it?

References:

Grant, A. (2021). Think again: The power of knowing what you don't know. Random House.

Miller, M., Zhang, Y., Prince, L., Swanson, S. A., Wintemute, G. J., Holsinger, E. E., & Studdert, D. M. (2022). Suicide deaths among women in California living with handgun owners vs those living with other adults in handgun-free homes, 2004-2016. JAMA Psychiatry, 79(6), 582.

Pronin, E., Lin, D. Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28(3), 369-381. https://doi.org/10.1177/0146167202286008
