Can you predict discrimination in the workplace by measuring implicit bias?

I read a thought-provoking piece on the BBC website about implicit bias, amplified by Radio 4’s Analysis programme. I found the topic quite interesting from an employment/recruitment/diversity point of view, so I did some reading around the subject (well, quite a lot of reading).

As the name suggests, implicit bias is bias that we harbour unintentionally. It may be bias against a different ethnic group, the opposite gender or even against people with a different sexual orientation. Implicit bias can be measured (so the theory goes) using Implicit Association Tests (IATs). Harvard University runs the Project Implicit site, which hosts a variety of IATs, and the tests have been taken by around 18 million participants.

In brief, in the IAT for testing racial bias, the subject is shown words and faces and has to pair good/bad words with black and white faces. The computer measures their reaction times and plugs them into an algorithm, which in turn generates the score. IATs are often used in diversity training sessions to demonstrate that even those who pride themselves on their liberal, multi-cultural ethos can harbour in-built prejudices. In this test, most subjects show some kind of pro-white, anti-black bias. So does that mean we’re all a little bit racist? Are these tests predictors of discriminatory attitudes or behaviours in the real world?
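For readers who like to see the mechanics, here is a minimal sketch of the kind of calculation involved. It is not Project Implicit’s actual scoring code; it simply follows the widely described “D score” idea of dividing the difference in mean reaction times between the two pairing conditions by the pooled standard deviation of the trials, and all of the reaction times below are made up.

```python
from statistics import mean, stdev

def iat_d_score(compatible_rts, incompatible_rts):
    """Simplified IAT-style score: the difference in mean reaction times
    between the two pairing conditions, divided by the pooled standard
    deviation of all trials. A larger positive value means the subject was
    relatively slower when pairing, say, black faces with 'good' words."""
    all_rts = list(compatible_rts) + list(incompatible_rts)
    return (mean(incompatible_rts) - mean(compatible_rts)) / stdev(all_rts)

# Invented reaction times in milliseconds for the two blocks of trials
compatible = [620, 540, 580, 700, 610, 650]      # e.g. white+good / black+bad pairing
incompatible = [720, 690, 800, 750, 770, 700]    # e.g. white+bad / black+good pairing
print(round(iat_d_score(compatible, incompatible), 2))   # roughly 1.5 with these numbers
```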

One of the biggest complaints about the use of IATs is what might be called a test-retest problem. To be scientifically reliable, a test should produce broadly the same result each time the same person takes it (or at least results close enough to allow for a statistical margin of error). That doesn’t appear to be the case with IATs, and the published literature on the subject suggests that the test-retest reliability of the IAT is far too low for it to be considered scientifically robust.
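To make that concrete, test-retest reliability is usually reported as the correlation between people’s scores across two sittings of the same test. The sketch below uses invented scores purely to show the calculation; the thresholds in the comments are the conventional psychometric rules of thumb rather than anything specific to the IAT.

```python
from statistics import correlation  # available from Python 3.10

# Invented IAT scores for the same five people taking the test twice.
first_sitting  = [0.62, 0.15, 0.48, -0.10, 0.33]
second_sitting = [0.35, 0.30, 0.20, 0.05, 0.55]

# Test-retest reliability is usually reported as the correlation between
# the two sittings. Conventional psychometric practice treats roughly
# 0.7-0.8 and above as acceptable; published estimates for the race IAT
# are often reported well below that.
print(round(correlation(first_sitting, second_sitting), 2))   # about 0.5 here
```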

The other thing to look at is what the IAT is actually measuring: reaction times. In early versions of the IAT there was a correlation between cognitive processing speed and IAT score; those who were cognitively slower got higher scores, suggesting that they harboured more implicit bias. So (at the risk of highlighting my own implicit bias against older people and showcasing my discriminatory assumptions), an older person might be told that they were more implicitly racist than a younger person simply because of the younger person’s faster cognitive processing speed.
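A made-up arithmetic illustration of that confound (the numbers are mine, not taken from the research): if two people both slow down by the same proportion in the harder pairing, the slower responder still shows the bigger raw latency gap, and a score based on raw gaps would read that as more bias. As I understand it, the later D-score approach of dividing by an individual’s own variability was intended partly to reduce exactly this effect.

```python
# Hypothetical numbers: both responders slow down by 10% in the
# "incompatible" block, but the slower responder shows a larger raw gap.
fast_responder = {"compatible": 500, "incompatible": 550}   # reaction times in ms
slow_responder = {"compatible": 800, "incompatible": 880}

for label, rts in [("fast responder", fast_responder), ("slow responder", slow_responder)]:
    gap = rts["incompatible"] - rts["compatible"]
    print(f"{label}: raw latency gap = {gap} ms")   # 50 ms vs 80 ms
```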

Having said all the above, am I denying that implicit bias exists? Not at all. I am (and I use the word advisedly) “comfortable” that implicit bias exists. How can it not? We live in a world where we’re bombarded with images of stereotypes that reinforce those biases. To quote one researcher, “Implicit bias is really just an emergent property of the larger historical and structural inequities we have.”

So, is there a place for the IAT? As a predictor of real-world behaviour, my reading suggests that there is much more work to be done before we can say with any confidence that someone with a high IAT score will go on to act in a discriminatory way, or that IAT scores can be used as the basis for any sort of decision-making in the employment arena. However, the tests do have a role to play in demonstrating that we harbour unconscious prejudices (however much we like to think that we don’t), and highlighting those prejudices and addressing them through diversity training will ultimately pay dividends for employers in the form of fewer disputes in the workplace.

For more information, please contact Jon Taylor.