
Technology is a force for good – but how can we avoid replicating human biases?

As technology plays an ever bigger role in health and care, we need to make sure it works for everyone

25th November 2021
"Technology can bring huge improvements in health and care, but as we introduce more and more technology, we have to be careful we don’t build in – and therefore magnify – existing biases." Greg Allen, FCC CEO

Technology is a force for good in health and care. But when it comes to medical devices, I’ve been pondering a few questions:

  • How can we be sure these devices are having the best possible impact at a time when we are trying to tackle wider health inequalities?
  • If research suggests a device or clinical treatment works for one group of people, does that necessarily mean it will work for everyone?
  • Could artificial intelligence (AI) actually accentuate the problem, if such bias is built into algorithms and then magnified?

The latest news suggests that there is much to be done to tackle device-related bias. On Sunday, Sajid Javid announced a review into ethnic and gender bias in medical devices.

The review was prompted by research finding that pulse oximeters, used to measure oxygen levels in the blood, are three times more likely to miss low oxygen levels in black patients than in white ones, potentially putting them at risk.

Ethnic bias can be present in other medical devices, such as remote photoplethysmography, a technology that measures heart rate by analysing live or recorded video and works less well on people with darker skin.

The problem arises because of an historic tendency among researchers and clinicians to treat certain bodies (young, white, male) as the default. This is particularly true in clinical trials.

Avoid magnifying existing biases 

This causes problems throughout health and care, not just with medical devices. For example, dermatology textbooks tend to use pictures showing how conditions present on white skin rather than black skin. Women who have heart attacks are 50% more likely than men to be misdiagnosed, because the symptoms in women differ from those in men. Even the PPE used during the pandemic was found to fit female and ethnic minority staff less well.

We need to support and train the NHS and social care workforce to be as alert as possible to the ways in which patients from different ethnic backgrounds may present with diseases differently, or respond differently to particular treatments.

Technology can bring huge improvements in health and care, but as we introduce more and more technology, we have to be careful we don’t build in – and therefore magnify – existing biases. Take the example of the pulse oximeter mentioned earlier. Increasingly, patients are being given oximeters to self-monitor at home. A doctor aware of the device’s tendency to miss low oxygen levels in black patients can take that into account, but if the patient’s data is being sent from home to a computer system designed to pick up anomalous results automatically, without human intervention, the problem could be missed.
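To make this concrete, here is a minimal sketch in Python of how a fixed alert threshold in an automated monitoring system can miss genuinely low oxygen levels when the device overstates a patient’s readings. The threshold and bias figures are entirely hypothetical, not taken from the research mentioned above:

```python
# Illustrative sketch only: the threshold and bias numbers below are
# hypothetical, not drawn from the pulse oximeter research.

ALERT_THRESHOLD = 92.0  # hypothetical SpO2 (%) below which the system raises an alert

def flag_low_oxygen(reported_spo2: float) -> bool:
    """Return True if the automated system would flag this reading."""
    return reported_spo2 < ALERT_THRESHOLD

true_spo2 = 90.0  # the patient's actual saturation: genuinely low

# An accurate device triggers the alert.
print(flag_low_oxygen(true_spo2))  # True

# A device that overestimates saturation for this patient - the kind of
# skin-tone-related error the research describes - reports 93%, and the
# same patient slips past the automated check.
device_bias = 3.0
print(flag_low_oxygen(true_spo2 + device_bias))  # False: low oxygen missed
```

The point is that a clinician can compensate for a known device bias, but a hard-coded threshold cannot, unless that bias is designed into the system.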

AI increases the need to be scrupulous

Similarly, as we move increasingly towards elements of care being diagnosed, and even delivered, by AI, we have to be particularly scrupulous in making sure that ethnic and gender biases aren’t built into the algorithm as a result of its being developed on a homogeneous population. There are already instances where this has proved to be an issue – for example, an algorithm used in US hospitals was systematically (and erroneously) assessing black patients as lower risk than white patients, because it used past healthcare spending as a proxy for health need, and historically less had been spent on black patients’ care. This meant that black patients were less likely to be referred to particular health programmes.
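A minimal sketch, with entirely made-up numbers, of how that kind of proxy-label bias works: if a model is trained to predict past spending rather than actual health need, it will rank a historically under-served patient as lower risk even when their need is identical:

```python
# Illustrative sketch with made-up numbers: a model trained on a biased
# proxy label (past healthcare spending) rather than actual health need
# reproduces that bias in its risk scores.

# Two hypothetical patients with identical underlying health need, where
# one belongs to a group that has historically received less spending.
patients = [
    {"group": "A", "health_need": 7, "past_spend_gbp": 7000},
    {"group": "B", "health_need": 7, "past_spend_gbp": 4500},
]

def risk_score(past_spend_gbp: float) -> float:
    """A cost-predicting model effectively ranks patients by past spend."""
    return past_spend_gbp / 1000.0

for patient in patients:
    print(patient["group"], risk_score(patient["past_spend_gbp"]))
# A scores 7.0, B scores 4.5 - so patient B, despite identical need,
# is less likely to cross the referral threshold.
```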

So, while technology can in theory bring more personalised care, it can, if we’re not careful, lead to the opposite – decision-making based on generalities rather than specifics.

We therefore welcome Sajid Javid’s review into bias in medical devices as an important first step, but we also need to address the much wider problem of how bias is built into all aspects of health and care, whether it’s textbook illustrations, lists of symptoms, clinical trials or protective clothing. We hope that the review will prompt researchers and health professionals to consider how bias can be tackled throughout health and care.