Researchers in Glasgow are seeking to eliminate gender bias in AI-powered healthcare monitoring systems.

University of Glasgow experts are setting out to ensure male and female patients receive the same high standards of care delivered by sensor-based devices.

Recent advances in radar sensing technology could underpin a new generation of vital sign monitoring, the experts say. 

A number of AI-enhanced vital sign monitoring systems, including the University of Glasgow’s £5.5m Healthcare QUEST, are currently in development. 

The projects are exploring the potential of sensors to track the rhythms of patients’ hearts and lungs without requiring them to wear monitoring devices or be tracked by video cameras.

These less-invasive vital sign monitoring systems will be supported by artificial intelligence technology. The AI will spot the signs of an unexpected change in heart rate or respiration. If it decides medical intervention is required, it can send an alert for help.
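At its simplest, spotting an unexpected change in heart rate can be sketched as comparing each new reading against the patient's recent baseline. The code below is a minimal, hypothetical illustration of that idea (the function name, readings and threshold are assumptions, not part of any Glasgow system):

```python
from statistics import mean, stdev

def detect_anomaly(recent_bpm, latest_bpm, z_threshold=3.0):
    """Flag a reading that deviates sharply from the patient's recent baseline.

    Hypothetical sketch: a real monitoring system would use far more
    sophisticated, clinically validated models than a z-score test.
    """
    baseline = mean(recent_bpm)
    spread = stdev(recent_bpm)
    z_score = abs(latest_bpm - baseline) / spread
    return z_score > z_threshold  # True -> an alert for help could be sent

# Illustrative readings: resting baseline around 70 bpm.
recent = [68, 71, 69, 72, 70, 69, 71, 70]
print(detect_anomaly(recent, 150))  # True  (sudden spike is flagged)
print(detect_anomaly(recent, 72))   # False (within normal variation)
```

In practice the AI would learn each patient's normal rhythms from sensor data rather than rely on a fixed statistical rule, but the alerting principle is the same.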

The technology could help vulnerable groups like older people live more independently at home or in assisted accommodation. It could provide additional insight into the wellbeing of patients staying in hospital wards.

A critically important consideration for any future radar-based health monitoring system is ensuring that its artificial intelligence component is properly trained and equally capable of making correct judgements for patients of either gender, without bias.

A team from the University of Glasgow’s James Watt School of Engineering has won new funding for a project that will examine the potential for gender bias in healthcare AI and find ways to ensure that AI-supported treatment remains equitable.
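One common way to examine a trained model for bias is to compare how accurately it performs for each group of patients. The sketch below is purely illustrative, with fabricated data and hypothetical names; it is not the project's actual methodology:

```python
def accuracy_by_group(records):
    """Compare a model's alert accuracy across patient groups.

    records: list of (group, predicted_alert, actual_alert) tuples.
    Returns {group: accuracy}; a large gap between groups is one
    possible signal of bias in the trained model.
    """
    totals, correct = {}, {}
    for group, predicted, actual in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted == actual:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Fabricated example: the model is noticeably less accurate for one group.
records = [
    ("male", True, True), ("male", False, False),
    ("male", True, True), ("male", False, False),
    ("female", True, False), ("female", False, False),
    ("female", False, True), ("female", True, True),
]
print(accuracy_by_group(records))  # {'male': 1.0, 'female': 0.5}
```

A gap like the one above would indicate that the training data or model needs attention before the system could be trusted to deliver the same standard of care to all patients.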

Dr Nour Ghadban, a research fellow in electronic and nanoscale engineering at the University of Glasgow, is the project’s principal investigator. She said: “New sensors linked with artificial intelligence could offer potentially transformational opportunities to improve the way that we monitor patient wellbeing. 

“However, we can only reap those benefits if we can be sure that the AI systems we use to achieve them are up to the task. We know that all kinds of human bias across race, class, gender and more can be unwittingly incorporated into AI decision-making tools if the proper care isn’t taken when they are being trained on real-world data.

“It’s vitally important that we try to tackle these potential issues as early as possible to ensure that patient safety can be guaranteed, and male and female patients will receive the same high quality of care.”