The COVID-19 pandemic has kept millions of Americans in a state of anxiety for nearly a year, and many have struggled to calm their fears at home while hospitals and other medical facilities are stretched to their breaking points. As a result, many households have turned to preventative home medical devices.
The popular pulse oximeter, which clips onto a finger and gives near-instant readouts of a patient’s pulse rate and blood-oxygen level, has joined the infrared thermometer in medicine cabinets across the United States. For many patients, the $20 device can signal an impending emergency: it measures the oxygen saturation of red blood cells by shining a combination of red and infrared light through the fingertip, in effect checking the redness and opacity of the blood.
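The measurement described above rests on a simple principle: oxygenated hemoglobin absorbs less red light than deoxygenated hemoglobin, so comparing the pulsatile (AC) and baseline (DC) components of the red and infrared channels yields a ratio that maps to saturation. The sketch below illustrates the widely cited "ratio of ratios" computation; the linear calibration constants (110 and 25) are a textbook approximation, not values taken from any particular device or from the study discussed here.

```python
# Illustrative sketch of the "ratio of ratios" calculation behind pulse
# oximetry. The 110 - 25*R line is a commonly cited linear approximation
# of the empirical calibration curve (an assumption for illustration,
# not a manufacturer's formula).

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate oxygen saturation (%) from red/infrared light signals.

    ac_*: pulsatile (AC) amplitude of each light channel
    dc_*: steady (DC) baseline of each light channel
    """
    # Normalizing each channel's pulsatile signal by its baseline is meant
    # to cancel out overall light intensity and tissue thickness; in
    # practice, skin pigmentation can still skew the result.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Oxygenated blood absorbs less red light, so a smaller ratio
    # implies higher saturation.
    spo2 = 110.0 - 25.0 * r
    # Clamp to the physically meaningful range.
    return max(0.0, min(100.0, spo2))

# Example: a ratio near 0.5 maps to roughly 97.5% saturation.
print(spo2_estimate(0.02, 1.0, 0.04, 1.0))  # prints 97.5
```

Because the calibration curve is fitted empirically on test subjects, any skew in who those subjects were carries straight into the formula, which is one reason the biases discussed below can persist in the hardware itself.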
However, the devices can produce misleading results in more than 10% of Black patients, according to a recent study at the University of Michigan. The light-powered “pulse ox,” as many medical professionals call it, may be less effective because of the way darker skin absorbs the light, researchers suspected early on. In addition to home use, medical facilities use the pulse ox to gauge the baseline severity of certain conditions, especially COVID-19, before admitting a patient or turning them away. With the coronavirus disproportionately affecting Black and Hispanic American populations, the device’s failure could mean the difference between life and death for many patients of color.
Pulse oximetry was developed in the 20th century by scientists across Europe, North America, and Japan, and the technology is far from novel today. Still, researchers tested it primarily on lighter-skinned populations, just one of several racial biases embedded in medicine that remain obscured to this day. Coupled with other opaque racial biases in healthcare, even small errors in blood-oxygen-saturation readings can result in BIPOC patients being dismissed from overburdened emergency rooms.
Furthermore, the readings captured by instruments like the pulse ox increasingly feed AI-powered patient databases and the algorithms that assist caregivers in vital decision-making. Unseen errors tied to race and skin tone can hinder risk mitigation in vulnerable populations of color, with far-reaching complications. Problems with pulse oximetry were noted as far back as 2005, leaving many physicians alarmed that manufacturers have yet to correct them. For providers to fight racial bias in healthcare, it is crucial to eliminate any prejudicial errors in ostensibly unbiased hardware.