Rising costs of medical tests and doctor consultations have prevented many people from seeking medical care. A team of researchers in the USA aim to solve this problem using the camera-equipped mobile devices most of us carry about our persons these days.
“If we can bring more tests and medical information to people directly through their phone or computer, they can start to get help and access the medical system,” says Jason Hoffman, a doctoral student at the University of Washington.
His latest foray into this field uses a smartphone camera and flash as a pulse oximeter that can accurately measure blood oxygen levels down to 70% saturation.
Blood oxygen levels are helpful for monitoring patients with respiratory conditions, such as COVID-19 and asthma. These can cause a decrease in the amount of oxygen entering the bloodstream and reaching the body’s tissues. Normal blood oxygen levels are above 95%, meaning 95% of haemoglobin is carrying oxygen, but if this value drops below 90%, it can be a sign of a deteriorating clinical state and mean that urgent medical attention is needed.
To monitor patients with respiratory conditions, doctors use a pulse oximeter, a device that clips over the fingertip and shines light through it. Haemoglobin, the molecule in red blood cells that carries oxygen around the body, absorbs this light, and the wavelengths it absorbs shift slightly according to how much oxygen it is carrying. Less oxygen alters the level of light absorption, which the device can measure and convert into a saturation reading.
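The underlying idea, comparing absorption at two wavelengths, can be sketched in a few lines of Python using the textbook "ratio of ratios" approach. The calibration constants below are hypothetical, chosen purely to illustrate the calculation; real devices are calibrated empirically against reference measurements:

```python
# Illustrative sketch of the classic pulse-oximetry "ratio of ratios".
# The calibration constants (110 and 25) are hypothetical, for
# demonstration only; real oximeters are calibrated empirically.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate oxygen saturation (%) from the pulsatile (AC) and
    steady (DC) light components at red and infrared wavelengths."""
    # Ratio of ratios: compares pulsatile absorption at the two wavelengths.
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Hypothetical linear calibration curve mapping the ratio to SpO2 (%).
    return 110 - 25 * r

# Lower oxygen shifts absorption at the red wavelength, raising the
# ratio and lowering the estimated saturation.
print(spo2_estimate(0.02, 1.0, 0.04, 1.0))  # healthy-range reading
print(spo2_estimate(0.05, 1.0, 0.04, 1.0))  # markedly lower reading
```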
Smartphones equipped with cameras are also able to measure light of different wavelengths. “Your phone is designed for photography. So, the camera will measure light at red, green and blue wavelengths,” explains Hoffman. And although the wavelengths picked up by a phone camera are not necessarily those used in a pulse oximeter, “your phone can simulate what a pulse oximeter does.”
This is because smartphone cameras can actually measure more wavelengths of light than a pulse oximeter. As a result, they also detect light reflected from tissues other than blood in your finger, such as skin, muscle and fat. Much of the light detected comes from these tissues rather than from oxygen-carrying red blood cells, which presents a problem: how to tell how much of the light absorption is due to the red blood cells alone.
Scientists have solved this problem using a technique known as machine learning. Machine learning involves feeding a computer large sets of data with the intention of training it to search for patterns that are hard to spot by eye. By learning these patterns, the computer will be able to connect them to a particular outcome.
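As a loose illustration of what "learning a pattern" means, here is a toy Python sketch in which a computer discovers a hidden numerical relationship from example data by gradient descent. The data and the hidden rule are invented for demonstration and have nothing to do with the researchers' actual model:

```python
# Toy illustration of machine learning as pattern-finding: fit a linear
# model to synthetic data so the computer "learns" a hidden relationship
# between inputs and an outcome. All numbers are invented for demonstration.
import random

random.seed(0)

# Hidden pattern the computer must discover: y = 2*x1 - 3*x2 + 5
def hidden_rule(x1, x2):
    return 2 * x1 - 3 * x2 + 5

# Training data: many input/outcome pairs, with a little noise added.
data = [(random.random(), random.random()) for _ in range(200)]
labels = [hidden_rule(x1, x2) + random.gauss(0, 0.01) for x1, x2 in data]

# Gradient descent: repeatedly nudge the weights to reduce prediction error.
w1 = w2 = b = 0.0
lr = 0.5
for _ in range(2000):
    g1 = g2 = gb = 0.0
    for (x1, x2), y in zip(data, labels):
        err = (w1 * x1 + w2 * x2 + b) - y
        g1 += err * x1
        g2 += err * x2
        gb += err
    n = len(data)
    w1 -= lr * g1 / n
    w2 -= lr * g2 / n
    b -= lr * gb / n

# The learned weights end up close to the hidden rule's 2, -3 and 5.
print(round(w1, 1), round(w2, 1), round(b, 1))
```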
Hoffman and his colleagues have used machine learning to uncover a pattern that can accurately determine how much of the light detected by a smartphone camera is attributable to the red blood cells alone. As a result, they are able to calculate how much oxygen is present in the blood right down to 70% saturation, the lowest value pulse oximeters need to measure as set by the US Food and Drug Administration (FDA).
“No one has shown that you can measure blood oxygen levels with a smartphone accurately down to 70%. So we’re pretty optimistic about trying to test on a larger group of people in the future,” says Hoffman.
Ensuring that this machine-learned pattern works for the diverse group of people who will want to use the application is critical, though. “The next step is to test on people with a variety of skin tones and skin aberrations so that we can validate whether the app works on a wider range of people,” Hoffman explains.