Google announced on Thursday its latest plans for using smartphones to monitor health, saying it would test whether capturing heart sounds and eyeball images could help people identify issues from home.
The company, a unit of Alphabet Inc, is investigating whether the smartphone’s built-in microphone can detect heartbeats and murmurs when the device is placed over the chest, head of health AI Greg Corrado told reporters. Such readings could enable early detection of heart valve disorders, he said.
“It’s not at the level of diagnosis but it is at the level of knowing whether there is an elevated risk,” Corrado said, noting questions remained about accuracy.
The eye research is focused on detecting diseases, such as those related to diabetes, from photos. Google said it had reported “early promising results” using tabletop cameras in clinics and would now examine whether smartphone photos might work, too.
Corrado said his team saw “a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own home.”
Google also plans to test whether its artificial intelligence software can analyse ultrasound screenings taken by less-skilled technicians, as long as they follow a set pattern. The technology could address shortages in higher-skilled workers and allow birthing parents to be evaluated at home.
The projects follow announcements last year about measuring heart and breathing rates using smartphone cameras – features now available on many devices through the Google Fit app.
While Google has long sought to bring its technical expertise to health care, it has said little about whether the efforts are generating significant revenue or usage.
Corrado said launching such capabilities was “a major step” and that adoption would take time.
“When you think about breathing and heart rate, whatever level of adoption we see today only scratches the surface,” he said.