
Study: Google can use eyes to predict heart attack risk


A scan of the back of the human eye, left, and how Google's algorithm sees it, right. The green highlights are the parts the algorithm found most helpful in predicting blood pressure. Photo credit: UK Biobank/Google

By looking at the human eye, Google’s algorithms were able to predict whether someone had high blood pressure or was at risk of a heart attack or stroke, Google researchers said Monday, opening a new opportunity for artificial intelligence in the global health industry.
The algorithms didn’t outperform existing medical approaches such as blood tests, according to the study, published in the journal Nature Biomedical Engineering. The work needs to be validated and repeated on more people before it gains broader acceptance, several outside physicians said.
But the new approach could build on doctors’ current abilities by providing a tool that people could one day use to quickly and easily screen themselves for health risks that can contribute to heart disease, the leading cause of death worldwide.
“This may be a rapid way for people to screen for risk,” Harlan Krumholz, a cardiologist at Yale University who was not involved in the study, wrote in an email.
Google researchers fed scans from the retinas of more than 280,000 U.S. and British patients into the company's intricate pattern-recognizing algorithms, known as neural networks. Those scans helped train the networks on which telltale signs tended to indicate long-term health dangers.
Medical professionals can look for similar signs by using a device to inspect the retina, drawing the patient’s blood or assessing risk factors such as age, gender, weight and smoking habits. The algorithms taught themselves by reviewing enough data to learn the patterns often found in the eyes of people at risk.
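The study itself used deep neural networks trained on raw retinal images. As a rough illustration of the underlying idea, that a model teaches itself risk patterns by repeatedly reviewing labeled examples, here is a minimal sketch using a tiny logistic-regression learner on invented numeric "retinal features." The feature names, the risk rule, and all numbers are made up for illustration and are not from the study.

```python
import math
import random

random.seed(0)

def make_patient():
    """Synthetic patient: two invented retinal features plus a risk label."""
    vessel_narrowing = random.random()
    hemorrhage_score = random.random()
    # Invented rule: higher combined values mean higher cardiac risk.
    at_risk = 1 if vessel_narrowing + hemorrhage_score > 1.0 else 0
    return (vessel_narrowing, hemorrhage_score), at_risk

data = [make_patient() for _ in range(2000)]

# Model parameters, learned from the data below.
w = [0.0, 0.0]
b = 0.0
lr = 0.5

def predict(x):
    """Probability that a patient with features x is at risk."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Stochastic gradient descent: the model "teaches itself" by nudging its
# weights a little after each example it reviews.
for _ in range(200):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

correct = sum((predict(x) > 0.5) == bool(y) for x, y in data)
accuracy = correct / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

A real image model learns millions of such weights over pixel patterns rather than two hand-named features, but the training loop follows the same pattern: predict, compare with the known outcome, adjust.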
The true power of this technology is that it could flag risk with a fast, cheap and noninvasive test, letting people know if they should come in for follow-up.
The research, one of an increasing number of conceptual health-technology studies, was conducted by Google and Verily Life Sciences, a subsidiary of Google’s parent Alphabet.
Krumholz cautioned that an eye scan isn’t ready to replace more conventional approaches. Maulik Majmudar, of Massachusetts General Hospital, called the model “impressive” but noted that the results show how tough it is to make significant improvements in cardiovascular risk prediction.
When presented with images of the eyes of two people, one who had a major cardiac event within five years of the photo and one who did not, the algorithms correctly picked the patient who fell ill 70 percent of the time.
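That 70 percent figure describes pairwise discrimination: how often the model assigns a higher risk score to the patient who later had the event. This is equivalent to the c-statistic (area under the ROC curve) commonly reported for risk models. A small sketch, with risk scores invented purely for illustration:

```python
import random

random.seed(1)

def pairwise_accuracy(event_scores, no_event_scores):
    """Fraction of (event, no-event) patient pairs the model ranks
    correctly, counting ties as half. Equivalent to the AUC/c-statistic."""
    wins = 0.0
    for s_event in event_scores:
        for s_none in no_event_scores:
            if s_event > s_none:
                wins += 1.0
            elif s_event == s_none:
                wins += 0.5
    return wins / (len(event_scores) * len(no_event_scores))

# Invented scores: patients who had events tend to score somewhat higher,
# but the two groups overlap, so discrimination is imperfect.
event_scores = [random.gauss(0.7, 0.15) for _ in range(50)]
no_event_scores = [random.gauss(0.5, 0.15) for _ in range(50)]

print(f"pairwise accuracy: {pairwise_accuracy(event_scores, no_event_scores):.2f}")
```

A score of 0.5 on this measure is no better than a coin flip, 1.0 is perfect ranking, and the study's 0.70 sits in between, comparable to conventional risk calculators based on factors like age and blood pressure.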
