written by

William Webster

Researcher, Avoncourt Partners GmbH

Culture Blog - Nov 30, 2018

Smartphones and Selfies: Sourcing AI Healthcare Diagnostics

Who would have thought that an everyday device could be transformed into a source of expert healthcare diagnostics?

Smartphones contain vital data

Almost all consumers now have access to devices with sensors that can collect valuable data about their health. Smartphones with step trackers and apps that can track a heartbeat show that a growing proportion of health-related data is generated on the go.

Collecting and analyzing this data – and supplementing it with patient-provided information through apps and other home monitoring devices – can offer a unique perspective into individual and population health. AI will play a significant role in extracting actionable insights from this large and varied treasure trove of data.
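To make that idea concrete, here is a minimal sketch of how continuously collected wearable data might be screened before a clinician ever looks at it. The file name, column layout, and threshold are illustrative assumptions, not a description of any real product pipeline.

```python
# Screen exported heart-rate readings for values that deviate sharply
# from a rolling baseline. Column names and thresholds are assumptions.
import pandas as pd

def flag_heart_rate_anomalies(csv_path: str, z_threshold: float = 3.0) -> pd.DataFrame:
    """Return readings that sit far outside the wearer's recent baseline."""
    readings = pd.read_csv(csv_path, parse_dates=["timestamp"])  # columns: timestamp, heart_rate
    readings = readings.sort_values("timestamp").set_index("timestamp")

    # A rolling 24-hour window smooths out normal daily variation.
    baseline = readings["heart_rate"].rolling("24h").mean()
    spread = readings["heart_rate"].rolling("24h").std()

    # Flag readings more than `z_threshold` standard deviations from baseline.
    readings["anomaly"] = (readings["heart_rate"] - baseline).abs() > z_threshold * spread
    return readings[readings["anomaly"]]

# Hypothetical usage with a file exported from a phone's health app:
# anomalies = flag_heart_rate_anomalies("heart_rate_export.csv")
# print(anomalies.head())
```

Even a simple filter like this illustrates the point Arnaout makes below: continuous, granular data lets software surface the handful of moments worth a clinician's attention.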

“As a society, we’ve been pretty liberal with our digital data,” says Omar Arnaout, MD, Co-director of the Computational Neuroscience Outcomes Center and an attending neurosurgeon at Harvard’s Brigham and Women’s Hospital. “But as things come into our collective consciousness like Cambridge Analytica and Facebook, people will become more and more prudent about who they share what kinds of data with.”

However, patients tend to trust their physicians more than they might trust a big company like Facebook, he added, which may help to ease any discomfort with contributing data to large-scale research initiatives.

“There’s a very good chance [wearable data will have a major impact] because our care is very episodic and the data we collect is very coarse,” said Arnaout.  “By collecting granular data in a continuous fashion, there’s a greater likelihood that the data will help us take better care of patients.”

Selfies for diagnostics

Nowadays it is becoming practical to harness the power of portable devices for other purposes. Experts believe that images taken with smartphones and other consumer-grade devices will be an important supplement to clinical-quality imaging, especially in underserved populations or developing nations.

The quality of cell phone cameras improves every year, and they can now produce images that are viable for analysis by artificial intelligence algorithms. Dermatology and ophthalmology are early beneficiaries of this trend.
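As a rough illustration of what "analysis by artificial intelligence algorithms" means in practice, the sketch below runs a phone photo through a small image classifier. The weight file ("lesion_model.pt") and the two-class labels are hypothetical placeholders; a real clinical system would require a validated, regulator-approved model.

```python
# Classify a smartphone photo of a skin lesion with a fine-tuned CNN.
# Model weights and labels are illustrative assumptions.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_lesion_photo(image_path: str) -> str:
    """Return a coarse triage label for a phone photo of a skin lesion."""
    model = models.resnet18()
    model.fc = torch.nn.Linear(model.fc.in_features, 2)  # benign vs. needs review
    model.load_state_dict(torch.load("lesion_model.pt", map_location="cpu"))
    model.eval()

    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)

    with torch.no_grad():
        logits = model(batch)
    labels = ["benign-appearing", "refer for specialist review"]
    return labels[int(logits.argmax(dim=1))]
```

The heavy lifting happens in training and validation, not in this inference step; the point is simply that a consumer-grade photo is now good enough raw material for this kind of model.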


Researchers in the United Kingdom have even developed a tool that identifies developmental diseases by analyzing images of a child’s face. The algorithm can detect distinct features, such as a child’s jaw line, eye and nose placement, and other attributes that might indicate a craniofacial abnormality. Currently, the tool can match ordinary photographs to more than 90 disorders to provide clinical decision support.
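To give a flavor of the kind of geometric features such a tool might compute, here is a minimal sketch that extracts facial landmarks from an ordinary photo and derives simple distance ratios. It uses MediaPipe Face Mesh for landmark detection; the specific landmark indices and ratios are illustrative assumptions, not the UK researchers' published method.

```python
# Extract simple, scale-invariant facial geometry from a photo.
# Landmark indices and ratios are illustrative, not a clinical method.
import cv2
import mediapipe as mp

def facial_geometry_features(image_path: str) -> dict:
    """Compute distance ratios between facial landmarks in a single photo."""
    image = cv2.imread(image_path)
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
        result = mesh.process(rgb)
    if not result.multi_face_landmarks:
        return {}

    pts = result.multi_face_landmarks[0].landmark

    def dist(a: int, b: int) -> float:
        return ((pts[a].x - pts[b].x) ** 2 + (pts[a].y - pts[b].y) ** 2) ** 0.5

    # Indices follow the Face Mesh topology: 33/263 outer eye corners,
    # 1 nose tip, 152 chin. Ratios are normalized by inter-eye distance
    # so they are roughly invariant to image scale.
    inter_eye = dist(33, 263)
    return {
        "nose_to_chin_ratio": dist(1, 152) / inter_eye,
        "eye_to_nose_ratio": dist(33, 1) / inter_eye,
    }
```

Features like these could then feed a classifier trained to flag patterns associated with craniofacial abnormalities; the classifier itself, and the clinical validation it would need, are outside the scope of this sketch.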

“The majority of the population is equipped with pocket-sized, powerful devices that have a lot of different sensors built in,” said Hadi Shafiee, PhD, Director of the Laboratory of Micro/Nanomedicine and Digital Health at BWH.

“This is a great opportunity for us.  Almost every major player in the industry has started to build AI software and hardware into their devices.  That’s not a coincidence.  Every day in our digital world, we generate more than 2.5 million terabytes of data.  In cell phones, the manufacturers believe they can use that data with AI to provide much more personalized and faster and smarter services.”

Using smartphones to collect images of eyes, skin lesions, wounds, infections, medications, or other subjects may be able to help underserved areas cope with a shortage of specialists while reducing the time-to-diagnosis for certain complaints.

“There is something big happening,” said Shafiee. “We can leverage that opportunity to address some of the important problems we have in disease management at the point of care.”