Our selfies, ourselves: What doctors can learn from our smartphone pics

Smartphone cameras may be used for selfies more than any other purpose, but those selfies can now reveal a lot more than our levels of vanity. By taking photos of our own bodies – including body parts (and bodily fluids) – patients and physicians are now getting a far more detailed view of our well-being, tracking our health, making diagnoses, and so much more.

It helps that the cameras themselves are getting better all the time, but when it comes to uses like diagnostic imaging, the true power behind them is largely in the software. So says Iltifat Husain, editor-in-chief and founder of iMedicalApps and an assistant professor of emergency medicine at Wake Forest School of Medicine:

Iltifat Husain

“What’s really happening here is the software,” he told me. “Resolution is important, yes, but you have complex algorithms translating [that data]. Can you imagine having to take that step of taking a video with your phone and uploading it to your computer and running it through an algorithm? You no longer have these steps; now you’re easily pointing a camera and then easily running an algorithm in real time. It’s the software and it’s the taking out all these steps that is bringing developers and researchers to do amazing things.”

So how is this combination of camera-plus-software being put to use in the medical world? Some of the uses are pretty straightforward – think of consulting a doc in a telehealth setting, or better monitoring a loved one’s day-to-day in an assisted living facility – but it’s now also possible to analyze images of medications, urine test strips, injuries, and even our own faces to get heart rate readings.

Check out some of the latest innovations:

  • Using its patent-pending computer vision technologies, MedSnap can identify and validate pills, and even compile that data into an epidemiological map to help track counterfeiting operations and techniques.
  • Integrating biosensors, this cradle and app we wrote about last week can detect a wide range of chemical and biological agents, including bacteria, viruses, toxins, and proteins.
  • Pathologists paired the Nokia Lumia 1020’s camera with a microscope to create a 41-megapixel photomicroscope that takes extremely high-res images of human cells and can detect a micrometer-sized parasite. Using the phone’s DNG RAW files, they were able to zoom in for more detail than they say is possible even with far pricier dedicated medical devices.
  • Colorimetrix turns a smartphone into a “portable spectrophotometer,” using the smartphone’s camera to read colorimetric test strips that measure urea, pH, glucose, and more, and thus monitor a wide range of conditions, including diabetes, urinary tract infections, and kidney disease.
  • The Mobile Heartbeat CURE (Clinical Urgent REsponse) Camera Module enables caregivers to image and track patient wounds, injuries, and even valuables held in the hospital. And to comply with HIPAA, the photos aren’t stored on the phone but auto-transmitted to the patient’s electronic medical record or a secure server where physicians and nurses can access them.
  • This one may not be ready yet for prime time, but at some point it could be possible to take a selfie to get your own heart rate. The idea is that, at least if the lighting is right, the phone’s camera can detect micro changes in the color of your skin to measure pulse.
  • Speaking of wounds, Gauss Surgical’s FDA-approved Triton Fluid Management System allows for the real-time estimation of blood loss and hemoglobin mass on gauzes and surgical sponges. Using the iPad camera, the system images blood-soaked sponges and shoots them to the cloud, where algorithms estimate in real time how much blood is on the surface.
The Triton Fluid Management System. Courtesy of Gauss Surgical.

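The selfie heart-rate idea mentioned above rests on a simple signal-processing trick: blood flow causes tiny periodic shifts in skin color, so the average green-channel value of the face region oscillates at the pulse frequency. A minimal sketch of that last step, assuming you already have a series of per-frame mean green values (a real app would also have to detect the face and filter out motion and lighting noise):

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate pulse (beats/min) from per-frame mean green-channel values.

    Detrend the signal, then pick the dominant FFT frequency inside a
    plausible human heart-rate band (40-200 bpm).
    """
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                          # remove the DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 40 / 60) & (freqs <= 200 / 60)  # 0.67-3.33 Hz
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0                            # Hz -> beats per minute

# Synthetic demo: a 72 bpm pulse sampled at 30 frames/sec for 10 seconds
fps = 30
t = np.arange(10 * fps) / fps
fake_green = 128 + 0.5 * np.sin(2 * np.pi * (72 / 60) * t)
print(round(estimate_heart_rate(fake_green, fps)))  # → 72
```

Restricting the search to the 40–200 bpm band keeps slow lighting drift and high-frequency sensor noise from masquerading as a pulse.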
One unifying theme to many of these medical apps is that the analysis of images happens in real time. Dr. John Paul Graff, a third-year pathology resident at the University of California Irvine who is working with associate professor Dr. Mark Wu on the photomicroscope to image things like parasites, pointed to how important this kind of affordable real-time technology is in parts of the world where there are more cell phone towers than health facilities.

“Even using one of those inexpensive microscope adapters for phones, you can diagnose blood-borne diseases like malaria,” he told me. “With malaria, this parasite will live on a red blood cell 6 to 8 microns in size. So if you can get access to someone’s blood even without access to a clinic, you can determine if they have the disease and what type they have.”

From there, a pathologist anywhere in the world can analyze the images to make diagnoses. This type of patient data can in turn travel with a patient’s electronic medical record to wherever that patient may go in the world.
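The arithmetic behind Graff’s point is easy to check: whether a 6-to-8-micron parasite is visible comes down to how many sensor pixels it spans after magnification. A rough back-of-envelope sketch (the 100x objective and the ~1.1 micron pixel pitch, roughly that of the Lumia 1020’s 41 MP sensor, are assumptions for illustration):

```python
def pixels_across(feature_um, magnification, pixel_pitch_um):
    """How many sensor pixels a feature spans after optical magnification."""
    return feature_um * magnification / pixel_pitch_um

# Assumed setup: a 100x microscope objective feeding a sensor with a
# ~1.1 micron pixel pitch (illustrative numbers, not measured specs)
for size_um in (6, 8):  # malaria parasite / red blood cell scale
    px = pixels_across(size_um, magnification=100, pixel_pitch_um=1.1)
    print(f"{size_um} um feature -> ~{px:.0f} pixels across")
```

Hundreds of pixels across a single cell is what makes it possible to zoom into a RAW frame and still see diagnostic detail.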

Graff also stressed that, as important as the algorithms behind the apps are, image stabilization has been key in his line of work pairing smartphones and microscopes. “Even the new big iPhone 6 Plus has optical image stabilization, and that’s a big deal because it’s not just megapixels,” he said. “Those are fine, you always want to zoom in more and more, but if you have a blurry photo it doesn’t matter how many pixels there are.”

Graff and Husain are both downright giddy about what the coming years hold. Much in the way texting makes the multi-step process of sending an email 20 years ago look highly complex, the ability to communicate with our physicians and clinics using images and other health metrics – and to do so in real time from anywhere in the world – may soon make today look positively prehistoric.