Smartphones never cease to amaze me. I’m still impressed by how productive I’m able to be on my Android device no matter where I am (often to the chagrin of my wife), and I’m still surprised every time I see someone pull out a Square when it comes time to pay (as happened last night at Fat Choy in Las Vegas, a way-off-strip place you should totally check out if you’re in town). But neither of those situations really compares with busting out a phone in order to detect the levels of toxins in the air.
Yet that’s exactly what a group of researchers at the University of Illinois has created — a cradle that wraps around an iPhone and turns it into a biosensor that can detect, according to a university press release, “toxins, proteins, bacteria, viruses and other molecules.” Inside that cradle are about $200 worth of mirrors, lenses and a photonic crystal that the researchers claim can identify these substances as accurately as a $50,000 spectrophotometer in the lab.
The cradle is essentially there for support, though, while the phone’s camera and processor do the real work. With everything firmly aligned in front of the camera, a scientist simply snaps a photo and the CPU processes the result. What it’s processing is the shift in the wavelength of light that the photonic crystal, primed to react to a specific molecule, reflects. The team demonstrates the device and app in the video embedded below.
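The source doesn’t describe the processing code, but the idea of measuring a resonance shift can be sketched in a few lines. In this hypothetical example, each camera column has been mapped onto a wavelength, and we locate the reflection peak of the crystal before and after a target molecule binds to it; the synthetic Gaussian spectra and the 550–600 nm range are assumptions for illustration only.

```python
import numpy as np

def peak_wavelength(intensities, wavelengths):
    """Wavelength of maximum reflected intensity.

    A parabolic fit around the brightest sample gives sub-pixel precision.
    """
    i = int(np.argmax(intensities))
    if 0 < i < len(intensities) - 1:
        y0, y1, y2 = intensities[i - 1], intensities[i], intensities[i + 1]
        denom = y0 - 2 * y1 + y2
        offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
        step = wavelengths[1] - wavelengths[0]
        return wavelengths[i] + offset * step
    return wavelengths[i]

# Synthetic spectra: camera columns mapped linearly onto 550-600 nm
wavelengths = np.linspace(550, 600, 500)

def gaussian(center):
    # Narrow reflection band centered on the crystal's resonance
    return np.exp(-((wavelengths - center) ** 2) / (2 * 1.5 ** 2))

reference = gaussian(570.0)   # bare crystal
sample = gaussian(572.3)      # target molecule bound to the surface

shift = peak_wavelength(sample, wavelengths) - peak_wavelength(reference, wavelengths)
print(f"resonance shift: {shift:.2f} nm")
```

The size of that shift is what indicates how much of the target substance is present; a real implementation would also calibrate the pixel-to-wavelength mapping against a known light source.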
And if you’re into this type of mobile data collection, another group of University of Illinois researchers actually created a smartphone-powered water-pollution device called MoboSens.
As with so many mobile sensing tools, though — from SkinScan (now SkinVision) to health care apps like Ginger.io — the biggest value might come from data that has nothing to do with what the app is primarily measuring. Rather, when data about a certain condition, air quality or what have you is tagged with time and geodata, for example, it becomes the basis for mapping how situations are spreading or where safe havens might be.
Imagine a team of scientists with iPhones dispersed throughout a city after a disaster, painting a real-time picture of what areas are most affected by a particular toxin (or maybe radiation). Taking a longer-term approach, researchers could track how situations are evolving over time. Throw in even more data that smartphones are capable of detecting — temperature, ambient noise, vibration, etc. — and we might unlock entirely new ways to think about how diseases spread through the air or what conditions tend to favor the spread of foodborne bacteria.
In some ways, though, this is more than another cool thing you can do with a smartphone. It’s the furtherance of something we’ll be discussing in depth at our Structure conference next month, which is how we rethink IT when computation and data are no longer bound to a single server or even the corporate network. The biological data this app will collect isn’t much use locked inside the phone; it needs a way to reliably and securely connect with other datasets and other services, likely distributed across the country or even the world. That’s where the real opportunity lies.