The iPhone has been called a pinnacle of multi-touch interfaces. Chris Harrison, professor at Carnegie Mellon and CTO of Qeexo, doesn’t agree: He believes that a centuries-old tool like the guitar provides a far better touch interface than even the most advanced smartphone or tablet, because the guitar captures many rich dimensions of touch, like vibration or how hard the guitarist is pressing, whereas touchscreens are primarily concerned with touch location.
The limitations of our current touchscreen interfaces will only matter more as smart home and smartwatch interfaces become more prevalent. The touchscreen simply wasn’t meant to be shrunk to a few centimeters across. “The smaller the interface gets, the dumber the interface has to get,” Harrison said at the Gigaom Structure Connect conference Tuesday. “Which is a huge problem for human-computer interaction.”
Harrison believes that the touchscreen can significantly improve through an approach he calls “rich touch.” It’s in the early stages, but it could be the key to matching desktop productivity on mobile devices.
“What disturbs me about touch is we actually do less sophisticated stuff on touchscreens than we do on our desktop computers,” Harrison said.
Rich touch, essentially, is teaching our computers to regard our hands as more than a pointer finger. There are a few different implementations of rich touch that can be put into effect today. For instance, your knuckle can be used to augment your pointer finger. Imagine lassoing a part of a photo, or tapping on the screen with your knuckle to bring up a contextual menu. If you’ve got a finger, you’ve got a knuckle, and in many ways it can work as a “left-click” for touchscreen interfaces.
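Neither Harrison’s talk nor Qeexo’s products expose a public API for this, but the idea can be sketched in a few lines. The snippet below assumes a hypothetical classifier has already labeled each contact as a fingertip or a knuckle (Qeexo’s actual system infers this from the acoustics and vibration of the tap); the dispatch logic then treats a knuckle tap as the touchscreen’s “left-click,” opening a contextual menu instead of performing the default action. All names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    x: int
    y: int
    contact: str  # "fingertip" or "knuckle", as labeled by a touch classifier (hypothetical)

def handle_touch(t: Touch) -> str:
    # A knuckle tap plays the article's "left-click" role:
    # it opens a contextual menu rather than the default select action.
    if t.contact == "knuckle":
        return f"context menu at ({t.x}, {t.y})"
    return f"select at ({t.x}, {t.y})"
```

The point of the sketch is that the application layer stays simple: once the hardware or driver can distinguish contact types, richer interactions are just one extra branch in the event handler.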
Another way to approach rich touch interfaces is through not just the point where your finger touches the screen but the angle at which it touches, which turns the screen into a sort of window into a world. So poking perpendicularly at the screen means something different than grazing your fingertip across it. This could be a big deal for the smartwatch, and would potentially allow smartwatch interfaces to eschew physical dials for, say, scrolling. Another possibility is “drilling” your finger into the screen to turn the volume up or down. Using the front-facing camera on mobile devices, we can even incorporate hand grasp into interfaces. So if your hand looks like it’s holding an eraser, a swipe erases the screen.
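These angle-based gestures can also be sketched as simple mappings, again under invented assumptions: a sensed pitch angle (how upright the finger is) splits a perpendicular poke from a grazing scroll, and a twist angle drives the “drilling” volume control. The thresholds and gain below are illustrative placeholders, not values from Harrison’s work.

```python
def classify_touch(pitch_deg: float) -> str:
    # Near-perpendicular contact (high pitch) reads as a tap;
    # a low grazing angle reads as a scroll gesture. Threshold is invented.
    return "tap" if pitch_deg > 60 else "scroll"

def drill_volume(volume: float, twist_deg: float, gain: float = 0.005) -> float:
    # "Drilling" (twisting the finger in place) nudges the volume,
    # clamped to the range 0.0 to 1.0. The gain is an arbitrary choice.
    return min(1.0, max(0.0, volume + gain * twist_deg))
```

A smartwatch handler built this way needs no physical dial: the same fingertip supplies position, gesture type, and a continuous control value.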
“Touch is the central modality for our devices, not just smartphones, but screens in our cars, and the smart home too,” Harrison said. “When it comes to touch interfaces, we should make them richer, not just make the screens bigger.”
Photo by Jakub Mosur