How to take mobile gestures to the next level: use pitch, yaw and “the claw”

Poking, swiping and multitouch gestures have become the standard way to interact with our screens. But even though devices like smartphones pack an enormous amount of computing power, we do far less with them than we could, because today's touch interfaces give us such limited input options.

Chris Harrison, a professor at Carnegie Mellon and CTO of Qeexo, explained all of this on Wednesday at our Structure Connect event, but I had a few more questions. So I caught up with Harrison after his presentation to discuss how a new type of touch sensing would affect a phone’s battery life, whether we need new sensors in our phones to make his vision possible, and what we can expect from our mobile devices’ UIs five to ten years out. Watch the video below for a glimpse of the near and far future.

[youtube https://www.youtube.com/watch?v=igGMayLQGmU&w=560&h=315]

And if you missed the original talk, check it out here to see how Harrison argues that today's touch interfaces dumb down our ability to use the massive amount of computing power available to us.

[protected-iframe id="2ed54626b2ba808071903590cbbc34bc-14960843-34305775" info="https://new.livestream.com/accounts/74987/events/3418509/videos/65658035/player?width=640&height=360&autoPlay=false&mute=false" width="640" height="360" frameborder="0" scrolling="no"]

Photo by Jakub Mosur