The future of technology means making the computer disappear

GigaOM’s recent RoadMap conference in San Francisco featured a number of thought-provoking speakers on the future of technology, including Twitter co-founder Jack Dorsey, venture investor Mike Moritz and Sun Microsystems co-founder Andy Bechtolsheim. While many views were expressed, one thread that ran through many of the presentations, from mobile and design to health and communication, was the idea that successful technology involves making the computer invisible to the user, even as it becomes more powerful.

I took a look at this idea in a recent article for GigaOM Pro (subscription required). Dorsey, for example, said that the power of an information network like Twitter doesn’t have anything to do with the technology behind it. It doesn’t matter, in other words, that the service is now processing more than 250 million tweets a day. Dorsey said that for him, the most powerful aspect of the service is how it can help connect us to others in far-flung parts of the world, as it did earlier this year during the demonstrations in Iran.

The Twitter co-founder said that he has also tried to make the technology in his other company — mobile payments–processing startup Square — as invisible as possible, so that retailers and other entrepreneurs can use it easily to expand their businesses and make them more efficient. Said Dorsey:

Both [Twitter and Square] are great at encouraging more face-to-face human interactions . . . I believe strongly that this information and these tools help us be better, but we need to be sure, as builders of tools, that it’s not overwhelming, that it’s meaningful, and that it’s not distracting. That it’s not something that puts technology first; it puts humans first.

Meanwhile, Mark Rolston of frog design (the firm famous for its early industrial-design work with Apple) talked about how computers and other advanced technology are already beginning to disappear into our surroundings and devices, and said he expects this to accelerate in the future. Rolston said it doesn’t take much imagination to combine voice technology, like the kind Apple has in Siri, with the kind of processing power we have now to create a computer that uses any available surface (a wall, a mirror, etc.) as a screen.

Rolston imagines an extension of the kind of physical interface that Microsoft’s Kinect uses, where gestures and even facial recognition could be used to control all kinds of processes or devices and where computing power behind the scenes would allow us to interact with our homes in different ways. Computers would become “externalized resources in a room.” In that kind of environment, Rolston said, “I can talk at it and wave at it, and maybe I have a keyboard or maybe there are screens or cameras around, but [the computers] compose in the moment as we need them.”

This concept of hiding the computer can be seen emerging in other areas, too, including health-related devices like the UP from Jawbone. Many of them appear to be just fancy jewelry — in the UP’s case, a somewhat geeky-looking bracelet — but they probably contain as much computing power as a desktop computer did a decade ago. The UP tracks your activity and records your steps, just as some other devices do, but it can also be programmed to alert you when you have been inactive for a while, and it watches your sleep patterns so it can wake you at the right point in your sleep cycle. In other words, it carries a tremendous amount of sophisticated software, but it looks extremely simple — all the complex parts are hidden.

As this phenomenon accelerates, companies of all kinds are going to have to adapt to a ubiquitous computing environment, both by making their products as non-computer-like as possible (something Apple has always excelled at) and by taking advantage of the intelligence and connectivity being built into even the smallest objects around us. For more details on what is required to do that, please read the full article at GigaOM Pro (subscription required).

Post and thumbnail photos courtesy of Flickr users See-ming Lee and Angry Julie Monday