There is definitely some sort of Zen that makes Apple, well, Apple. Taking something obvious and making it somehow better, somehow cooler, somehow new. How do they do it? By observing how consumers interact with technology and experimenting ad nauseam internally, until they get it exactly the way they want it. This includes abandoning the past if it no longer makes any sense. So when Apple’s own Tim Cook declares that merging a refrigerator and a toaster is not good for the consumer, he may just have a point.
But as forward-thinking as the company is, perhaps Apple hasn’t created a new path at all. Through a technique of observing, perfecting and discarding, Apple has for some time now been heading in one direction: along the predefined path into the era of ubiquitous computing.
Ubiquitous computing defined
In some ways, this path is as logical as Moore’s Law. Look at the history of computing — from the mainframe era, where there was one computer for many consumers, to the personal computing era, where there was one computer for each consumer, to this new era, where there are many computers for each consumer — and compare the number of computer chips to the number of consumers using those chips. At its foundation, ubiquitous computing could be summed up by this simple principle of ratios.
The modern concept of ubiquitous computing originally came from Mark Weiser in 1988 at the Computer Science Lab at Xerox PARC (sound familiar?). The theory proposed a seamless, almost invisible connection between consumers and computers that would help drive a change in ratios: from one computer for many people to many computers for one person.
Apple’s tabs, pads and boards
Even considering the most radical interpretation of ubiquitous computing, smart dust, the main point has remained the same: we will soon be overrun by computer chips. There are, however, three very distinct platforms in this well-defined post-PC era that we have all become accustomed to, not unlike the three platforms we see evolving within the iOS ecosystem today:
Gesturing tabs: Mobile technology already had small chips, powerful batteries, geolocation services and wireless networking. But that was not enough to win over the masses and drive us all to purchase multiple computing devices. It was the way consumers interacted with these smaller devices that needed to change. For a long time, it was thought that voice recognition was going to propel us into the next era of computing, but that never happened.
Leveraging the fact that there were approximately 100 million iPod users, Apple was able to use convergence to its advantage as it introduced these iPod users to a series of simple touch-based gestures on a nearly buttonless device. In the early years of the iPod, we were all trained on the scroll wheel. With touch-based gestures on a wide-open screen, this paradigm was taken one step further. Just as the mouse accompanied the transition from the terminal-based mainframe era to the PC era, the post-PC era was ushered in by a new way of interacting with computer chips: touch.
Revolutionary pads: As soon as people became familiar with this new way of interacting with computers, it was time to challenge the personal computer paradigm directly. Netbooks attempted to continue the personal relationship with consumers by maintaining the 1:1 ratio. Tablets such as the iPad are more specialized and were never meant to be a total replacement for a traditional and general purpose personal computer.
The rapid rise and immediate success of the iPad was proof positive that consumers were ready for a third major computing device in their lives. With “pads” being used by pilots, students and doctors, and in restaurants, kitchens and offices, the iPad was proving to be a specialized, place-based appliance rather than a general-purpose personal computer. As powerful as the third-generation iPad is, it will never replace the personal computer, just as the personal computer never really replaced the mainframe.
Experimental boards: The current AppleTV may be a stretch to accept as a computing platform, as it has no keyboard, no mouse, no touch display, and just a very simple IR remote. That is, unless you happen to be near one with a Mac or iOS device; then the AppleTV becomes an extension of that device on a much larger screen. Although it is marketed alongside the iPod, it is just as closely related to the AirPort Express. Perhaps Apple needs to look toward Nintendo’s Wii or Microsoft’s Kinect; otherwise the AppleTV will be doomed to remain just an accessory to its tabs and pads.
Take a look at what HBO has done with the Xbox Kinect as an example. If Apple’s recently awarded gesture-based patents are any indicator, this may be where it is headed as well. The interaction between consumer and computer chip has not yet been ironed out enough for this final platform, the boards of ubiquitous computing, to take hold of our day-to-day lives.
One human relation-chip
Making each device “aware” of how consumers use all of the other devices they own is the key to accelerating the adoption of more than one computing device. While Apple may in fact be the only company in the world to have constructed a homogeneous synergy between its personal and ubiquitous computing platforms, it is certainly not the only company trying to forge the relationship between the user and the computer chip. For the relationship between consumers and computing devices to become truly invisible, these new smart devices will need to know more and more about the consumers who own them: everything consumers have done in the past, what they are doing now, and even what they plan on doing later.
Perhaps this is the reason Tim Cook stated that Apple’s “best years lie ahead of us.” With technologies like iCloud and Siri, Apple will likely play a larger and larger role in forging the relationship between consumers and the growing number of computing devices in our daily lives. It is not about selling more of these individual devices; it is about enabling the relationship between an individual and a collection of specialized devices. And Apple knows this.