IBM didn’t have to flaunt its debatable cloud dominance over Amazon Web Services on the sides of public buses if it wanted to upstage the cloud kingpin at its user conference this week — Big Blue could have just led with the news that its famous, Jeopardy!-champ-destroying Watson system is now available as a cloud service.
That’s right: Developers who want to incorporate Watson’s ability to understand natural language and provide answers need only have their applications make a REST API call to IBM’s new Watson Developers Cloud. “It doesn’t require that you understand anything about machine learning other than the need to provide training data,” Rob High, IBM’s CTO for Watson, said in a recent interview about the new platform.
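To make that concrete, here's a minimal sketch of what such a REST call might look like. The article doesn't spell out IBM's actual endpoints or payload format, so the URL, field names, and auth scheme below are all illustrative assumptions, not the real Watson API; the example just builds the request locally rather than hitting a live service.

```python
import json
import urllib.request

# Hypothetical endpoint -- IBM's real API details aren't published in this
# article, so this URL and the payload shape are illustrative only.
WATSON_URL = "https://example.ibm.com/watson/v1/question"

def build_question_request(question_text, api_key):
    """Package a natural-language question as an HTTP POST request."""
    payload = json.dumps({"question": {"text": question_text}}).encode("utf-8")
    return urllib.request.Request(
        WATSON_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
        method="POST",
    )

req = build_question_request(
    "What do I need for a hiking trip to Zion National Park in January?",
    api_key="YOUR_API_KEY",
)
print(req.get_method())  # POST
```

The point High is making is visible in the shape of the call: the developer supplies a plain-English question and credentials, nothing about the machine learning underneath.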
More on the details later, but first the big picture. If IBM actually delivers a workable cloud platform around Watson and developers actually take advantage of it to build new, smart applications, it will be a big fricking deal.
Advanced computer science behind a simple command
When I wrote recently about the power of cloud computing to enable wholly new types of applications, this is one aspect of what I was talking about. Watson took years of man-hours and millions of dollars to build. Its creators were some of the smartest computer scientists around and, as IBM Senior Vice President Steve Mills pointed out to me recently, it comprises approximately 40 different technologies. Now, anyone who can write some code can theoretically write an application that taps into Watson’s abilities without knowing anything about any of those technologies himself.
IBM stands to make money from the Watson Developers Cloud, but the primary goal is to create a large community of developers in the world of cognitive computing — “what we believe is the dominant form of computing in the future,” High said. “We’ve come to the conclusion that this is too big and important to hold to ourselves,” he noted.
Indeed, IBM has been trying to grow the community and capabilities of cognitive computing, even beyond what Watson can do around understanding language. The company recently launched a university partnership that focuses on numerous aspects of cognitive computing, including the field of deep learning that is driving significant advances in computer vision and other facets of text analysis and natural language processing. And IBM has for years been mapping brains and working on microchips that mimic the brain’s architecture.
The real beauty of these types of systems is not just in the intelligence of the computers, but also in how they affect the thought processes of people using them, High said. He noted an early Watson user who learned pretty quickly after using Watson that he had been asking the wrong questions of his data all along.
“When you get a very rapid response to our questions, we drive our level of concentration much more deeply,” High explained. “And in concentrating more deeply, we think about things we haven’t thought of before.”
Platforms need apps
Watson has already proven itself in a few professional fields, most notably health care and retail, but it seems logical that a large community of users not constrained by a dearth of talent or budget will be able to think of many novel applications.
Early applications built using the Watson cloud platform (scheduled for 2014 release) include a personal health assistant from Welltok; a tool from MD Buyline for recommending patient treatments and medical equipment purchases; and an app from Fluid that acts as a personal shopper for visitors to e-commerce sites. Outdoor clothing maker The North Face will be using the latter app on its site, which will provide dynamic text feedback as well as images, reviews and other content based on a customer’s question (e.g., “What do I need for a hiking trip to Zion National Park in January?”).
In order to encourage programmers to take advantage of the platform, IBM is working with venture capital firms — including New Enterprise Associates — to support and fund startups using the Watson API. Right now, though, access is via “controlled invitation,” which means interested developers must apply to IBM for access.
Aside from the computing resources to analyze users’ data and then compute answers when API calls come in, the Watson Developers Cloud also includes an SDK, an app store, a data marketplace (the more data Watson has, the more it can learn) and IBM experts to assist in everything from design to beta testing (for the time being, these services are where the company expects to make money, High acknowledged). Watson will return results of queries along with a confidence score and links to data that weighed heavily on its answer.
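A quick sketch of how an application might consume such a response. The article says only that answers come back with a confidence score and links to the data that informed them; the JSON layout, field names, and threshold below are assumptions made for illustration.

```python
import json

# Illustrative response shape -- the exact JSON Watson returns is not
# described in the article, so this structure is an assumption.
sample_response = json.loads("""
{
  "answers": [
    {"text": "Insulated waterproof boots", "confidence": 0.91,
     "evidence": ["doc://gear-guides/winter-hiking"]},
    {"text": "Trail running shoes", "confidence": 0.34,
     "evidence": ["doc://gear-guides/summer-hiking"]}
  ]
}
""")

def best_answer(response, min_confidence=0.5):
    """Return the highest-confidence answer, or None if nothing clears the bar."""
    candidates = [a for a in response["answers"]
                  if a["confidence"] >= min_confidence]
    if not candidates:
        return None
    return max(candidates, key=lambda a: a["confidence"])

top = best_answer(sample_response)
print(top["text"], top["confidence"])  # Insulated waterproof boots 0.91
```

The confidence score is what lets an app decide whether to surface Watson's answer or fall back to something else — a design question the platform pushes onto the developer rather than answering itself.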
All the help is necessary because even though the APIs are designed to be simple, the idea of using something like Watson to power an intelligent app can be daunting — like coming to terms with the fact that a cognitive computing application isn’t programmed as much as it learns and adapts from the data its creator provides.
According to High, “All the things software developers have learned about programming a computer sort of go out the door.”