Rob High is an IBM Fellow, Vice President and Chief Technology Officer, IBM Watson. He has overall responsibility to drive Watson technical strategy and thought leadership. As a key member of the Watson Leadership team, Rob works collaboratively with the Watson engineering, research, and development teams across IBM.
Rob High will be speaking on the subject of artificial intelligence at Gigaom Change Leaders Summit in Austin, September 21-23. In anticipation of that, I caught up with him to ask a few questions about AI and its potential impact on the business world.
Byron Reese: Do you feel like we are on the path to building an AGI and if so, when do you think we will see it?
Rob High: Cognitive technologies, like Watson, apply reasoning techniques to domain-specific problems in fields like healthcare, finance, education, and law — anywhere there is an overwhelming amount of information that, if processed, can substantially improve the decisions or outcomes in that domain. For example, the work we’ve done with oncologists to help them identify the most appropriate treatments for their cancer patients is based on having assessed what makes the patient unique; the standard-of-care practices and clinical expertise that have been used to train the system; and the available clinical literature that can help doctors make better decisions. This helps to democratize that expertise to a wide swath of other doctors who do not have the benefit of having seen the thousands of patients that major cancer centers like Memorial Sloan Kettering or MD Anderson see.
The types of artificial intelligence used in these systems are spectacular in that they are able to draw inferences from literature written in natural language, and to be taught how to interpret the meaning in that language as it applies to bringing the right information at the right time to the doctor’s fingertips. Unlike Artificial General Intelligence, our goal is to amplify human cognition — not to do our thinking for us, but to do the necessary research so that we can do our thinking better.
What do you make of all of the angst and concern being talked about in terms of why we should perhaps fear the AGI?
The concept of a machine-dominated world is inspired more by Hollywood and science fiction writers than by technologists and AI researchers. IBM has been firmly committed to responsible science and ethical best practices for over a hundred years – it’s embedded in our DNA. Our focus is on applying cognitive computing to amplifying human cognitive processes, not on replacing them.
The reality is AI and cognitive technologies will help mankind better understand our world and make more informed decisions. Cognitive computing will always serve to bolster, not replace, human decision-making, working side-by-side with humans to accelerate and improve our ability to act with confidence and authority. The industries where Watson is being applied today – healthcare, law, financial services, oil & gas – exist to benefit people working in those industries.
For example, Watson augments a doctor’s abilities by aggregating and producing the best available information to inform medical decisions and democratizing expertise. But it’s the human doctor who takes the information Watson produces and combines it with their own knowledge of a patient and the complex issues associated with each diagnosis. Ultimately, the doctor makes the recommendation, informed by Watson, and the patient makes the decision – so there will always be a complementary relationship between human and machine.
Do you think computers can or will become conscious?
Today, we are making significant advances in integrating embodied cognition into robotics through Watson and that remains a primary focus. Our technology currently allows robots to – like humans – show expression, understand the nuances of certain interactions and respond appropriately. There’s still a need to teach robots certain skills, like the skill of movement, the skill of seeing, the skill of recognizing the difference between a pot of potatoes that are boiling versus a pot of potatoes that are boiling over.
However, we do believe that we’re only in the first few years of a computing era that will last for decades to come. We are currently assessing what’s doable, what’s useful and what will have economic interest in the future.
Great. We’ll leave it there. Thank you for taking the time to talk today.
So much for AlchemyAPI CEO Elliot Turner’s statement that his company is not for sale. IBM has bought the Denver-based deep learning startup that delivers a wide variety of text analysis and image recognition capabilities via API.
IBM plans to integrate AlchemyAPI’s technology into the core Watson cognitive computing platform. IBM will also use AlchemyAPI’s technology to expand its set of Watson cloud APIs that let developers infuse their web and mobile applications with artificial intelligence. Eventually, the AlchemyAPI service will shut down as the capabilities are folded into the IBM Bluemix platform, said IBM Watson Group vice president and CMO Stephen Gold.
Compared with Watson’s primary ability to draw connections and learn from analyzing textual data, AlchemyAPI excels at analyzing text for sentiment, category and keywords, and for recognizing objects and faces in images. Gold called the two platforms “a leather shoe fit” in terms of how well they complement each other. Apart from the APIs, he said AlchemyAPI’s expertise in unsupervised and semi-supervised learning systems (that is, little human oversight over model creation) will be a good addition to the IBM team.
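The sentiment and keyword analysis AlchemyAPI offered can be illustrated with a toy sketch. Real services use trained models; this minimal lexicon-and-frequency version is purely illustrative and reflects nothing of AlchemyAPI's actual API or implementation — it only shows the shape of the output such a text-analysis call returns:

```python
import re
from collections import Counter

# Illustrative lexicons and stopword list, not AlchemyAPI's.
POSITIVE = {"great", "excellent", "love", "impressive"}
NEGATIVE = {"bad", "poor", "hate", "disappointing"}
STOPWORDS = {"the", "a", "an", "is", "are", "and", "of", "to", "it", "this"}

def analyze(text):
    """Toy text analysis: sentiment by lexicon hits, keywords by word frequency."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        sentiment = "positive"
    elif neg > pos:
        sentiment = "negative"
    else:
        sentiment = "neutral"
    counts = Counter(w for w in words if w not in STOPWORDS)
    keywords = [w for w, _ in counts.most_common(3)]
    return {"sentiment": sentiment, "keywords": keywords}

result = analyze("The Watson platform is great and the Watson APIs are impressive")
# result["sentiment"] is "positive"; "watson" is the top keyword
```

A production system would replace the hand-built lexicons with learned models, which is exactly the unsupervised and semi-supervised expertise Gold highlighted.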
We will discuss the burgeoning field of new artificial intelligence applications at our Structure Data conference later this month in New York, as well as at our inaugural Structure Intelligence event in September.
I have written before that cloud computing will be the key to IBM deriving the profits it wants from Watson, as cloud developers are the new growth area for technology vendors. Cloud developers might not result in multi-million-dollar deals, but they represent a huge user base in aggregate and, more importantly, can demonstrate the capabilities of a platform like Watson probably better than IBM itself can. AlchemyAPI already has more than 40,000 developers on its platform.
AlchemyAPI’s Turner said his company decided to join IBM, after spurning numerous acquisition offers and stating it wasn’t for sale, in part because it represents an opportunity to “throw rocket fuel on” the company’s long-term goals. Had the plan been to buy AlchemyAPI, kill its service and fold the team into research roles — like what happens with so many other acquisitions of deep learning talent — it probably would not have happened.
Gold added that IBM is not only keeping the AlchemyAPI services alive (albeit as part of the Bluemix platform) but also plans to use the company’s Denver headquarters as the starting point of an AI and deep learning hub in the city.
Update: This post was updated at 9:10 a.m. to include quotes and information from Elliot Turner and Stephen Gold.
First it was Jeopardy!, then it was cancer, e-commerce and cooking. Now, IBM’s Watson artificial intelligence system is powering a line of connected toys.
And it looks as if people are impressed with the idea: A company called Elemental Path launched a Kickstarter campaign on Monday for a line of toy dinosaurs, called CogniToys, and had surpassed its initial goal as of Tuesday morning. The company was aiming for $50,000 and had raised more than $70,000 as of 11:40 a.m. Tuesday.
Essentially, the dinosaurs are connected toys that speak to IBM’s Watson cloud APIs, which the company began rolling out last year. According to the Kickstarter page, the CogniToys will allow children to engage with them by talking — asking questions, telling jokes, sharing stories and the like. In addition, the page states, “The technology allows toys to listen, speak and simultaneously evolve, learn and grow with your child; bringing a new element of personalized, educational play to children.”
Elemental Path is not the first company focused on building natural language and artificial intelligence into toys. Possibly the best-known example so far is a startup called ToyTalk, which is building natural language iPad apps and was founded by former Pixar CTO Oren Jacob.
The evolution of artificial intelligence, and the ability to easily train toys, robots, apps or anything, really, is going to be a major focus of Gigaom’s Structure Intelligence conference September 22–23 in San Francisco. We’ll also talk a lot about machine learning and AI at our Structure Data conference March 18–19 in New York, where speakers from Facebook, Yahoo, Spotify and elsewhere will discuss how data in the form of images, text, and even sounds are allowing them to build new products and discover new insights about their users.
IBM has struck a deal with SoftBank Telecom Corporation to bring the IBM Watson artificial intelligence (or, as IBM calls it, cognitive computing) system to Japan. The deal was announced on Tuesday.
Watson has already been trained in Japanese, so now it’s a matter of getting its capabilities into production via specialized systems, apps or even robots running Watson APIs. As in the United States, early focus areas include education, banking, health care, insurance and retail.
[company]IBM[/company] has had a somewhat difficult time selling Watson, so maybe the Japanese market will help the company figure out why. It could be that the technology doesn’t work as well or as easily as advertised, or it could just be that American companies, developers and consumers aren’t ready to embrace so many natural-language-powered applications.
The deal with SoftBank isn’t the first time IBM has worked to teach a computer Japanese. The company is also part of a project with several Japanese companies and agencies, called the Todai Robot, to build a system that runs on a laptop and can pass the University of Tokyo entrance exam.
We’ll be talking a lot about artificial intelligence and machines that can learn at our Structure Data conference in March, with speakers from Facebook, Spotify, Yahoo and other companies. In September, we’re hosting Gigaom’s inaugural Structure Intelligence conference, which will be all about AI.
IBM has recruited a couple of new partners in its quest to mainstream its Watson cognitive computing system: financial investment specialist Vantage Software and the Institute of Culinary Education, or ICE. While the former is exactly the kind of use case one might expect from Watson, the latter seems like a pretty savvy marketing move.
What Vantage is doing with Watson, through a new software program called Coalesce, is about the same thing [company]IBM[/company] has been touting for years around the health care and legal professions. Only, replace health care and legal with financial services, and doctors and lawyers with financial advisers and investment managers. Coalesce will rely on Watson to analyze large amounts of literature and market data, which will complement experts’ own research and possibly provide them with information or trends they otherwise might have missed.
The partnership with the culinary institute, though — on a hardcover cookbook — is much more interesting. It’s actually a tangible manifestation of work that IBM and ICE have been doing together for a few years. At last year’s South By Southwest event, in fact, Gigaom’s Stacey Higginbotham ate a meal from an IBM food truck with ingredients suggested by Watson and prepared by ICE chefs.
But even if the cookbook doesn’t sell (although I will buy one when it’s released in April and promise to review at least a few recipes), it’s a good way to try and convince the world that Watson has promise beyond just fighting cancer. IBM is banking on cognitive computing (aka artificial intelligence) to become a multi-billion-dollar business, so it’s going to need more than a handful of high-profile users. It has already started down this path with its Watson cloud ecosystem and APIs, where partners have built applications for things including retail recommendations, travel and cybersecurity.
Watson isn’t IBM’s only investment in artificial intelligence, either. Our Structure Data conference in March will feature Dharmendra Modha, the IBM researcher who led development of the company’s SyNAPSE chip that’s modeled on the brain and designed to learn like a neural network while consuming just a fraction of the power normal microchips do.
However, although we’re on the cusp of an era of smart applications and smart devices, we’re also in an era of on-demand cloud computing and a user base that cut its teeth on Google’s product design. The competition over the next few years — and there will be lots of it — won’t just be about who has the most-accurate text analysis or computer vision models, or who executes the best publicity stunts.
All the cookbooks and research projects in the world will amount to a lot of wasted time if IBM can’t deliver with artificial intelligence products and services that people actually want to use.
IBM says it has developed a machine learning system that identified images of skin cancer with better than 95 percent accuracy in experiments, and it’s now teaming up with doctors to see how it can help them do the same. On Wednesday, the company announced a partnership with Memorial Sloan Kettering — one of IBM’s early partners on its Watson system — to research how the computer vision technology might be applied in medical settings.
According to one study, cited in the IBM infographic below, diagnostic accuracy for skin cancer today is estimated at between 75 percent and 84 percent even with computer assistance. If IBM’s research results hold up in the real world, they would constitute a significant improvement.
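For context, diagnostic accuracy here is simply the fraction of cases classified correctly. A quick sketch of the comparison, using made-up case counts purely for illustration (these numbers are not from the study or from IBM's experiments):

```python
def accuracy(correct, total):
    """Fraction of cases classified correctly."""
    return correct / total

# Hypothetical counts chosen only to match the quoted percentages.
human_assisted = accuracy(80, 100)  # mid-range of the 75-84% estimate
ibm_reported = accuracy(96, 100)    # "better than 95 percent"

# Roughly 12-21 more correct diagnoses per 100 cases.
improvement = ibm_reported - human_assisted
```

Note that raw accuracy alone can be misleading for diagnostics; published evaluations typically also report sensitivity and specificity, since missing a melanoma is far costlier than a false alarm.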
As noted above, the skin cancer research is not IBM’s first foray into applying machine learning and artificial intelligence techniques — which it prefers to call cognitive computing — in the health care setting. In fact, the company announced earlier this week a partnership with the Department of Veterans Affairs to investigate the utility of the IBM Watson system for analyzing medical records.
And [company]IBM[/company] is certainly not the first institution to think about how advances in computer vision could be used to diagnose disease. Two startups — Enlitic and Butterfly Network — recently launched with the goal of improving diagnostics using deep learning algorithms, and the application of machine learning to medical imagery has been, and continues to be, the subject of numerous academic studies.
We will be discussing the state of the art in machine learning, and computer vision specifically, at our Structure Data conference in March with speakers from IBM, Facebook, Yahoo, Stanford and Qualcomm, among others.
IBM’s Watson group has invested an undisclosed amount of money in Pathway Genomics to help the company deliver an app that gives personalized advice to users based on their genetic information, as well as data collected by wearable devices, medical records and other sources. Like all Watson-powered apps, users will query the new app, called Panorama, using natural language and results are delivered based on analysis from sources including medical literature and clinical trials. So far, IBM has invested in a handful of companies building apps on Watson, including in the retail and health care spaces.
IBM has announced a long list of new Watson customers and startup partners, ranging from standby industries such as health care to new ones such as cybersecurity and nonprofits. Perhaps more importantly, the company also gave developers a handful of new Watson-powered APIs.
MapR throws Apache Drill in the box, Cassandra 2.1 releases, and that’s just the beginning.
We’re unlikely to see one virtual assistant to rule them all. Rather, we’ll have a team of assistants, each aware of its strengths and weaknesses, collaborating, delegating and stratifying based on the task at hand.