By Charles King, Pund-IT, Inc. September 25, 2015
Perceiving, understanding, reasoning and learning are as common as air in the human condition, but the deep complexity of those cognitive processes makes them among the hardest nuts to crack in information technology (IT). You’d think it would be easier after IT’s decades of accomplishment and success. But doing so requires vendors and developers to break down individual processes into dozens of component steps, and to solve the scores or hundreds of problems arising from each.
That was what made IBM’s Watson technology so fascinating when it arrived on the scene in 2011 as a contestant on the Jeopardy! TV game show. It’s also what made the launch this week of new Watson technologies at the company’s new Cognitive Computing Hub in San Francisco so impressive.
IBM understands—has always understood, really—that effective cognitive computing is a far-flung destination where arrival requires patience, dedication and investment. The San Francisco Watson event demonstrated that the company is making substantial progress. There’s certainly more to come, but IBM seems well on the way.
The importance of cognitive APIs
So what did IBM announce this week and what does it mean to its cognitive computing efforts? Consider first where Watson began. As noted at the San Francisco event by Rob High, IBM Fellow and CTO of the Watson group, the system that competed on Jeopardy! was designed to solve problems associated with question and answer (Q&A) scenarios. That is, to parse questions about specific subject areas and quickly respond with a high likelihood of success.
Though it was a sophisticated process involving one application programming interface (API) and deep Q&A-specific capabilities, some might consider that initial system a techno parlor trick like, say, memorizing all the words beginning with the letter “A” in a dictionary. But it marked an auspicious beginning, and IBM found enough promise in Watson’s results to press on.
By the time the company launched the Watson business group in early 2014 with the intent, as the organization’s SVP Mike Rhodin noted, to “put Watson to work,” IBM had just three partners and a handful of interested clients. In San Francisco, Rhodin stated that today over 350 companies, including Watson-specific start-ups, plus 77,000 developers accessing the Watson Developer Cloud, are building on the platform.
To further speed and expand that progress, IBM announced over 25 new Watson APIs powered by more than 50 new technologies. Those include significant advances in natural language perception, among them the IBM Watson Natural Language Classifier (broadening the platform’s understanding of intent and meaning) and IBM Watson Concept Insights (to expand and relate concepts attached to the meaning of words).
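For developers, services like the Natural Language Classifier are reached through the Watson Developer Cloud as web APIs. As a rough, hedged illustration only—the host name, classifier ID, payload shape and credentials below are placeholders I have invented for the sketch, not values from IBM’s documentation—assembling such a classify call might look like this:

```python
# Hypothetical sketch of calling a cloud-hosted text classifier.
# The endpoint host, classifier ID, and credentials are illustrative
# placeholders, NOT taken from IBM's actual Watson documentation.

def build_classify_request(base_url, classifier_id, text):
    """Assemble the URL and JSON payload for a classify call."""
    url = "{}/v1/classifiers/{}/classify".format(base_url, classifier_id)
    payload = {"text": text}  # the phrase whose intent we want classified
    return url, payload

url, payload = build_classify_request(
    "https://gateway.example.com/nl-classifier/api",  # placeholder host
    "10D41B-nlc-1",                                   # placeholder classifier ID
    "Will it rain in San Francisco tomorrow?",
)
print(url)
print(payload)

# Actually sending the request would look something like the following
# (requires the third-party `requests` package and valid credentials):
#   import requests
#   resp = requests.post(url, json=payload, auth=("user", "pass"))
#   print(resp.json())  # classes with confidence scores
```

The point of exposing Watson this way is that a developer never touches the underlying models; intent classification becomes one HTTPS call returning ranked classes with confidence scores.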
Other offerings focus on vision and speech perception APIs, as well as additions to IBM Watson’s developer tool portfolio. The San Francisco event also highlighted efforts by Watson partners, including more than a dozen that offered demos or testimonials. Those included Coalesce (business decision making processes), Touchcast (Web video-based collaboration tools) and Wayin (digital marketing technologies). Wayin’s CEO Scott McNealy (formerly CEO of Sun Microsystems) drew laughs by commenting, “Me at an IBM event… Not in Kansas anymore.”
The broader context
The point of IBM’s efforts is to revolutionize data access and analysis in order to, as Rhodin put it, “Democratize information on a global scale.” This won’t be easy, since more than 90 percent of the digital data being created is unstructured or semi-structured information that is largely unusable by traditional systems. In short, IBM believes Watson will be the answer to numerous large and complex questions.
Should Watson and the company’s goals be taken seriously? Consider that this isn’t IBM’s first or only such effort. In 1997, for example, the company’s Deep Blue became the first computer to beat a reigning world chess champion, Garry Kasparov, in a six-game match. The results were a pretty big deal in 1997, though they probably received a fraction of the attention that Watson drew for its Jeopardy! appearances.
But the larger point was how Deep Blue’s development broadly impacted IBM. The system required the company to delve deeply into massively parallel computing technologies capable of analyzing huge numbers of possible solutions. That work eventually sparked commercial IBM solutions in financial modeling, risk analysis, data mining and drug development. In other words, Deep Blue’s highly publicized chess matches were precursors to more important developments and greater, longer-lasting results.
Predicting the future is usually a chump’s game but the new solutions and partnerships IBM highlighted this week in San Francisco suggest a similar future is in store for Watson. The company is certainly making good on its promise to put Watson to work by delivering additional, innovative tools that developers and companies require to create new, increasingly capable Watson-based cognitive solutions.
The fact that those offerings run the gamut from simplifying wine selection to enhancing online marketing to personalizing healthcare to analyzing massive amounts of legal and pharmaceutical data speaks to the flexibility and effectiveness of IBM’s technologies. From what I saw in San Francisco, the revolution in data analysis appears well underway. IBM obviously isn’t the only IT vendor attempting to democratize information access, but it fully intends for Watson to lead the global charge.
© 2015 Pund-IT, Inc. All rights reserved.