In the news: IBM has bought text- and image-analysis innovator AlchemyAPI, for inclusion in the Watson cognitive computing platform.
AlchemyAPI sells text analysis and computer vision capabilities that can be integrated into applications, services, and data systems via a SaaS API. But I don’t believe you’ll find the words “cognitive computing” on AlchemyAPI’s Web site. So where’s the fit? What gap was IBM seeking to fill?
For an IBM description of Watson cognitive computing, in business-accessible terms, see the video embedded in this article.
My definition: Cognitive computing both mimics human capabilities — perception, synthesis, and reasoning — and applies human-like methods such as supervised learning, trained from established examples, to discern, assess, and exploit patterns in everyday data. Successful cognitive computing is also superhuman, with an ability to apply statistical methods to discover interesting features in big, fast, and diverse data. Cognitive platforms are scalable and extensible, able to assimilate new data and methods without restructuring.
AlchemyAPI fits this definition. The automated text understanding capabilities offered by AlchemyAPI and competitors — they include TheySay, Semantria, Ontotext, Pingar, MeaningCloud, Luminoso, Expert System, Datumbox, ConveyAPI, Bitext, Aylien, and others, each with its own strengths — add value to any social or enterprise solution that deals with large volumes of text. I haven’t even listed text analysis companies that don’t offer an on-demand, as-a-service option!
AlchemyAPI uses a hybrid technical approach that combines statistical, machine learning, and taxonomy-based methods, adapted for diverse information sources and business needs. But what sets AlchemyAPI apart is the company’s foray into deep learning, the application of a hierarchy of neural networks to identifying both broad-stroke and detailed language and image features.
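To make the hybrid idea concrete, here is a toy sketch of how a rule/taxonomy pass and a statistical fallback can be combined in one classifier. This is strictly an invented illustration, not AlchemyAPI’s implementation; the categories, phrases, and keyword lists are made up for the example:

```python
# Toy hybrid text classifier: a hand-built taxonomy handles known
# phrases, and a simple statistical (keyword-frequency) score handles
# everything else. All categories and word lists are invented.

TAXONOMY = {
    "earnings call": "Finance",
    "touchdown": "Sports",
}

KEYWORDS = {
    "Finance": {"revenue", "profit", "shares"},
    "Sports": {"game", "team", "score"},
}

def classify(text: str) -> str:
    lowered = text.lower()
    # Taxonomy pass: an exact phrase match wins outright.
    for phrase, category in TAXONOMY.items():
        if phrase in lowered:
            return category
    # Statistical fallback: pick the category whose keywords
    # occur most often in the text.
    tokens = lowered.split()
    counts = {cat: sum(t in words for t in tokens)
              for cat, words in KEYWORDS.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else "Unknown"

print(classify("The team won the game"))              # Sports
print(classify("Revenue rose on the earnings call"))  # Finance
```

Real systems replace the keyword counts with trained statistical or neural models, but the layering — deterministic rules first, learned scoring as backstop — is the essence of a hybrid approach.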
So AlchemyAPI isn’t unique in the natural-language processing (NLP) domain, but the company does have staying power. The success is measurable. AlchemyAPI, founded in 2005, was relatively early to market with an on-demand text analysis service and has won an extensive developer following, although I’ll bet you $1 that the widely circulated 40,000 developer figure counts API-key registrations, not active users. The company is continually rolling out new features, which range from language detection and basic entity extraction to some of the most fine-grained sentiment analysis capabilities on the market. By contrast, development of the most notable early market entrant, OpenCalais from Thomson Reuters, stalled long ago.
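“Fine-grained” is worth unpacking: document-level sentiment assigns one score per text, while entity-level sentiment scores each mention separately. A toy sketch of the distinction follows; the lexicon, entity list, and sentence-proximity heuristic are invented for the example and bear no relation to any vendor’s actual method:

```python
# Toy entity-level sentiment: score each entity by the sentiment
# words in its own sentence, rather than one score per document.
# Lexicon and entity list are invented for illustration.

POSITIVE = {"great", "excellent", "love"}
NEGATIVE = {"terrible", "slow", "hate"}
ENTITIES = {"battery", "screen"}

def entity_sentiment(text: str) -> dict:
    scores = {}
    for sentence in text.lower().split("."):
        tokens = sentence.split()
        polarity = (sum(t in POSITIVE for t in tokens)
                    - sum(t in NEGATIVE for t in tokens))
        for token in tokens:
            if token in ENTITIES:
                scores[token] = scores.get(token, 0) + polarity
    return scores

review = "The screen is excellent. The battery is terrible."
print(entity_sentiment(review))  # {'screen': 1, 'battery': -1}
```

A document-level system would average this review to neutral; the entity-level view preserves the fact that one aspect is praised and another panned, which is what makes fine-grained analysis commercially valuable.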
Agility surely plays a role in AlchemyAPI’s success, as does the management foresight that led the company to jump into computer vision. CEO Elliot Turner described the opportunity in an April 2014 interview:
“Going beyond text, other areas for big progress are in the mining of audio, speech, images and video. These are interesting because of their incredible growth. For example, we will soon see over 1 billion photos/day taken and shared from the world’s camera phones. Companies with roots in unsupervised deep-learning techniques should be able to leverage their approaches to dramatically improve our ability to correctly identify the content contained in image data.”
Yet there’s competition in image analysis as well. Given my focus on sentiment analysis, most of the image-analysis companies I follow apply the technology to emotion analytics — they include Affectiva, Emotient, Eyeris, and RealEyes — but consider that Google couldn’t build a self-driving car without technology that “sees.” The potential impact of computer vision and automated image analysis seems limitless, with plenty of opportunity to go around.
Why did IBM, a behemoth with immense research capabilities, need to go outside by acquiring AlchemyAPI? I’d speculate that IBM’s challenge is one shared by many super-large companies: an inability to effectively commercialize in-house innovation. Regardless, the prospect of bringing onto the Bluemix cloud platform all those NLP-interested developers, whether 40,000 or some lesser active number, was surely attractive. The AlchemyAPI technology will surely plug right in: Modern platforms accommodate novelty. As I wrote above, they’re able to assimilate new data and methods without restructuring.
And Watson? It’s built on the IBM-created Apache UIMA (Unstructured Information Management Architecture) framework, designed for functional extensibility. AlchemyAPI already fits in, via a set of “wrappers” that I expect will be updated and upgraded soon. But the truth is, it seems to me that given Watson’s broad and proven capabilities, these added capabilities provide only a relatively small technical boost, in two directions. First, AlchemyAPI will provide market-proven unsupervised learning technology to the Watson stack, technology that can be applied to diverse language-understanding problems. Second, as I wrote, AlchemyAPI offers some of the most fine-grained sentiment analysis capabilities on the market, providing certain information-extraction capabilities not currently closely linked to Watson. What IBM will do with AlchemyAPI’s image-understanding capabilities, I can’t say.
Beyond these technical points, I’m guessing that the bottom-line attractions were talent and opportunity. IBM’s acquisition press release quotes AlchemyAPI CEO Elliot Turner: “We founded AlchemyAPI with the mission of democratizing deep learning artificial intelligence for real-time analysis of unstructured data and giving the world’s developers access to these capabilities to innovate. As part of IBM’s Watson unit, we have an infinite opportunity to further that goal.” It’s hard to beat infinite opportunity or, for a company like IBM, a chance to build on a combination of agility, talent, enthusiasm, market-sense, and foresight that is hard to find in house or in the commercial marketplace.
Disclosure: I have mentioned numerous companies in this article. AlchemyAPI, IBM, Converseon (ConveyAPI), Daedalus (MeaningCloud), Lexalytics (Semantria), Luminoso, Ontotext, and TheySay have paid to sponsor my Sentiment Analysis Symposium conference and/or my Text Analytics 2014 market study and/or the Brussels LT-Accelerate conference, which I co-own.
An extra: Video of a talk, Deep Learning for Natural Language Processing, with Stephen Pulman of the University of Oxford and text-analysis solution provider TheySay, offered at the 2014 Sentiment Analysis Symposium. Deep learning techniques are central to AlchemyAPI’s text and image analysis capabilities as well.