Expect Labs upgrades its MindMeld API for building smart apps

Expect Labs has amped up the capabilities of its MindMeld API for building Siri-like applications for smartphones, browsers or pretty much any device. New features include support for seven new languages, SDKs for Android, iOS and JavaScript, and cross-platform speech recognition.

The company released the API publicly in February, promising to power apps that listen to speech commands or questions and deliver meaningful content based on relevance and context. Expect Labs’ first product, an “anticipatory computing” app called MindMeld, uses the API to listen to voice conversations and surface relevant web content as users speak — sometimes predicting what they’ll say next.
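The listen-and-surface loop described above can be sketched in miniature: accumulate conversation context as snippets arrive, then re-rank a document collection by overlap with that context. Everything here — the document set, stopword list, and scoring — is invented for illustration and is not the MindMeld API.

```python
# Hypothetical sketch of an "anticipatory" search loop: as a conversation
# unfolds, context accumulates and document rankings update. All names
# and the scoring scheme are illustrative, not MindMeld's actual API.

from collections import Counter

# A toy corpus standing in for the web content the service would index.
DOCUMENTS = {
    "sf-restaurants": "best restaurants in san francisco dinner food",
    "api-docs": "mindmeld api developer documentation sdk android ios",
    "weather": "weather forecast rain temperature san francisco",
}

STOPWORDS = {"the", "a", "in", "to", "is", "and", "of", "for", "with"}

def tokenize(text):
    """Lowercase, split on whitespace, and drop stopwords."""
    return [w for w in text.lower().split() if w not in STOPWORDS]

def rank_documents(context_tokens):
    """Order documents by how often their terms appear in the context."""
    counts = Counter(context_tokens)
    scores = {
        doc_id: sum(counts[w] for w in set(tokenize(body)))
        for doc_id, body in DOCUMENTS.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Simulate speech snippets arriving one at a time; rankings shift as
# the conversation's topic becomes clearer.
context = []
for snippet in ["where should we eat tonight",
                "somewhere in san francisco with good food"]:
    context.extend(tokenize(snippet))
    top = rank_documents(context)[0]

print(top)  # → sf-restaurants
```

A real system would replace the keyword overlap with speech recognition, entity extraction, and a learned relevance model, but the shape of the loop — streaming context in, ranked content out — is the same.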

The flow of a MindMeld application.

The MindMeld API is one of a number of recent, and varying, attempts to put artificial intelligence in the hands of mobile developers. Others include IBM’s Watson cognitive computing API, AlchemyAPI’s natural-language processing and computer vision services, and Jetpac’s DeepBelief neural network SDK for object recognition.

Expect Labs Co-founder and CEO Tim Tuttle (pictured above) spoke about the future of pervasive search and intelligent devices at our Structure Data conference in March. That video is embedded below.