Anticipatory search is already here, but now apps are evolving into anticipatory analysis engines as well, using data from our actions to serve up information before we ask for it.
Microsoft is reportedly turning the fictional virtual assistant from its Halo games into an actual virtual assistant that Windows users can engage with across phone, PC and Xbox platforms.
Sherpa is trying to build a more flexible virtual assistant technology that can be easily adapted to new tasks. To that end, it has developed its own conceptual meta-language, which it uses to process all voice commands.
Now that touch input has revolutionized the mobile device market, what’s next? Voice interaction and natural language processing (NLP) surely have to be high on the list. But it’s not always easy for app developers to enable voice features. Enter OneTok, a cloud-based NLP solution.
T-Mobile is giving its rather pathetic MyTouch voice-command feature a much-needed overhaul. It’s incorporating into Genius the same semantic-search technology Nuance uses in Dragon Go, allowing the voice assistant to search more than 200 different content providers and understand intent rather than just words.
Look out, Nuance: there’s a new speech-recognition player in town, AT&T. Ma Bell has taken the locks off its Watson speech application programming interfaces, allowing any developer to use them to add voice commands and natural language understanding to their apps.