Microsoft’s vision of our future is big screens and big data

What do you get when you cross a massive touchscreen, a Kinect, some sort of computing device and a whole lot of data-processing technology? You get Microsoft’s vision of our digital future, both at home and in the office. The company’s struggles to shift from a desktop-and-server-based world to a mobile-and-cloud-based world cast some doubt on whether Microsoft will be the company to actually deliver on its vision, but it definitely gets points for trying.
In an earlier post, I explained how the company is betting on its investment in machine learning and the webscale infrastructure that powers its Bing search engine to fuel its future devices and services. When you consider that any sort of gesture, speech or handwriting-recognition technology incorporates machine learning in order to decipher what human beings are actually doing, saying or writing, it’s easy to see how busy Microsoft has been.
Some of the technology was downright impressive, occasionally even mind-blowing (in theory, at least), while other parts — particularly some of the futuristic tableaus involving hyper-interactivity — seemed, frankly, a little annoying. (You can see some examples in the gallery below.)

Showing off: Consumer tech

Microsoft used its TechForum media event to show everything from a speech-recognizing, dual-screen Xbox Live interface (and to hint at forthcoming original, possibly interactive programming from Microsoft) to a research project that allows Kinect to recognize hand gestures in addition to broad movements. The latter would allow for new ways of controlling Kinect-connected devices without controllers or specialized gear (e.g., fake guns) because a gripping (and releasing) motion would replace the press of a button.
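Microsoft didn’t detail how the grip gesture maps to input, but the basic idea — turning a stream of recognized hand states into button-style events — can be sketched in a few lines. This is purely illustrative; the function name and the open/closed state labels are my own stand-ins, not anything from the Kinect SDK:

```python
# Illustrative sketch: treat a recognized hand grip like a button press.
# "open"/"closed" states are assumed labels, not real Kinect API values.

def grip_events(hand_states):
    """Convert a stream of hand states into press/release events,
    firing only on transitions (like a physical button would)."""
    events = []
    prev = "open"
    for state in hand_states:
        if state == "closed" and prev == "open":
            events.append("press")
        elif state == "open" and prev == "closed":
            events.append("release")
        prev = state
    return events
```

The transition-only logic matters: a held grip should register as one press, not a press per frame.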
Rick Rashid, Microsoft’s chief research officer, said the company used Kinect to teach an elevator in one building to recognize when people intend to board versus when they’re just standing in front of it conversing. If the elevator senses that someone wants to get on, the door opens automatically without requiring the person to press a button.
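Rashid didn’t describe the model, but the intent-detection problem reduces to watching a few Kinect-derived signals over time. Here’s a minimal rule-based sketch — the distance threshold, facing flag and frame counts are invented for illustration, standing in for whatever Microsoft’s learned model actually uses:

```python
# Hypothetical sketch of boarding-intent detection from Kinect frames.
# Each frame is (distance_to_door_m, facing_door). Thresholds are invented.

def wants_to_board(frames, dist_threshold=1.5, min_frames=10):
    """Return True if someone stands close to and faces the door for
    enough consecutive frames; people conversing off to the side or
    facing away never build up a streak."""
    streak = 0
    for distance_m, facing_door in frames:
        if distance_m < dist_threshold and facing_door:
            streak += 1
            if streak >= min_frames:
                return True
        else:
            streak = 0
    return False
```

Requiring a sustained streak, rather than a single frame, is what keeps the door from opening for everyone who merely walks past.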
And, of course, pretty much all of Bing is the result of machine learning and big data technologies. Qi Lu, president of the Online Services Division, showed off all sorts of Bing capabilities, including one that surfaces reviews or other information from respected voices in the specific fields about which a user is searching. He said that capability requires searching “trillions of pieces of web data” to determine who’s influential and whether their content is of high quality.
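Lu didn’t say how Bing scores influence, but link-analysis methods in the PageRank family are the classic way to rank “respected voices” in a citation graph. The toy power-iteration below is a generic illustration of that family, not Bing’s actual algorithm:

```python
# Toy PageRank-style influence scoring over a who-links-to-whom graph.
# A generic illustration, not Bing's actual ranking pipeline.

def influence_scores(links, iterations=50, damping=0.85):
    """links maps each source to the list of targets it endorses.
    Returns a score per node; scores sum to 1."""
    nodes = set(links) | {t for ts in links.values() for t in ts}
    n = len(nodes)
    score = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1 - damping) / n for node in nodes}
        for src, targets in links.items():
            if targets:
                share = damping * score[src] / len(targets)
                for t in targets:
                    new[t] += share
        # Redistribute mass from nodes with no outgoing links.
        dangling = sum(score[node] for node in nodes if not links.get(node))
        for node in nodes:
            new[node] += damping * dangling / n
        score = new
    return score
```

At web scale the same idea runs over the “trillions of pieces of web data” Lu mentioned, combined with content-quality signals.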
The whole event took place in Microsoft’s new Envisioning Center, which tries to simulate what our work and home lives will be like a decade or more out. What’s Microsoft’s vision? Everything — from the LED bulbs to the omnipresent screens to the stove burners — is digital, connected and equipped with sensors, and 3-D printers sit on every desk. It’s everything the 8 million apps we have today are trying to be, only, well, convenient.

Showing off: Enterprise tech

For business users, Microsoft demonstrated the machine-learning-based Flash Fill feature in Excel 2013, as well as a research project for indexing structured data sets available on Bing and then having Excel automatically recommend them based on the data someone is already working with. PowerPivot, a business-intelligence extension for Excel, can now handle 100 million rows of data in memory, up from its old limit of 1 million rows — letting users prove out models before deploying them at larger scale on SQL Server, which gains new performance-boosting in-memory, columnar-store and compression capabilities.
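Flash Fill’s trick is programming by example: you fill in one cell by hand, and Excel infers the string transformation and applies it down the column. The sketch below captures only the simplest possible version of that idea — a single whitespace-token-selection rule — where the real feature synthesizes far richer programs:

```python
# Minimal programming-by-example sketch in the spirit of Flash Fill.
# Only handles "pick the Nth whitespace token"; purely illustrative.

def infer_fill(example_in, example_out):
    """Infer a token-selection rule from one worked example.
    Returns a function applying the rule, or None if no such rule fits."""
    tokens = example_in.split()
    if example_out in tokens:
        idx = tokens.index(example_out)
        return lambda s: s.split()[idx]
    return None

rule = infer_fill("Jane Smith", "Smith")   # infer "take the 2nd token"
filled = [rule(name) for name in ["Bob Jones", "Ada Lovelace"]]
# filled -> ["Jones", "Lovelace"]
```

Even this toy version shows the user-facing contract: one example in, a reusable transformation out, no formula written.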
Microsoft is also working on a workspace environment called DataLab that lets users share data, models, experiments and workflows. Technical Fellow Dave Campbell said the goal is to reduce the need for armies of skilled data experts by letting them publish their work for broader consumption — just like what happened with applications years ago. Some of these — such as Microsoft’s own Synonyms Search tool for figuring out what search terms are often associated with each other — will eventually make their way into the Windows Azure Data Marketplace.
We also saw research projects focused on using machine learning to identify faulty microchip wafers (or, in theory, anything) in real time as they traverse the production line, and an application that analyzes petabytes of web data to determine how viral content spreads across the web. At Microsoft’s TechFest event the following day (which I didn’t attend), the company apparently showed off even more research projects in this same vein: fancy visualization and interfaces belying some serious data processing.
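The wafer demo boils down to streaming anomaly detection: compare each measurement against a rolling baseline and flag sharp deviations. Here’s a deliberately simple z-score sketch — the window size and cutoff are invented, and a real system would use a learned model rather than this rule:

```python
import statistics

# Illustrative streaming anomaly detector; window and cutoff are invented
# stand-ins for a trained fault model on the production line.

def flag_faulty(measurements, window=20, z_cutoff=3.0):
    """Return indices of measurements that deviate sharply from the
    rolling mean of the preceding window."""
    flagged = []
    for i, x in enumerate(measurements):
        history = measurements[max(0, i - window):i]
        if len(history) >= 5:            # need a baseline first
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history)
            if stdev > 0 and abs(x - mean) / stdev > z_cutoff:
                flagged.append(i)
    return flagged
```

Because the check only looks backward, each wafer can be judged the moment it passes the sensor, which is what makes the real-time framing plausible.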
On the collaboration front, Microsoft trotted out its Lync video-conferencing platform, which also lets users collaborate on documents and use virtual ink to write messages. The latter appears particularly useful for creating virtual whiteboards, especially when Perceptive Pixel screens are involved. When the meeting is done, Office Division President Kurt DelBene said, everything saves to the cloud and can be shared with whoever needs it.
If I had one takeaway from TechForum, it’s that I wouldn’t want to be any legacy technology company right now — be it HP, IBM, Sony, Dell or even Apple — trying to ride a skyrocketing innovation curve while also having to maintain a multibillion-dollar collection of legacy businesses. Gun to my head, though, Microsoft wouldn’t be a bad choice.