Enchanting Products and Spaces by Rethinking the Human-Machine Interface

At the Gigaom Change conference in Austin, Texas, on September 21-23, 2016, David Rose (CEO of Ditto Labs, MIT Media Lab researcher and author of Enchanted Objects), Mark Rolston (Founder and Chief Creative Officer at argodesign) and Rohit Prasad (Vice President and Head Scientist, Alexa Machine Learning) spoke with moderator Leanne Seeto about “enchanted” products, the power of voice-enabled interactions and the evolution of our digital selves.
There’s so much real estate around us for creating engaging interfaces; we don’t need to be confined to devices. Or at least that is the belief of Gigaom Change panelists David Rose, Rohit Prasad and Mark Rolston, who talked about the ideas and work being explored today that will change the future of human-machine interfaces, bringing more enchanted objects into our lives.
With the emergence of the Internet of Things (IoT) and advances in voice recognition, touch and gesture-based computing, we are going to see new types of interfaces that look less like futuristic robots and more like the things we interact with daily.
Today we’re seeing this happen most in our homes, now dubbed the “smart home.” Window drapes that automatically close to give us privacy when we need it are just one example of how our homes and even our workspaces will soon come alive with what Rose and Rolston think of as Smart-Dumb Things (SDT). Another example might be an umbrella that can accurately tell you whether, or when, it’s going to rain. In the near future, devices will emerge out of our phones and onto our walls, furniture and products. We may even see these devices added to our bodies. This supports the new thinking that devices, and our interactions with them, can be a simpler, more seamless and natural experience.
Rose gave an example from a collaboration with the architecture firm Gensler for the offices of Salesforce. He calls it a “conversational balance table” — a device that subtly notifies people who are speaking too much during meetings. “Both introverts and extraverts have good ideas. What typically happens, though, is that during the course of a meeting, extraverts take over the conversation, often not knowingly,” Rose explains, “so we designed a table with a microphone array around the edge that identifies who is speaking. There’s a constellation of LEDs embedded underneath the veneer so as people speak, LEDs illuminate in front of where you are. Over the course of 10 or 15 minutes you can see graphically who is dominating the conversation.”
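The core logic Rose describes — attribute speech to a seat, accumulate talk time, and light up the table in proportion — can be sketched in a few lines. This is a hypothetical illustration, not the table's actual software; the `segments` input stands in for whatever the microphone array's speaker-identification step produces.

```python
from collections import defaultdict

def conversation_shares(segments):
    """Tally each seat's share of total talk time.

    segments: list of (seat_id, seconds_spoken) tuples, as might come from
    a speaker-identification pass over the microphone array. Returns a dict
    mapping seat_id -> fraction of the conversation (0.0 to 1.0).
    """
    totals = defaultdict(float)
    for seat, seconds in segments:
        totals[seat] += seconds
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {seat: t / grand_total for seat, t in totals.items()}

# Over a 10-minute stretch, seat A spoke 5 min, B 2 min, C 3 min.
# The resulting shares could then drive how many LEDs glow at each seat.
shares = conversation_shares([("A", 300), ("B", 120), ("C", 180)])
# shares -> {"A": 0.5, "B": 0.2, "C": 0.3}
```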
So what about voice? Will we be able to talk to these devices too? Rohit Prasad, VP and Head Scientist behind Amazon Alexa, is working on vastly improving voice interactions with devices. Prasad believes voice will be the key feature in the IoT revolution happening today: it will let us access these new devices in our homes and offices more efficiently. As advances in speech recognition continue, voice technology will become more accurate and quicker to understand our meaning and context.
Amazon is hoping to spur even faster advances in voice from the developer community through Alexa Skills Kit (ASK) and Alexa Voice Service (AVS), which allow developers to build voice-enabled products and devices using the same voice service that powers Alexa. All of this raises important questions. How far does this go? When does voice endow an object with the attributes of personhood? That is, when does an object become an “enchanted” object?
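To make the developer angle concrete, a custom Alexa skill built with the Alexa Skills Kit is, at bottom, a handler that receives a JSON request (with a request type and, for intents, an intent name) and returns a JSON response containing the speech to play. The sketch below assumes that general request/response shape; the skill, intent name and phrasing are invented for illustration.

```python
def handle_request(event):
    """Return an Alexa-style JSON response for a simple custom skill.

    'CloseDrapesIntent' is a hypothetical intent name; a real skill would
    define its intents in an interaction model via the Alexa Skills Kit.
    """
    request = event.get("request", {})
    if request.get("type") == "LaunchRequest":
        speech = "Welcome to the enchanted home. What would you like to do?"
    elif request.get("type") == "IntentRequest":
        intent = request.get("intent", {}).get("name")
        if intent == "CloseDrapesIntent":
            speech = "Closing the drapes now."
        else:
            speech = "Sorry, I don't know how to do that yet."
    else:  # e.g. SessionEndedRequest
        speech = "Goodbye."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The same voice service behind Alexa handles the hard parts — wake word, speech recognition, intent classification — so the developer's code only ever sees structured requests like this.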
At some point, as Mark Rolston of argodesign has observed, users are changed in the process of interacting with these objects and spaces. Rolston believes that our digital selves will evolve into entities of their own — what he calls our “meta me,” a combination of both the real and the digital you. In the future Rolston sees our individual meta me’s as being more than just data, but actually negotiating, transacting, organizing, and speaking on our behalf.
And while this is an intriguing new concept for our personal identity, what is most interesting is using all of this information and knowledge to get decision support on who we are and what we want. The ability of these cognitive, connected applications to help us make decisions in our lives is huge. What we’re moving toward is creating always-there digital companions to help with our everyday needs. Imagine the future when AI starts to act as you, making the same decisions you would make.
As this future unfolds, we’re going to begin to act more like nodes in a network than simply users. We’ll have our own role in asking questions of the devices and objects around us, telling them to shut off, turn on, or help us with tasks; gesturing or touching them to initiate some new action. We’ll still call upon our smartphones and personal computers, but we won’t be as tethered to them as our primary interfaces.
We’ll begin to call on these enchanted devices, using them for specific tasks or even in concert together. When you ask Amazon’s Echo a simple question like “what’s for lunch?” you won’t be read a lengthy menu from your favorite restaurant. Instead, your phone will vibrate, letting you know it has the menu pulled up for you to scroll through and decide what to eat. Like the talking candlestick and teapot in Beauty and the Beast, IoT is going to awaken a new group of smart, interconnected devices that will forever change how we interact with our world.
By Royal Frasier, Gryphon Agency for Gigaom Change 2016