Many of us have become incredibly attached to our gadgets, rarely straying far from our phones. Human-gadget interaction is increasingly a mundane and unremarkable part of our lives. Spike Jonze’s movie Her imagines a future in which our attachment to our gadgets enters the territory of friendship and even love. But is love truly possible between humans and machines?
Her follows Theodore Twombly, played by Joaquin Phoenix, as he falls in love with his computer’s artificial intelligence-powered operating system and begins a relationship with her. While the film is a love story, it also offers some enchanting predictions for the technology of our near future. The voice-activated interfaces that feature prominently in the film are likely to hit the market in 5 to 10 years. Theodore lives in a Los Angeles that is, at most, a few decades into our future.
Imagining truly invisible technology
In this future, technology integrates easily into Theodore’s life. Unlike in most futuristic films (Minority Report, for example), the visual spectacle of technology is not the flashy centerpiece of the movie. As Wired’s Kyle Vanhemert notes, the technology imagined by Her has become so inextricably integrated into the background of our lives that it is essentially invisible.
Samantha (coincidentally the default name given to our own virtual assistant when we launched in 2011) easily and smoothly syncs across Theodore’s different hardware devices: his home computer, his work computer, and his smarter-phone. Peripherals such as keyboards and mice have disappeared; almost everything is voice- or motion-activated and hands-free. Technology has become so intuitive that we no longer need to struggle with it.
She really “gets” me
Still, the invisible integration of technology is only one part of the new user interfaces of Her’s future. Samantha is so effective not only because she works without effort on Theodore’s part, but also because she understands him. That is the whole point of machine intelligence.
Samantha’s understanding is both literal and non-literal. She is the ultimate dream of natural language processing: she grasps the meaning of Theodore’s words and converses with him as another human would, with tone, inflection, nuance, and emotion. But she is also able to understand context. If Theodore says, “I’m tired,” she knows whether he means, “When can we rest?” or “I didn’t sleep well last night” or “Assistant, find me the nearest rest stop.” She is able to infer what Theodore wants without even being prompted. On an excursion, she playfully guides Theodore, eyes closed, to a pizza stand, because she knows he will be hungry. She sends his writing to a publisher, knowing what he wants without having to be told.
Because she sounds so human, Samantha is utterly convincing as a love interest rather than automatically disturbing. She is funny, charming, and unnervingly human in her interactions with Theodore. The movie’s central dilemma asks whether Samantha is “really” human, and then asks whether that matters at all.
What’s so funny about machines, love, and understanding?
To look for a relevant example in real life, we can go back to the 1960s. ELIZA, one of the earliest chatbots programmed to engage humans using natural language, attempted to mimic Rogerian psychotherapy, which takes the patient’s own language and repeats it back as questions. However, ELIZA clearly did not understand the meaning of the patient’s words; she processed only enough of what her “patient” told her to produce an appropriate-sounding response.
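ELIZA’s reflection trick can be sketched in a few lines. This is only an illustrative toy, not Weizenbaum’s original script: the patterns, reflections, and fallback line below are invented stand-ins, but the mechanism is the same one the paragraph describes, matching surface patterns and echoing the user’s words back as a question.

```python
import re

# Swap first-person words for second-person ones ("my" -> "your").
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Illustrative pattern/response pairs, checked in order; first match wins.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    # Rewrite the captured fragment from the listener's point of view.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    # No pattern matched: fall back to a content-free prompt,
    # exactly the kind of dodge the real ELIZA relied on.
    return "Please go on."
```

So `respond("I feel ignored by my family")` yields “Why do you feel ignored by your family?” — an answer that sounds attentive while understanding nothing.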
Yet many of the users who engaged with ELIZA through text chats were shocked to learn that they were not speaking to a human. And for some people, it didn’t matter. They continued to use ELIZA as a psychotherapy device.
ELIZA succeeded with users despite her limited context. But imagine how powerful a natural language processing tool would be if it were linked to the vast repository of data that is the internet and paired with powerful search and analytics tools. We might begin to achieve something that’s more Samantha than Siri.
Even so, will we fall in love with this kind of operating system in the future? It’s possible, just as it is possible to love the machines, cars, and other objects already in our lives. But it’s unlikely that technology would “love” us back: machines won’t need to be sentient in the human sense to perform Samantha-like tasks. What ELIZA suggests is that we humans are comfortable with understanding that is only a facade. We just need technology smart enough to mimic understanding. Love need not be included.
Ilya Gelfenbeyn is the CEO of Speaktoit, a company that develops natural language virtual assistants for Android, iOS, and Windows Phone.