Voices in AI – Episode 74: A Conversation with Dr. Kai-Fu Lee


About this Episode

Episode 74 of Voices in AI features host Byron Reese and Dr. Kai-Fu Lee discussing the potential of AI to disrupt job markets, the comparison of AI research and implementation in the U.S. and China, as well as other facets of Dr. Lee’s book “AI Superpowers”. Dr. Kai-Fu Lee, previously president of Google China, is now the CEO of Sinovation Ventures.
Visit www.VoicesinAI.com to listen to this one-hour podcast or read the full transcript.

Transcript Excerpt

Byron Reese: This is Voices in AI, brought to you by GigaOm. I’m Byron Reese. Today I am so excited my guest is Dr. Kai-Fu Lee. He is, of course, an AI expert. He is the CEO of Sinovation Ventures. He is the former President of Google China. And he is the author of a fantastic new book called “AI Superpowers.” Welcome to the show, Dr. Lee.
Kai-Fu Lee: Thank you Byron.
I’d love to begin by saying, AI is one of those things that can mean so many things. And so, for the purpose of this conversation, what are we talking about when we talk about AI?
We’re talking about the advances in machine learning… in particular deep learning and related technologies… as they apply to artificial narrow intelligence, with a lot of opportunities for implementation, application and value extraction. We’re not talking about artificial general intelligence, which I think is still a long way out.
So, confining ourselves to narrow intelligence, if someone were to ask you worldwide, not even getting into all the political issues, what is the state of the art right now? How would you describe where we are as a planet with narrow artificial intelligence?
I think we’re at the point of readiness for application. I think the greatest opportunity is application of what’s already known. If we look around us, we see very few of the companies, enterprises and industries using AI when they all really should be. Internet companies use AI a lot, but it’s really just beginning to enter financial, manufacturing, retail, hospitals, healthcare, schools, education and so on. It should impact everything, and it has not.
So, I think what’s been invented and how it gets applied/implemented/monetized… value creation, that is a very clear 100% certain opportunity we should embrace. Now, there can be more innovations, inventions, breakthroughs… but even without those I think we’ve got so much on our hands that’s not yet been fully valued and implemented into industry.
Listen to this one-hour episode or read the full transcript at www.VoicesinAI.com
Byron explores issues around artificial intelligence and conscious computers in his new book The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity.

The 2017 State of the Services Economy Report

I’m happy to announce the release of the 2017 State of the Services Economy Report, which I worked on for the past six months with Mavenlink; it’s one of the best reports I’ve worked on in the past several years. Visit this page to download the full report.

I co-authored the report with Ray Grainger, the CEO of Mavenlink. We were supported by a great team and by ResearchNow. We also had great contributions from my research panel: David Coleman, Steven Fisher, Martin Gaedke, Maddie Grant, Jean Russell, Brian Solis, and Joachim Stroh.

We started with some core premises for the research. We believed that the rate of change in the services economy was very high, and that turned out to be the case: over three quarters (78.3%) of our survey’s respondents state that business conditions are changing quickly, and 20% say the pace is faster than ever before.

This sets the deep background for everything else discussed in the report: an unrelenting pressure to adapt to a rapidly changing business context, one that — at least for some — is changing as quickly as it ever has.

A major trend in services companies is the rapid transition to project-based work, away from retainer-based models, and the bigger the company, the quicker the transition. This transition seems tightly linked to the need for greater agility and flexibility by services companies’ clients.

We expected that the best defense against the onslaught of technology-induced change would be… more technology. 70% of those surveyed say they are adopting new technologies, and only a small group are holding out against new technologies.

The third big finding from the research has to do with competition. Times of great change can lead to increases in the level of competition, and our survey confirmed that.

We asked if companies are seeing increased competition:

  • 62% said they are.
  • Of those that are seeing more competition, 26.8% say it’s coming from existing competitors,
  • 36.6% indicate it’s coming from new entrants, and
  • 36.6% say it’s a combination of the two.

Taken together, new entrants figure in nearly three quarters (73.2%) of the reported increases in competition, making them the greatest competitive threat overall.

So, the hard bottom line: accelerating change, transition to project-based economics, more defensive and offensive technology, and more competition.

Welcome to the accelerating services economy.

This research was sponsored by Mavenlink, but the opinions stated are my own. Originally published at www.stoweboyd.com.

What Is a Research and Analysis Company For, Today?

Here’s the understatement of the year: These have been a very fast-paced few months. In early July, I signed up with the new Gigaom Research as the first and only analyst working here, and now — just over two months later — I’ve been joined by over 20 other analysts and researchers, signed up a growing list of clients, and started the process of building a new and substantially different Gigaom Research.
As an aspect of that project, I’ve taken on the role of Managing Director of Gigaom Research. Leaving aside the issue of how the old defunct Giga Omni Media was managed, I’m interested in directing my efforts today — and those of the analysts and others who are joining me here, at the new Gigaom Research — toward organizing something new, something better suited to the times we are living and working in, as well as remaining sustainable, agile, and profitable.
To attack that, I decided to first answer a core question: what is a research and analysis company for, today?
On a superficial level, answering that question for Gigaom Research is simple, although multi-faceted:

  1. Gigaom Research is focused on helping business leaders grapple with the implications of emerging technologies on their companies. To accomplish that, we need deep expertise in those technologies, an up-to-date understanding of their application in business, and an appreciation of the organizational and workforce dimensions of technology adoption. While we aren’t in the management consulting business, guiding customers through digital transformation and the like, we must understand the forces and fissures associated with technologically-induced change and innovation.
  2. Gigaom Research works closely with technology vendors developing the emerging technologies that define our world, providing research on markets and trends driving adoption, and the needs and challenges of businesses that are the buyers and users of these tools. While we aren’t in the business of designing and building hardware, software or everything in between, we need to stay very close to the bleeding edge of technological innovation, and therefore can advise and provide market intelligence to our vendor clients.

Therefore, there are two complementary sides to Gigaom Research: one working with vendors creating the tools and technology platforms that define our world, and another working with companies increasingly reliant on those tools, technologies, and techniques to apply them in their businesses.
However, the Gigaom Research of old was based on a publishing model: writing reports — and to a lesser extent, blogging — about technologies, tools, and their impacts on business. By early 2014 it was becoming clear that the larger part of Gigaom Research’s value for vendors lay in applying our analysts’ understanding of markets and trends to vendors’ plans and strategies. Although the old Gigaom Research had an offering for business clients, called Buyers Lens, it was not much more than a subscription to our basic research, with perhaps a small amount of inquiry time included. Most important: such reports are out of date as soon as they are published. As a result, we plan to transition away from a subscription-based business model toward one based on dialog, interaction, and participation. I will be writing more about that next month.
On both sides of our business model, we will be working to create greater opportunities for direct involvement with clients:

  • On one side, we are planning to launch the Gigaom Research Council, a community for business leaders whose responsibilities involve understanding the implications of emerging technologies in their companies. These are change agents, heads of innovation and digital, along with others revamping their organizations to better compete in a time of accelerating volatility, uncertainty, and ambiguity. This effort will be led by my colleague and long-time contributor Larry Hawes, with a great deal of support from me and other analysts.
  • On the vendor side, we will be bringing our deep awareness of what’s going on in the workforce and workplace to guide vendors in their efforts to bring new offerings to market, expand the adoption of today’s tools, and to create more effective connections with their users and markets.

These both will require us to provide new ways to connect and communicate with clients. Simply writing dozens of reports — however insightful and well-targeted — will not get our clients where they want and need to be.
I’ve already written about how we are going to be moving away from statically-defined and siloed technology-centric focus areas for research (see Rebooting Gigaom Research). In the next iteration of our website — and in the way we are already reorganizing our services — we won’t be slotting our work into ‘mobile’ versus ‘Internet of things’ versus ‘data’. Instead we will be developing a cascade of trends and issues, and tagging our work appropriately. For example, I’ve talked with a number of clients recently about the apparent rebound of email as a platform for communication and technology innovation. That thread will be — on the coming-soon revamped website — tagged (and searchable) as #email-as-a-platform, under the larger theme of #cotech, a term I am introducing for technologies that support coworking (all the ‘co’ words: ‘coordination’, ‘collaboration’, ‘coauthoring’, ‘coediting’, ‘cooperating’, ‘cocurating’ and so on).

And instead of seeking to write one or two reports on that topic, we would be involved in much more active and ‘living’ research. For example, we might have a few analysts participate in an online and open demo of a tool that leverages email as a platform, and we’d publish the discussion of their thoughts on that tool and those ideas. That might lead to discussions with vendors on one side, and members of the Research Council on the other. A week later we might decide to run a survey in the Council on some of those ideas, and we’d publish a Trend Brief — one of our short reports — on the findings. And that might lead to a cascade of other interactions, writing, and analysis.
All of this is significantly different from the old Gigaom Research, but such cascades of activities will become the modus operandi of the new Gigaom Research, going forward.
This is, then, a call to action.
For existing or prospective business clients who are looking for a new, deeper, and more open approach to leveraging analysis and research for their companies, we are eager to engage. Please start with this request for more information, so that we can initiate a dialogue about your company’s needs and how they might align with our new Gigaom Research Council.
For current or future vendor clients, we welcome a chance to talk, to develop a closer strategic relationship, and to work jointly to triangulate market movements and trends. Please contact me to open that discussion. I’ve spoken with dozens of vendor representatives — CEOs, heads of analyst relations, CMOs, product leads — in the past few months, and I have learned a great deal through those calls and meetings. I welcome the chance to learn more.
For analysts interested in the opportunities at Gigaom Research, we are recruiting. I’ve been fortunate in that so many of the most insightful and active analysts from the old Gigaom Research have been willing to sign up with the new Gigaom Research, but we are going to need a new cadre of aggressive and dissatisfied analysts to join us, as well.
First of all, we’ve adopted a much more researcher-centric operational model, and at this point we have not brought aboard dedicated business development or sales staff. But the new Gigaom Research will need highly motivated and deeply knowledgeable research leads and research directors: practitioners with the expertise and skills to be able to work closely with our clients, to deliver value in this new regime, and to be able to close and manage engagements.
But honestly, if you are interested in a 9-to-5 analyst job — an ivory tower gig updating last year’s report using a five-year-old analytic framework — please don’t even bother. If you contact us, be hungry, and please start by making a list of companies and contacts you think are candidates for Gigaom Research services, lay out a plan of engagement, and we’ll open a mutual dialog with them as soon as practical.
The bottom line: we are rebooting Gigaom Research in an era that demands a new model of operations, one better suited to the times, forces and trends that are shaping the markets and economics that confront our clients. I welcome your involvement, whether you are a change agent at a Fortune 1000 corporation, a product manager at an innovative software start-up, or an underutilized and hungry researcher at one of our competitors.
Let’s get busy.

Research Agenda of Larry Hawes, Lead Analyst

Greetings! As my colleague Stowe Boyd announced yesterday, I am part of a fabulous group of smart, well-respected people who have joined the rebooted Gigaom Research as analysts. I was affiliated with the original version of Gigaom Research as an analyst, and am very pleased to be taking the more involved role of Lead Analyst in the firm’s new incarnation, as detailed in Stowe’s post.
For those of you who don’t know me, I’ve spent the last 16 years working as a management and technology consultant, enterprise software industry analyst, writer, speaker and educator. My work during that time has been focused on the nexus of communication, collaboration, content management and process/activity management within and between organizations — what I currently call ‘networked business’.
I intend to continue that broad line of inquiry as a Lead Analyst at Gigaom Research. The opportunity to work across technologies and management concepts — and the ability to simultaneously address and interrelate both — is precisely what makes working with Gigaom Research so attractive to me. The firm is unusual in that respect, in comparison to traditional analyst organizations that pigeonhole employees into discrete technology or business strategy buckets. I hope that our customers will recognize that and benefit from the holistic viewpoint that our analysts provide.
With the above in mind, I present my research agenda for the coming months (and, probably, years). I’m starting at the highest conceptual level and working toward more specific elements in this list.

Evolution of Work

Some analysts at Gigaom Research are calling this ‘work futures’. I like that term, but prefer the ‘evolution of work’, as that allows me to bring the past and, most importantly, the current state of work into the discussion. There is much to be learned from history and we need to address what is happening now, not just what may be coming down the road. Anyway, this research stream encompasses much of what I and Gigaom Research are focused on in our examination of how emerging technologies may change how we define, plan and do business.

Networked Business

This is a topic on which I’ve been writing and speaking since 2012. I’ve defined ‘networked business’ as a state in which an interconnected system of organizations and their value-producing assets are working toward one or more common objectives. Networked business is inherently driven by connection, communication and collaboration, hence my interest in the topic.
While the concept of networked business is not new, it has been gaining currency in the past few years as a different way of looking at how we structure organizations and conduct their activities. As I noted in the first paragraph of this post, there are many technologies and business philosophies and practices that support networked business, and I will do my best to include as many as possible in my research and discussions.

Networks of Everything

This research stream combines two memes that are currently emerging and garnering attention: the Internet of Things and the rise of robots and other intelligent technologies in the workplace. In my vision, networks of everything are where humans, bots, virtual assistants, sensors and other ‘things’ connect, communicate and collaborate to get work done. The Internet, Web, cellular and other types of networks may be used in isolation or, more likely, in combination to create networks of everything.
I had a book chapter published on this topic earlier this year, and I’m looking forward to thinking and writing more about it in the near future.


Microservices

How do we build applications that can support business in a heavily networked environment? While the idea of assembling multiple technology components into a composite application is not new (object-oriented programming and Service-Oriented Architecture have been with us for decades), the idea continues to gain acceptance and become more granular in practice.
I intend to chronicle this movement toward microservices and discuss how the atomization of component technology is likely to play out next. As always, my focus will be on collaboration, content management and business process management.

Adaptive Case Management and Digital Experience Management

These two specific, complementary technologies have also been gathering more attention and support over the last two years and are just beginning to hit their stride now. I see the combination of these technologies as an ideal enabler of networked business and early exemplars of component architecture at the application level, not the microservice one (yet).
I’ve written more about ACM, but I am eager to expand on the early ideas I’ve had about it working together with DEM to support networked business.

Work Chat

Simply put, I would be remiss to not investigate and write about the role of real-time messaging technology in business. I’ve already called work chat a fad that will go away in time, but it needs to be addressed in depth for Gigaom Research customers, because there are valid use cases and it will enjoy limited success. I will look at the viability of work chat as an extensible computing platform, not just as a stand-alone technology. Fitting with my interest in microservices, I will also consider the role that work chat can play as a service embedded in other applications.
Phew! I’m tired just thinking about this, much less actually executing against it. It’s a full plate, a loaded platter really. The scariest thing is that this list is likely incomplete and that there are other things that I will want to investigate and discuss. However, I think it represents my research and publishing interests pretty well.
My question is, how does this align with your interests? Are there topics or technologies that you would like to see me include in this framework? If so, please let me know in a comment below. Like all research agendas, mine is subject to change over time, so your input is welcomed and valued.

Full duplex may be the next breakthrough in mobile networking

Stanford startup Kumu Networks didn’t receive much notice at Mobile World Congress this week as the giants of the mobile industry revealed their plans for 2015, but it did get the attention of two rather important mobile carriers. At their separate booths, Telefónica and SK Telecom were showing off a Kumu-built radio transmission system called full duplex, which both carriers said could eventually become one of the key technologies of any future 5G standard.

When the mobile companies pull out the 5G card, they’re usually trying to signal that something is a really big deal, and in the case of Kumu, they could very well be right. What full duplex does is solve a fundamental problem in wireless communications that limits a network’s full capacity potential: the inability to transmit and receive signals on a radio channel at the same time. The problem is known as self-interference, but the concept is not quite as complex as it sounds.


Imagine two people having a conversation, which is itself one of the simplest two-way — or duplex — communication channels. If both people are talking at the same time, neither one can understand what the other is saying. The words one person speaks get drowned out by the other’s voice before they ever reach the listener’s ears. The same principle holds for wireless transmissions. When a radio is transmitting, its signals bleed over into its own receiver, interfering with the signals it’s trying to listen for.

For that reason wireless networks have always been built in something called half-duplex mode, which prevents them from ever transmitting and receiving in the same channel at the same time. It’s why most mobile networks in the world today use different sets of frequencies for downlink and uplink transmissions (for instance, in many U.S. LTE systems, our devices receive data from the tower in a 2100 MHz channel but send information back at 1700 MHz). And it’s why a Wi-Fi router flip-flops between transmitting and receiving when it talks to your laptop or smartphone. Half-duplex has served the wireless industry well, but it means you’re only ever using half of the total capacity of your airwaves at any given time.

Kumu Networks is based in Santa Clara, but its roots are in Stanford, where its founders started their full duplex research.



As my colleague Signe Brewster wrote in Gigaom’s first look at the Stanford startup in 2013, Kumu claims to have developed the mathematical breakthrough necessary to solve the problem of self-interference at a practical level. And now it’s claiming to have produced a commercially viable full-duplex radio system that can transmit and receive simultaneously without turning its connection to mush. According to Kumu VP of product development Joel Brand, the company accomplished this by becoming a very smart listener.

Essentially, Kumu’s system is constantly scanning the radio environment, gauging the exact state of the airwaves at any given time, Brand said. Using internally developed algorithms, Kumu can “hear” how the transmission the radio is pumping out is changing the signal environment at the receiver. It can then compensate for those changes as signals heading in the opposite direction arrive. It’s like echo cancellation applied to radio waves instead of sound.
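The “smart listener” idea can be sketched numerically. Below is a minimal, hypothetical illustration (not Kumu’s actual algorithm, which is proprietary) using a standard least-mean-squares (LMS) adaptive filter: because the radio knows exactly what it transmitted, it can learn the leakage path and subtract an estimate of its own signal from what the receiver hears, leaving the weak remote signal behind. All parameters and signal values here are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# The signal our own radio transmits (known to us exactly).
tx = rng.choice([-1.0, 1.0], size=n)
# The weak remote signal we actually want to receive.
remote = 0.05 * rng.choice([-1.0, 1.0], size=n)
# Self-interference: our transmission leaks into our own receiver
# through an unknown coupling channel (here a 3-tap FIR filter).
coupling = np.array([0.9, 0.3, 0.1])
leak = np.convolve(tx, coupling, mode="full")[:n]
rx = leak + remote

# LMS adaptive filter: learn the coupling from the known tx signal
# and subtract the estimated leakage, leaving roughly the remote signal.
taps = 3
w = np.zeros(taps)
mu = 0.01  # adaptation step size
cleaned = np.zeros(n)
for i in range(taps, n):
    x = tx[i - taps + 1:i + 1][::-1]   # most recent tx samples first
    est = w @ x                        # estimated leakage at this sample
    e = rx[i] - est                    # residual: remote signal + error
    cleaned[i] = e
    w += mu * e * x                    # LMS weight update

print("learned taps:", np.round(w, 2))   # converges toward the coupling
print("power before cancellation:", round(float(np.mean(rx**2)), 3))
print("power after cancellation:", round(float(np.mean(cleaned[1000:]**2)), 4))
```

After convergence the residual power is close to the remote signal’s power, meaning most of the self-interference has been removed. In a real radio much of this cancellation must also happen in analog hardware, since the leaked signal can be orders of magnitude stronger than the signal being received.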

Full Duplex demo

Kumu supplied some photos of the full duplex rig it demoed at Mobile World Congress, and I’ll be the first to admit it doesn’t look very impressive. But at MWC I asked Vish Nandlall, CTO of Australian multinational mobile carrier Telstra, about the technology, and he said it was the real deal. Full duplex isn’t some crazy new concept Kumu just made up one day, he said. Full duplex is used today in regular phone lines, and its application to wireless has been kicking around scientific papers and academic research labs for some time. But what Kumu did was come up with a viable technology that could be applied to real-world networks, Nandlall said.

The impact could be quite significant. If you remove the self-interference barrier, carriers could use all of their spectrum for both uplink and downlink at the same time, which would double the capacity or double the number of connections any network could support. Wi-Fi networks would no longer have to alternate between sending data and receiving it, thus dramatically improving their download and upload speeds. It might not solve the so-called spectrum crunch, but it would go a long way to making wireless networks a lot more efficient.

Right now Kumu is pitching the technology to carriers as a backhaul system, so they could use their 4G spectrum to concurrently communicate with phones and the core network. But Brand says in the future full duplex can easily be applied to the access network connecting our devices. In fact, Kumu’s MWC demos were using off-the-shelf radio smartphone chips from Qualcomm, just with the duplexer ripped out. That kind of change would require a redesign of both our networks and our devices, which isn’t going to happen overnight. That’s why Kumu and its carrier partners Telefónica and SK Telecom are looking ahead to 5G.


Scientists say tweets predict heart disease and community health

University of Pennsylvania researchers have found that the words people use on Twitter can help predict the rate of heart disease deaths in the counties where they live. Places where people tweet happier language about happier topics show lower rates of heart disease death when compared with Centers for Disease Control statistics, while places with angry language about negative topics show higher rates.

The findings of this study, which was published in the journal Psychological Science, cut across fields such as medicine, psychology, public health and possibly even civil planning. It’s yet another affirmation that Twitter, despite any inherent demographic biases, is a good source of relatively unfiltered data about people’s thoughts and feelings, well beyond the scale and depth of traditional polls or surveys. In this case, the researchers used approximately 148 million geo-tagged tweets from 2009 and 2010 from more than 1,300 counties that contain 88 percent of the U.S. population.
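As a rough sketch of how a county-level language score might be built from geo-tagged tweets: the toy example below scores tweets against small positive and negative word lists and aggregates by county. The actual study used far richer features (dictionaries and statistical topics), and the word lists, tweets, and county FIPS codes here are invented.

```python
# Toy word lists standing in for the study's language dictionaries.
POSITIVE = {"great", "friends", "wonderful", "happy"}
NEGATIVE = {"hate", "angry", "alone", "jealous"}

def county_sentiment(tweets):
    """Fraction of sentiment-bearing words that are positive; 0.5 if none."""
    pos = neg = 0
    for text in tweets:
        for word in text.lower().split():
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    total = pos + neg
    return pos / total if total else 0.5

# Hypothetical data: geo-tagged tweets grouped by county FIPS code.
tweets_by_county = {
    "42101": ["had a wonderful day with friends", "so happy today"],
    "42003": ["i hate this traffic", "angry and alone again"],
}
scores = {fips: county_sentiment(t) for fips, t in tweets_by_county.items()}
print(scores)
```

Scores like these, computed per county, could then be compared against CDC age-adjusted heart-disease mortality rates, for example via a simple correlation.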

(How to take full advantage of this glut of data, especially for business and governments, is something we’ll cover at our Structure Data conference with Twitter’s Seth McGuire and Dataminr’s Ted Bailey.)


What’s more, at the county level, the Penn study’s findings about language sentiment turn out to be more predictive of heart disease than any other individual factor — including income, smoking and hypertension. A predictive model combining language with those other factors was the most accurate of all.
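The claim that a combined model outperforms any single predictor can be illustrated with a small least-squares sketch. The data below is synthetic and invented for illustration; it simply simulates a world where mortality depends most strongly on sentiment, as the study reports.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # hypothetical counties

# Synthetic stand-ins for county-level predictors (not the study's data).
income = rng.normal(0, 1, n)
smoking = rng.normal(0, 1, n)
sentiment = rng.normal(0, 1, n)
# Simulated heart-disease mortality, driven most strongly by sentiment.
mortality = (-0.6 * sentiment + 0.3 * smoking - 0.2 * income
             + rng.normal(0, 0.5, n))

def r_squared(X, y):
    """Fit y ~ X by ordinary least squares and return in-sample R^2."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_sent = r_squared(sentiment, mortality)
r2_demo = r_squared(np.column_stack([income, smoking]), mortality)
r2_all = r_squared(np.column_stack([income, smoking, sentiment]), mortality)
print("sentiment alone:", round(r2_sent, 2))
print("other factors alone:", round(r2_demo, 2))
print("combined model:", round(r2_all, 2))
```

In this simulation, as in the study, sentiment alone explains more variance than the other factors, and the combined model explains the most of all.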

That’s a result similar to recent research comparing Google Flu Trends with CDC data, although it’s worth noting that Flu Trends is an ongoing project that has been collecting data for years, and that the search queries it collects are much more directly related to influenza than the Penn study’s tweets are to heart disease.

That’s likely why the Penn researchers suspect their findings will be more relevant to community-scale policies or interventions than anything at an individual level, despite previous research that shows a link between emotional well-being and heart disease in individuals. Penn professor Lyle Ungar is quoted in a press release announcing the study’s publication:

“We believe that we are picking up more long-term characteristics of communities. The language may represent the ‘drying out of the wood’ rather than the ‘spark’ that immediately leads to mortality. We can’t predict the number of heart attacks a county will have in a given timeframe, but the language may reveal places to intervene.”

The researchers’ work is part of the university’s Well-Being Project, which has also used Facebook users’ language to build personality profiles.


FCC starts poking around for future 5G airwaves

5G research is gaining momentum worldwide, and it will require new spectrum. Enter the FCC. Chairman Tom Wheeler wants to start investigating the future cellular technology from a regulatory standpoint.