Simon Segars, the new CEO of ARM, is speaking at Mobilize. Given ARM’s role as the silicon underpinning of the internet of things and the mobile world, Segars can share what the future will hold.
What kind of amazing new things are going to happen in tech over the next 365 days? Digital design agency Fjord got out its crystal ball and let us have a preview of its annual prediction of the most important themes in tech next year.
With the explosion of low-cost chips, from Intel’s Atom processor to low-power Wi-Fi sensors, just about everything is “getting smart” these days. There are known environmental benefits to this kind of cheap-and-easy digital intelligence, many of them heavily promoted by IBM as part of its “Smarter Planet” initiative. There’s the smart grid, of course, which adds data-rich intelligence to the energy system, but there’s also smart water, smart transportation (including rail, electric vehicles and traffic), and even smart garbage. It’s what Intel regularly describes as the “2 Percent, 98 Percent” rule — that operating IT contributes some 2 percent of global carbon emissions, but IT can be used to minimize the other 98 percent.
But there’s a slightly brownish tinge to all this greener-through-IT talk. The widely cited 2 percent figure only looks at the energy impacts of IT equipment as it’s being operated, not as it’s being manufactured. That’s what’s known as embedded, or embodied, energy. And depending on who you ask, manufacturing can be a major piece of the puzzle — between 75 and 85 percent of a device’s lifecycle energy, according to some research. In 2005, the Silicon Valley Toxics Coalition, an environmental watchdog group for the high-tech industry, estimated that a single fab could consume as much energy as a 60,000-person city. Which raises the question: Will semiconductor manufacturing outweigh the environmental benefits of the “smarter planet”? Read more in “Are Smarter Gadgets Really Good for the Planet?”
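To see why the embodied-energy caveat matters, here’s a rough back-of-the-envelope sketch combining the two figures quoted above. It assumes (purely for illustration) that the 2 percent operational figure and the 75–85 percent embodied fraction apply uniformly across IT as a whole, which is a big simplification:

```python
# Back-of-the-envelope estimate: if operating IT accounts for ~2% of
# global carbon emissions, but operation is only 15-25% of a device's
# lifecycle energy (manufacturing being 75-85%), the full lifecycle
# footprint could be several times the headline figure.
operational_share = 0.02  # Intel's widely cited "2 percent" figure

for embodied_fraction in (0.75, 0.85):
    # Scale the operational share up by the operational fraction of
    # lifecycle energy to estimate the total footprint.
    total = operational_share / (1 - embodied_fraction)
    print(f"embodied {embodied_fraction:.0%} -> "
          f"total ~ {total:.1%} of global emissions")
```

Under those assumptions, the lifecycle footprint lands somewhere around 8 to 13 percent of global emissions — a far less flattering number than 2 percent, which is exactly the critics’ point.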
With all the hype about cloud computing, it’s easy to dismiss it as the latest fad, especially when every company whose application touches the Internet is trying to rebrand itself as a cloud. But the long view shows that this really is an important change, one of several major shifts in computing that have taken place over the last 40 years, each of them driven by costs and shortages.
Once upon a time, computing was expensive. As a result, programmers carried their stacks of punched cards into basements late at night, and ran them on the mainframe. The CPU was always busy; humans were cheap.