A new device that can efficiently and cheaply turn waste heat into electricity is now on sale, aimed at the big energy companies working in ultra-remote locations.
This is Jonathan Koomey’s fourth essay in a series of four this week that highlights, and excerpts from, his upcoming book, “Cold Cash, Cool Climate,” which discusses how entrepreneurs and investors can profit from tackling climate change, one of this century’s greatest challenges.
Who says a solar company has to choose between conventional silicon and thin-film solar cells? RoseStreet Labs Energy, a Phoenix-based private company, is combining the two in a double-layered cell that it claims can achieve “practical efficiencies” – meaning efficiencies of cells actually available on the market, not just in the lab – of 25 to 30 percent. On Monday, the company announced the world’s first (or as RoseStreet put it, the “first known”) nitride/silicon tandem solar cell, which it plans to produce in the fourth quarter of next year.
The potential efficiency might not sound breathtaking considering that the National Renewable Energy Laboratory and the Defense Advanced Research Projects Agency both announced last year that they had produced cells that achieved more than 40 percent efficiency in the lab. But lab efficiencies and production efficiencies are not the same thing, and the highest efficiencies for mass-produced solar cells hover around 22 percent. SunPower Corp., which introduced 22-percent efficiency monocrystalline cells in 2007, last year announced it had produced a prototype with 23.4 percent efficiency, which it expects to launch commercially next year.
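The arithmetic behind tandem stacking is simple to illustrate. Below is a toy model, not RoseStreet's actual design: all the percentages are hypothetical illustration values. The idea is that the top nitride layer converts the high-energy part of the spectrum and passes the remainder through to the silicon layer beneath, so the combined cell can beat either layer alone.

```python
# Toy model of a two-junction tandem solar cell. All numbers are
# illustrative, not RoseStreet's actual cell parameters.

def tandem_efficiency(top_eff, top_absorbed, bottom_eff):
    """Fraction of total incident power converted to electricity.

    top_eff: top cell's conversion efficiency on the light it absorbs
    top_absorbed: fraction of incident power the top layer absorbs
    bottom_eff: bottom cell's efficiency on the transmitted remainder
    """
    return top_absorbed * top_eff + (1 - top_absorbed) * bottom_eff

# Hypothetical: the nitride layer absorbs 40% of the incident power at
# 35% efficiency; silicon converts the transmitted 60% at 20% efficiency.
combined = tandem_efficiency(0.35, 0.40, 0.20)
print(f"{combined:.0%}")
```

Even with each layer individually below the headline figure, the stacked total in this toy example lands in the mid-20s, which is the intuition behind the "practical efficiencies" claim.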
We’ve been following the Always Innovating Touch Book since early this year, and now the web tablet with the detachable keyboard is supposedly shipping, although we haven’t heard from anyone who’s received one yet. The folks at Always Innovating have posted some screenshots of the Touch Book in action to give a feel for the UI they’ve developed, and it looks pretty sweet. The UI changes when the keyboard is detached in order to make it a touch-friendly system. We’re expecting an evaluation unit at some point; we’ll let you know when it arrives.
This post has been updated multiple times as new stats have come in.
While it may not have topped the Obama Inauguration, as some expected, viewership for Michael Jackson’s memorial service today was immense. Here are some of the initial stats.
Akamai says via email that it had its second-largest day ever of total traffic, after the inauguration. The CDN delivered more than 2,185,000 live and on-demand streams, with traffic topping 2 terabits per second during the service. For the inauguration, Akamai had a peak of 7 million active simultaneous streams, again with more than 2 terabits per second of traffic.
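For a rough sense of scale, the inauguration figures imply an average bitrate per viewer. A quick back-of-the-envelope calculation, assuming the 2 Tbps peak and the 7 million simultaneous streams coincided:

```python
# Back-of-the-envelope average bitrate per stream, from the article's
# inauguration figures: ~2 Tbps of traffic across ~7 million live streams.

peak_traffic_bps = 2e12       # 2 terabits per second
concurrent_streams = 7e6      # 7 million simultaneous streams

kbps_per_stream = peak_traffic_bps / concurrent_streams / 1e3
print(f"~{kbps_per_stream:.0f} kbps per stream")
```

That works out to a bit under 300 kbps per stream, consistent with the low-resolution live video typical of big webcast events at the time.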
Between 12 a.m. and 4 p.m. EDT today, CNN.com had 72 million global page views, 10.8 million unique visitors and 8.9 million live video streams, according to Omniture. The site had 781,000 peak concurrent live streams, according to server logs. Update: New total numbers through 5 p.m. are 81 million page views, 11.8 million unique visitors and 9.7 million live video streams. (By contrast, CNN delivered more than 25 million streams in the 12 hours surrounding the Obama inauguration, with 1.3 million concurrent live streams just before Obama’s address.)
However, people did have a lot to say about Jackson, and early reports are showing significant and possibly record-breaking levels of interaction with live video feeds. Facebook reported 300,000 users logged in through its integration with CNN.com as of 10:30 a.m. (when the service started), with 500,000 status updates total and approximately 6,000 status updates per minute at that time. For the election, the integration had produced 4,000 status updates per minute, with a peak of 8,500 statuses per minute.
UPDATE: More stats are coming in.
A world in which networks stream feature-length Hulu movies to laptops and virtual worlds to PCs requires a whole lot of power to run the servers, routers, desktop computers and other gear that make it all possible. The electricity used by servers alone doubled between 2000 and 2005, to about 123 billion kilowatt-hours, and if current trends continue, data-center power use is likely to increase another 76 percent by 2010, according to Jonathan Koomey, a researcher at Lawrence Berkeley National Laboratory and Stanford University. But in the face of rising power costs and increased attention on fighting climate change, network hardware makers and service providers are starting to work on curbing networks’ energy use.
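A quick sanity check of those figures, assuming the projected 76 percent growth applies to the 2005 baseline of 123 billion kilowatt-hours:

```python
# Projecting data-center server electricity use from the article's figures.
# Assumes the 76% growth applies to the 2005 figure of 123 billion kWh.

kwh_2005 = 123e9          # ~123 billion kWh used by servers in 2005
growth = 0.76             # projected growth from 2005 to 2010

kwh_2010 = kwh_2005 * (1 + growth)
print(f"Projected 2010 use: {kwh_2010 / 1e9:.0f} billion kWh")
```

In other words, roughly 216 billion kilowatt-hours by 2010 if the trend holds.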
The first step to overcoming an energy addiction is acknowledging the problem — and in the IT industry that means launching industry standards. Last week, network performance testing company Ixia (XXIA), network gear maker Juniper (JNPR) and Lawrence Berkeley National Laboratory launched the Energy Consumption Rating (ECR) Initiative, an open standards-based project aimed at creating energy-efficiency metrics for network and telecom devices. It’s one of the first coordinated efforts to develop such metrics and could garner the support of industry players who’ve recently begun their own energy rehab efforts.
The Initiative is inviting network industry vendors, service providers and other standards bodies to help it establish benchmark metrics for how energy-efficient (or not) network hardware is. The group is working on repeatable measurements that report energy performance in units of “watts per gigabit per second,” a sort of miles per gallon for tech gear. The measurements are among the most detailed out there, covering energy performance across different operating states, from active processing to idle. As Ixia CEO Atul Bhatnagar said in a phone call, the goal is to set standards that can verify device efficiency through measurement.
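To make the metric concrete, here is a minimal sketch of how a “watts per gigabit per second” rating might be computed across operating states. The duty-cycle weighting and the device numbers below are hypothetical illustrations, not taken from the ECR specification:

```python
# Sketch of an ECR-style "watts per gigabit per second" rating.
# The state weights and device figures are hypothetical, not from ECR.

def ecr_rating(states, weights):
    """Average power draw divided by peak throughput (W per Gbps).

    states: dict mapping state name -> (power_watts, throughput_gbps)
    weights: dict mapping state name -> fraction of time in that state
    """
    peak_gbps = max(gbps for _, gbps in states.values())
    avg_watts = sum(weights[s] * watts for s, (watts, _) in states.items())
    return avg_watts / peak_gbps

# A hypothetical edge router: 350 W moving 40 Gbps at full load,
# but still drawing 200 W while idle.
router = {
    "active": (350.0, 40.0),
    "idle":   (200.0, 0.0),
}
duty = {"active": 0.3, "idle": 0.7}   # assumed 30% active, 70% idle

print(f"{ecr_rating(router, duty):.3f} W per Gbps")
```

Note how the idle state dominates the rating in this sketch: a device that sips power when idle scores far better than one that draws near-peak power around the clock, which is exactly the behavior such a metric is meant to reward.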
The ECR Initiative is still very much a work in progress; it’s more of a “call to action” than a set of instructions on what network makers should do. So far, ECR doesn’t have plans to label gear with an ECR certification to let service operators identify energy-efficient devices for purchase — Ixia’s Bhatnagar said it’s still too soon in the process. More than anything, it’s an acknowledgment that power is a significant issue for the networking and telecom industries.
Both telecom and computing gear companies, as well as service providers, have started to show signs that they’re interested in more energy-efficient networks. Earlier this year, Verizon (VZ) announced its own energy-efficiency standard for telecom gear, aimed at reducing hardware power use 20 percent, industrywide, starting in 2009. Ericsson (ERIC) partnered with a designer to develop a cellular network base station model that would use 40 percent less energy. BT says it’s incorporating energy-efficiency requirements into contracts with suppliers as it builds out next generation networks. On the server front, tech industry behemoths like Google, Yahoo, Sun and Cisco are all working on different ways to reduce data center energy use.
Why are network operators and hardware makers starting to jump on this trend? Largely because of the bottom line. For service providers, using energy-efficient network hardware can be a solid way to cut costs. Google engineers have famously reported that the cost of the power to run data centers is rivaling the cost of buying the data center gear; because the search engine giant uses so much power, it has a major data center efficiency plan underway. Telecom network operators could find similarly positive power savings with energy-efficient network equipment. For the network gear makers, energy-efficient products are a way to differentiate themselves in a largely commoditized industry.
And here’s one silver lining to the recent financial storm clouds: Energy-efficient network technologies will likely get a boost. Networking companies will be looking to cut costs even deeper than in the past, and that means cutting power use.
This article also appeared on BusinessWeek.com
While electronics maker Philips (PHG) is working on getting its LEDs into homes, the company is also starting to focus on the bigger picture of energy-efficient homes. The Amsterdam-based company announced a new partnership this week with the Lawrence Berkeley National Laboratory to jointly research energy-efficiency solutions for buildings. Improving lighting efficiency using integrated smart wireless devices will be the partnership’s first objective, and consumers could see a commercial product as soon as next year, Philips says.
This is a great partnership that could help deliver the next generation smart home. Philips is one of the world’s largest producers of lighting and has been pushing hard to bring LEDs to the mass market. Meanwhile, Berkeley National Lab has been researching smart energy for homes and buildings at its Environmental Energy Technologies Division.
Nothing the partners are proposing is that groundbreaking — there have been motion sensors to control lights for years — but this news shows that device makers like Philips are starting to “get it.” Philips is starting with lighting and building systems, but hopefully it won’t take long for the company to smarten up its whole line of electronics, everything from giant flat-screen TVs down to Sonicare toothbrushes.
This could prove to be both competition and a strong new partner for the startups developing energy efficiency devices for the home. The Dutch electronics giant has been buying up LED companies — perhaps smart energy startups are next.
If you found this post interesting you might also be interested in Earth2Tech’s first Briefing, The Smart Energy Home.
Coming soon to a city near you — more power outages! As temperatures soar across the nation, a report published today in the Journal of Applied Meteorology and Climatology details research from scientists at Lawrence Berkeley National Laboratory, which estimates that electricity demand could outstrip supply by as much as 17 percent on the hottest days in the coming decades. Co-author Norman Miller said in a statement: “Climate warming across the western U.S. could further strain the electricity grid, making brownouts or even rolling blackouts more frequent.” Cue ominous music.
What’s most discouraging about the report is that it doesn’t offer any solutions other than the stock “energy conservation and emissions reductions.” Yawn. Of course those would alleviate many of the problems of global warming, but this report addresses the specific problem of peak power demand. And there are several clean technologies that specifically address peak demand as well.
Just when I think I’ve seen everything that you can do in Firefox, along comes an add-on like Pencil (Firefox 3 required). By coupling the Gecko drawing engine with the ability to display, save, and load an external canvas, together with a palette of shapes, Pencil installs an entire drawing application into your Firefox browser.
If you’ve worked with Visio or OmniGraffle, you’ll have the general idea: drag shapes to the canvas, move and resize them with the mouse, set their properties. There are basic shapes and a variety of UI widgets available. Pencil supports undo and redo, scaling, rotating, setting the z-order, adding external text and graphics, and so on. If nothing else, this means that you now have a tool for quickly prototyping new UI ideas wherever you have a browser handy.
Over at Earth2Tech, I wrote about an effort to use millions of specialized embedded processors to build a (relatively) energy-efficient supercomputer that could run at speeds of up to 200 petaflops. The Department of Energy’s Lawrence Berkeley National Laboratory has signed a partnership with chip maker Tensilica to research building such a computer, but after chatting with Chris Rowen, Tensilica’s CEO, I wonder if more specialized computing tasks in the data center might be farmed out to highly customizable — but lower-powered — chips.