Supercomputers these days are compute monsters. IBM’s latest, the Roadrunner, packs the power of 100,000 laptops stacked 1.5 miles high, embraces a unique mix of IBM’s Cell processors and ubiquitous x86 chips from AMD, and can calculate 1,000 trillion operations per second. Of course, trends in supercomputing generally trickle downstream to the rest of the computer-using population eventually.
Nvidia and AMD today each launched two graphics chips for the PC market — but the two companies are pursuing divergent strategies. Both share a recent focus on high-end graphics, which underlines how important visual computing has become; but the different approaches may cost Nvidia market share if its monolithic high-end chips can’t deliver the graphics punch to compete with the multi-GPU strategy embraced by AMD and Intel.
Nvidia launched its GTX 280 and GTX 260, large monolithic processors that put many cores on a single chip. AMD, on the other hand, has taken a bottom-up approach with smaller multi-core chips that can be paired with a second graphics chip on the same board to deliver higher-end performance. Lower-end PCs can rely on one AMD processor, while those needing more power can turn to two AMD chips or Nvidia’s single, high-powered chip.
The real question is how the graphics will look on the screen. And, as with most chip releases, the proof will be a while in coming. Nvidia already has HP signed up to use its new chip in a new Voodoo desktop built especially for gaming. That makes sense. Nvidia’s chip will rock high-end applications, while AMD’s is designed to provide compelling imagery for cheaper, power-efficient PCs and laptops at scale. The real battle will be whether AMD’s dual-chip strategy takes business away from Nvidia for specialty graphics computers and high-performance technical computing. If that occurs, Nvidia will have to be on guard: Intel is planning to follow the same dual-chip path with its Larrabee GPUs.
Nvidia co-founder and CEO Jen-Hsun Huang thinks mobile is the future, and he’s positioning the graphics chip maker to get a piece of that pie. But media-enamored consumers are keeping the company plenty busy in the meantime. In this interview, Huang talks about mobile, graphics — and whoop-ass.
The Tegra chipsets are based on the APX 2500 processor built for personal media players and navigation devices, but Tegra’s target will be portable computers with screen sizes ranging from 4 to 12 inches. Pay close attention to news coming out of the Computex trade show in Taiwan this week, where more details should emerge from vendors using the Tegra chipset. Products based on Tegra will be out in time for the holiday season at the end of the year and will cost about $200 to $250.
These days, thanks to a visually intensive style of computing, a good GPU can do more to improve the user experience than a fast CPU. In the data center, certain tasks are moving from commodity CPU boxes to GPUs, which means that over the next year or two, more GPUs will be sold for corporate computing use.
Nvidia has plans for a mobile chipset that will change the look and functionality of smartphones when it hits in mid-to-late 2009. While many of the big chip vendors are placing bets on the concept of a mobile Internet device (MID) that’s larger than a smartphone but smaller than a laptop, Nvidia’s APX 2500 chips could enable devices so sexy they might render MIDs obsolete.
However, I’m told the company will announce an expansion of the APX chips into MIDs soon, so I could be wrong on that last point. Nvidia launched the chips that will make a smartphone function like a PC (or an iPhone) at the Mobile World Congress in February, and I can’t believe I missed it.
The two companies that make the brains found in today’s computers, Intel and AMD, are both pushing hard to get into graphics, just as the top graphics chip maker, Nvidia, is aiming squarely at the CPU space. It’s not an identity crisis so much as a testament to how important graphics have become in the consumer computing experience — and how much money can be made crunching numbers on the corporate side.