Huawei has become an official partner of CERN openlab, with the physics research facility giving the thumbs-up to the Chinese firm’s exascale-targeting, mass object-based storage infrastructure.
An effort to build a telescope that can see back 13 billion years, nearly to the birth of the universe, is prompting a five-year, €32 million ($42.7 million) push to create a low-power supercomputer and the networks needed to handle the data the new telescope will generate.
We are moving from the Information Age to the Insight Age, and as part of that shift we need a compute architecture that can handle the required storage and processing without hooking a power plant up to every data center. Which architecture will win?
The brains inside your smartphone are getting more powerful, with the latest application processors packing two processing cores to speed up web page loads and mobile gameplay. That’s awesome, but startup Adapteva wants to take that number higher.
President Obama’s budget is asking for $126 million for the Department of Energy to reach a supercomputing milestone — exascale performance. Research paid for by these millions could create more power-efficient silicon and networking technologies that will benefit information technology in general. Plus we’d get faster supercomputers.
Supercomputer experts, including the chief information officer of NASA’s Ames Research Center and a computer strategist for the U.S. Army’s research and development center, said that scientists are still working toward an “exascale” computer — one that can perform a million trillion (a quintillion) calculations per second.