AT&T’s Cloud Offering Is Foggy

Telecom giant AT&T announced its version of a cloud computing service today, called Synaptic Hosting, but in a move that muddies the waters (and perhaps keeps enterprise customers happy), it describes the effort as everything from utility computing to a hosting solution. I’m not sure if the entire service counts as a cloud, but AT&T does say that it will “support large-scale computing and applications on demand via virtualized servers and deliver services across AT&T’s Internet Data Center hosting infrastructure.”

So despite the words hosting and utility computing peppered throughout the announcement, it does seem that somewhere in there is a cloud. My guess is that a whole range of services is being offered here, all with an AT&T service-level agreement. That could interest cloud-leery enterprise customers.

The key advantage of AT&T’s service is that it controls not just the servers and the cloud; it also owns the network those bits of data must traverse to get from the cloud to your computer. That’s a powerful proposition, because it puts one more potential point of failure under AT&T’s guarantee and control. It could also lead to some interesting pricing options, given that AT&T knows exactly what each byte of storage and each compute cycle costs it, as well as the wholesale cost of bandwidth.

Want to define the cloud? Check out these posts for some help:

The Myth of No Software

The debate around cloud computing and software-as-a-service (SaaS) has energized industry conversations on the future of software. But in fact what we are witnessing in the software industry today is not a revolution, but an evolution. Customers are most concerned with how to use software to sustain competitive advantage, align IT with the business and deliver the best experience for users without compromise — regardless of delivery option — whether that is SaaS, on-premise software or a combination of the two. That’s why this evolution of software in a services world is so important for the industry to broadly support, and why customers deserve more than all-or-nothing ultimatums. For more, see Refresh the Net.

Other infrastructure-themed stories that may be of interest:

The Long Tail of IT
Subscription Services: The Future of Our Entire Economy
Architecting for Failure
Five Nines is Still Not Enough
Do You Know What Kind of Cloud You’re Using?
Defogging Cloud Computing: A Taxonomy
The Craft: Automation and Scaling Infrastructure
Is Infrastructure the New Marketing Medium?
Achieving Equality is Critical to the Future of the Internet
Why Google Needs its Own Nuclear Plant
The Geography of Internet Infrastructure

Could Climate Change Lead to Computing Change?

Over at Earth2Tech, I wrote about an effort to use millions of specialized embedded processors to build a (relatively) energy-efficient supercomputer that could run at speeds of up to 200 petaflops. The Department of Energy’s Lawrence Berkeley National Laboratory has signed a partnership with chip maker Tensilica to research building such a computer, but after chatting with Chris Rowen, Tensilica’s CEO, I wonder if more specialized computing tasks in the data center might be farmed out to highly customizable — but lower-powered — chips.


Samsung Says Thin Is In

The memory business is a volatile one, driven by consumer demand for products like MP3 players and rapid obsolescence. That’s why the gradual move of solid-state storage drives based on NAND flash memory into the PC is so interesting. Now that those drives are bigger, at 64GB and soon 128GB, memory makers can flatten out some of the volatility seen in the consumer market by putting them into corporate laptops where demand is less influenced by economic cycles.

Most solid-state memory for PCs ends up in rugged or sexy high-end laptops such as the new MacBook Air, which is offered with either an 80GB hard drive or a 64GB SSD, and the Lenovo x300, which comes with a 64GB solid-state drive made by Samsung. The lack of moving parts makes a solid-state drive much more durable for rugged machines and the smaller size of flash drives means they can allow for thinner, lighter laptops.

In addition to revealing that its solid-state drive was in the x300, Samsung has unveiled a traditional 500GB hard drive that crams three disks into a 9.5 mm-high drive. Andy Higginbotham (no relation), director of hard drive sales and marketing at Samsung, says this gives Samsung a leg up on density, as the competition can only fit two disks in that space.

And if a user pops two of these in a notebook, he added, suddenly they’re walking around with a terabyte of storage (enough for 120 hours of HD video or 320,000 images). In a laptop. Think about how much confidential data someone could store on it, only to have it stolen out of their car. It boggles the mind.
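Those figures imply per-item sizes worth sanity-checking. A quick back-of-envelope sketch (the GB-per-hour and MB-per-image rates below are derived from the article’s totals, not stated in it, and a decimal terabyte is assumed, as drive makers count):

```python
# Back-of-envelope check of the terabyte claims above.
# Assumes decimal units, the convention drive makers use.
TB = 1_000_000_000_000  # bytes in one terabyte
GB = 1_000_000_000
MB = 1_000_000

# Implied rates if 1 TB really holds 120 hours of HD video
# and 320,000 images:
video_gb_per_hour = TB / GB / 120      # ≈ 8.3 GB per hour of HD video
image_size_mb = TB / MB / 320_000      # ≈ 3.1 MB per image

print(round(video_gb_per_hour, 1))  # 8.3
print(round(image_size_mb, 1))      # 3.1
```

Roughly 8 GB per hour and 3 MB per photo are plausible for 2008-era HD encodes and point-and-shoot JPEGs, so the claim holds up.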