With all the hype about cloud computing, it’s easy to dismiss it as the latest fad, especially when everyone whose application talks to the Internet is trying to rebrand themselves as a cloud. But the long view shows that this really is an important change, one of several major shifts in computing over the last 40 years, each driven by the costs and scarcities of its era.
Once upon a time, computing was expensive. As a result, programmers carried their stacks of punched cards into basements late at night and ran their jobs on the mainframe. Machine time was the scarce resource, so the CPU was kept busy around the clock; human time, by comparison, was cheap.