One of the drivers for virtualization has always been the desire to use resources more efficiently by increasing the utilization of all those servers sitting in the data center. And yet, utilization rates remain lower than might be expected. In a conversation with Abiquo CEO Pete Malcolm this week, he cited figures suggesting that “ideal” utilization of storage is about 70%. In non-virtualized environments, actual utilization is around 25%, and even in virtualized environments it typically rises only to about 35%. The figures for servers are worse: the ideal is again about 70%, but non-virtualized usage is typically only 10%, rising to around 27.5% with virtualization. Are IT managers too cautious to push those utilization rates higher, or is something else going on?
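To put those figures in concrete terms, here is a quick back-of-the-envelope sketch (my own illustration, not from Malcolm's data) of how much more capacity is deployed than would be needed at the "ideal" 70% utilization rate:

```python
# Illustrative arithmetic only, using the utilization figures cited above.
# At a given utilization rate, the over-provisioning factor relative to the
# "ideal" rate is simply ideal / actual.

IDEAL = 0.70

def overprovisioning_factor(actual: float, ideal: float = IDEAL) -> float:
    """How many times more capacity is deployed than would be needed
    if it were run at the ideal utilization rate."""
    return ideal / actual

# Server figures cited in the article: 10% non-virtualized, 27.5% virtualized.
print(f"Non-virtualized servers: {overprovisioning_factor(0.10):.1f}x capacity deployed")
print(f"Virtualized servers:     {overprovisioning_factor(0.275):.2f}x capacity deployed")
```

On those numbers, a non-virtualized estate carries roughly seven times the capacity it strictly needs, and even a virtualized one carries about two and a half times.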