Thursday, September 13, 2012

Analysis in light of the Pareto Principle

Many businesses that are not getting as much value out of big data as they would like blame inadequate hardware and inadequate budgets. However, in a Smart Data Collective post, Paige Roberts argues that it's not the hardware, but the software that's to blame.
For businesses that find their current setups inadequate, Roberts writes, "Investing in better utilization of existing hardware is a far better, more sustainable, and cost-effective solution." She points to the inefficiency built into current "utilization rates of hardware [that] are around 15 percent worldwide." Even the most efficient data centers max out at only 20 percent, meaning that 80 percent of capacity goes untapped.
Do those numbers ring a bell?

Read more: What's the real problem with the hardware? - FierceBigData