While the cost of computing power falls every year, the cost of energy does not. Energy is expected to account for a growing share of the cost of owning and running a computer.

The Digital Power Group estimates that the world’s ICT ecosystem uses around 1 500 TWh of electricity, which is equal to the entire electric generation of Japan and Germany combined – and as much electricity as was used for global illumination in 1985¹. Coal is the world’s largest source of electricity.

Is the cloud the answer? The energy cost of transferring data can be high, and the installation and electricity costs of running data centres are steep. The power used by PCs, tablets and smartphones to download data from the cloud must also be factored in.

But by optimising the use of both equipment and cooling (which eats up a significant share of the energy consumed – from data centre air conditioning to a PC’s fan), the cloud can help reduce the environmental impact of computing. And the advantages of pooling are growing: technological developments mean that one server is now able to carry out multiple workloads simultaneously.
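The consolidation argument above can be sketched with some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not from the article: a fleet of lightly loaded dedicated servers versus the same workloads pooled onto fewer, busier virtualised hosts.

```python
# Hypothetical illustration: energy saved by consolidating lightly
# loaded servers onto fewer virtualised hosts. All numbers are
# assumptions for the sake of the example.

def annual_energy_kwh(n_servers, avg_power_watts, hours=8760):
    """Total annual electricity use (kWh) for a fleet of servers."""
    return n_servers * avg_power_watts * hours / 1000

# Before: 10 dedicated servers, each lightly utilised, drawing ~200 W.
before = annual_energy_kwh(10, 200)

# After: the same workloads pooled onto 3 virtualised hosts,
# each running hotter (~300 W) but doing the work of all 10.
after = annual_energy_kwh(3, 300)

print(f"before: {before:.0f} kWh/yr, after: {after:.0f} kWh/yr")
print(f"saving: {100 * (1 - after / before):.0f}%")
```

Under these assumed figures, pooling cuts annual consumption by roughly half, before counting the matching reduction in cooling load.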

  1. The Cloud Begins with Coal, http://www.tech-pundit.com/wp-content/uploads/2013/07/Cloud_Begins_With_Coal.pdf?c761ac&c761ac


(Article from net-cloud future magazine, 2013)