Energy Consumed by Cloud Computing

457 Words Feb 17th, 2018 2 Pages
It was estimated that data centers in the USA consumed 61 billion kWh of electricity in 2006, roughly double their 2000 consumption, and usage was expected to double again by the end of 2011 [1]. Our in-depth empirical characterization yields fundamental insights into server energy usage in the context of server virtualization. First, a significant amount of power is consumed even when a server is idle, which opens an opportunity for server consolidation in data centers to reduce energy cost. Second, a recent study [2] indicated that a server's energy consumption is approximately linear in its CPU utilization, and this holds for both DVFS-enabled and DVFS-disabled servers. Two factors explain this: (1) modern servers are equipped with multi-core processors or multiple processors, so the CPU is the major energy consumer in the server; (2) although the CPU can lower its frequency to reduce power consumption, the number of frequency states is limited, and DVFS is not applied to server components other than the CPU.

The optimal placement of virtual machines is important for improving power efficiency and resource utilization in a cloud computing environment. Virtualized servers consume more energy than physical ones, for both computing and networking.
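The linear relationship described above is often written as P(u) = P_idle + (P_peak − P_idle) · u, where u is CPU utilization. A minimal sketch of this model, using illustrative wattage values rather than measured ones:

```python
def server_power(cpu_util, p_idle=100.0, p_peak=250.0):
    """Estimate server power draw (watts) from CPU utilization in [0, 1].

    Linear model: P(u) = P_idle + (P_peak - P_idle) * u.
    p_idle and p_peak here are illustrative placeholder values,
    not measurements from any specific server.
    """
    if not 0.0 <= cpu_util <= 1.0:
        raise ValueError("cpu_util must be in [0, 1]")
    return p_idle + (p_peak - p_idle) * cpu_util


# Even at 0% utilization the server draws p_idle watts, which is
# the observation that motivates consolidating workloads onto fewer hosts.
print(server_power(0.0))   # idle draw
print(server_power(1.0))   # peak draw
print(server_power(0.5))   # mid-range utilization
```

Note that the large idle term dominates at low utilization, which is why two half-loaded servers cost more energy than one fully loaded server under this model.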
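The consolidation opportunity mentioned above can be treated as a bin-packing problem: pack VM demands onto as few hosts as possible so idle machines can be powered down. A minimal sketch, assuming a simple first-fit-decreasing heuristic and demands expressed as fractions of one host's CPU capacity (the source does not prescribe a specific placement algorithm):

```python
def consolidate(vm_demands, host_capacity=1.0):
    """Pack VM CPU demands onto as few hosts as possible.

    First-fit-decreasing heuristic: sort demands from largest to
    smallest, then place each on the first host with room, opening
    a new host only when none fits. Returns a list of hosts, each
    a list of the demands placed on it.
    """
    hosts = []
    for demand in sorted(vm_demands, reverse=True):
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)
                break
        else:  # no existing host had room
            hosts.append([demand])
    return hosts


# Five VMs whose total demand fits on two hosts; naive one-VM-per-host
# placement would keep five machines powered on.
print(consolidate([0.5, 0.4, 0.3, 0.6, 0.2]))
```

This is only a CPU-dimension sketch; real placement must also balance memory, network, and the migration cost of moving running VMs.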
