Energy Saving in Data Centers
Center for Information Technology and Algorithms, Department of Computer Science, University of Nevada, Las Vegas, NV 89154, USA
Received: 29 December 2017 / Accepted: 4 January 2018 / Published: 9 January 2018
Globally, CO2 emissions attributable to Information Technology are on par with those resulting from aviation. Recent growth in cloud service demand has elevated the energy efficiency of data centers to a critical area within green computing. Cloud computing represents a backbone of IT services, and a recent increase in high-definition multimedia delivery has placed new burdens on energy resources. Hardware innovations, together with energy-efficient techniques and algorithms, are key to controlling power usage in an ever-expanding IT landscape. This special issue contains a number of contributions showing that data center energy efficiency should be addressed from diverse vantage points.
sustainability; data centers; cloud computing; renewable energy; energy efficiency; green computing
1. Introduction
Sustainability, as defined by the National Science Board (NSB), involves meeting present needs without compromising the ability of future generations to meet their own needs. According to the NSB, sustainability cuts across disciplinary boundaries because it has many aspects that cannot be addressed by individual disciplines [1]. With the imperative to limit carbon emissions, energy efficiency has taken on renewed urgency. The chief executive officer of Google remarked, “What matters most to the computer designers at Google is not speed but power, low power, because data centers can consume as much energy as a city” [2]. Cloud computing now represents a backbone of IT services, and the recent increase in high-definition multimedia delivery has placed new burdens on energy resources. These energy demands have motivated research on energy efficiency strategies and on the adoption of renewable energy sources to reduce operational costs and environmental impact. The ever-increasing scale of data centers has presented unprecedented challenges for hardware and algorithmic development.
This special issue aimed to present a wide range of papers with contributions in the area of green cloud computing. Contributions bringing together hardware challenges and algorithmic approaches were especially encouraged and we were interested in submissions from researchers who work at the intersection of Electrical Engineering, Modeling and Simulation, and Computer Science.
2. Putting Energy Usage of Data Centers in Perspective
According to the 2016 United States Data Center Energy Usage Report [3] (published by the U.S. Department of Energy under a Lawrence Berkeley National Laboratory contract), data centers in the U.S. consumed an estimated 70 billion kWh in 2014, the year studied in the report. This represents about 1.8% of total U.S. electricity consumption. Globally, CO2 emissions attributable to IT are on par with those of aviation, at 70.9 MMTCO2 in 2011 [4]. It is noteworthy that energy efficiency measures have already stemmed energy demand increases despite the tremendous growth in big data: data centers' electricity consumption increased by about 4% from 2010 to 2014, less than the roughly 24% increase over the previous five years and much less than the 90% increase over the five years before that. Indeed, current growth is mainly attributable to very large “hyper-scale” data centers used for large cloud facilities. Energy-efficient techniques and algorithms are key to controlling power usage in an ever-expanding IT landscape.
Efficiency gains have been achieved across the gamut, from improved hardware, server power scaling, refined operating systems, and efficient use of virtualization to geographical workload distribution, advanced resource management, and the preferential use of renewable energy. Areas of interest include data center sustainability, dependability, and cost; energy reduction in server systems; energy-aware resource allocation; energy-optimized multimedia clouds; thermal states and cooling system operation; power-down management; burdened energy management; dynamic voltage and frequency scaling; dynamic component deactivation; power load balancing for cluster-based systems; power management in virtual cloud environments; geographic cloud server location design; case studies and simulations for data center energy problems; approximation algorithms and heuristics for data center energy problems; and the integration of renewable energy into cloud computing.
3. Contributions to the Special Issue
Worldwide, there is a push to move away from traditional fossil fuel and nuclear energy sources in favor of renewables. However, electrical energy produced by both wind and solar generation is highly weather-dependent, and weather cannot be controlled. In their contribution “Scheduling Energy Efficient Data Centers Using Renewable Energy,” Iturriaga et al. [6] describe a multi-objective approach for scheduling energy consumption in data centers that considers both traditional and green energy sources. Two multi-objective evolutionary algorithms combined with a greedy heuristic are proposed, enhanced by applying simulated annealing for post-hoc optimization.
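The general idea of refining an energy-aware schedule with simulated annealing can be sketched as follows. This is a generic illustration under invented assumptions, not the authors' algorithm: jobs with hypothetical energy demands are placed into time slots, each slot offers a limited amount of renewable ("green") energy, and any excess draws "brown" grid energy. A weighted sum scalarizes the two objectives (brown energy used and a crude completion-time proxy), whereas the paper uses multi-objective evolutionary algorithms.

```python
import math
import random

# Hypothetical instance (not from the paper).
random.seed(1)
jobs = [random.randint(1, 10) for _ in range(20)]      # energy demand per job
slots = 8
green = [random.randint(5, 15) for _ in range(slots)]  # green supply per slot

def brown_energy(assign):
    """Grid energy needed once each slot's green supply is exhausted."""
    load = [0] * slots
    for job, slot in zip(jobs, assign):
        load[slot] += job
    return sum(max(0, load[t] - green[t]) for t in range(slots))

def makespan(assign):
    """Index of the last used slot: a crude completion-time proxy."""
    return max(assign) + 1

def cost(assign, w=0.5):
    # Weighted-sum scalarization of the two objectives.
    return w * brown_energy(assign) + (1 - w) * makespan(assign)

def simulated_annealing(assign, temp=10.0, cooling=0.995, steps=5000):
    """Post-hoc refinement: move one job to a random slot per step,
    accepting worse moves with probability exp(-delta / temp)."""
    cur, best = list(assign), list(assign)
    for _ in range(steps):
        cand = list(cur)
        cand[random.randrange(len(jobs))] = random.randrange(slots)
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur = cand
            if cost(cur) < cost(best):
                best = list(cur)
        temp *= cooling
    return best

start = [random.randrange(slots) for _ in jobs]
refined = simulated_annealing(start)
print(cost(start), "->", cost(refined))
```

The acceptance of occasional uphill moves is what lets the refinement escape local optima left behind by the initial (e.g., greedy or evolutionary) schedule.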
In “Virtual Machine Replication on Achieving Energy-Efficiency in a Cloud,” Mondal et al. [7] take on the issue of virtualization: growth in cloud service demand has led to the establishment of large-scale virtualized data centers in which virtual machines handle user requests for service. To mitigate failures, replication techniques are used. The trade-offs between job completion time and energy consumption under different replication schemes are characterized through comprehensive analytical models, which capture virtual machine state transitions and the associated power consumption patterns.
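The core trade-off can be illustrated with a toy Monte Carlo experiment (this is not the paper's analytical model; the power figure and runtime distribution are invented): running r replicas of a job finishes when the fastest replica finishes, so expected completion time falls with r, while energy grows because every replica consumes power.

```python
import random

random.seed(7)
POWER_W = 200.0  # assumed average power draw per VM, in watts

def trial(replicas):
    # Hypothetical runtimes: exponential with a 100 s mean, modeling the
    # stragglers and failures that replication is meant to hide.
    runtimes = [random.expovariate(1 / 100.0) for _ in range(replicas)]
    completion = min(runtimes)        # first replica to finish wins
    energy = POWER_W * sum(runtimes)  # every replica consumes energy
    return completion, energy

def average(replicas, n=20000):
    samples = [trial(replicas) for _ in range(n)]
    t = sum(s[0] for s in samples) / n
    e = sum(s[1] for s in samples) / n
    return t, e

for r in (1, 2, 4):
    t, e = average(r)
    print(f"replicas={r}: mean completion {t:6.1f} s, mean energy {e/1000:6.1f} kJ")
```

Under these assumptions, mean completion time shrinks roughly as 1/r while mean energy grows roughly linearly in r, which is exactly the tension the paper's models quantify.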
Ho et al. [8] address the issue of energy consumption from an application-level perspective in their contribution “Characterizing Energy per Job in Cloud Applications.” They propose analytical models to assess energy consumption and to derive the energy per job for different data center configurations. The focus is on evaluating the efficiency of applications in terms of performance and energy consumed per job, in particular when resources are shared and the hosts running the virtual machines have heterogeneous energy profiles, with the aim of identifying the best combinations in the use of resources.
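A minimal sketch of an energy-per-job calculation over heterogeneous hosts, assuming the common linear power model P = P_idle + (P_peak - P_idle) * utilization; the host names, power figures, and throughputs below are invented for illustration and are not taken from the paper.

```python
hosts = [
    # (name, idle W, peak W, utilization, jobs completed per hour)
    ("old-server", 120.0, 300.0, 0.6, 900),
    ("new-server", 60.0, 250.0, 0.6, 1500),
]

def power(idle, peak, util):
    """Linear power model: idle draw plus utilization-scaled dynamic draw."""
    return idle + (peak - idle) * util

def energy_per_job(idle, peak, util, jobs_per_hour):
    """Wh consumed per completed job: power (W) over one hour / jobs."""
    return power(idle, peak, util) / jobs_per_hour

for name, idle, peak, util, jph in hosts:
    print(f"{name}: {energy_per_job(idle, peak, util, jph):.3f} Wh/job")
```

Even this crude model shows why host heterogeneity matters: at equal utilization, a host with a lower idle floor and higher throughput can cut the energy per job substantially, which is the kind of comparison the paper's models make rigorous.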
The article “Energy Aware Pricing in a Three-Tiered Cloud Service Market,” by Paul et al. [9], discusses energy-efficient pricing strategies in the cloud service market and describes a novel theoretical framework for implementing sustainable pricing policies along with the corresponding optimal resource provisioning policies. The authors provide a performance evaluation using real data sets covering electricity prices, renewable generation, workload service requests, and operational details of the data centers.
The articles selected for this special issue show that data center energy efficiency has to be addressed from various vantage points and that models need to be drawn from diverse areas. Given the continued impact of cloud computing globally, the topics discussed in this special issue will only gain in importance.