Google Leverages its AI Tools to Slash Data Center Energy Consumption
Google, along with other tech companies, has been busy open sourcing its machine learning and artificial intelligence tools. Now, the company has shown that one of its own AI tools is powerful enough to suggest new business models for cutting data center costs.
Google has been using machine learning in its data centers for several years, looking for ways to reduce costs. And, of course, data centers are a huge cost center at Google. Recently, some of the more specialized AI tools from DeepMind tackled the problem of cooling server farms. Now, Google has announced that newly identified solutions have cut the energy needed for cooling by a whopping 40 percent.
Data centers are huge electricity consumers, and anything that can reduce their appetites for energy represents a possible powerful new business model. "We are already talking to non-Google partners about using the same algorithm," Google officials told BBC News.
"We are planning to roll out this system more broadly and will share how we did it in an upcoming publication, so that other data centre and industrial system operators -- and ultimately the environment -- can benefit from this major step forward," noted Google engineers. A whitepaper is reportedly due within four to six weeks.
According to Rich Evans, a research engineer at DeepMind, and Jim Gao, a data centre engineer at Google:
"We are excited to share that by applying DeepMind’s machine learning to our own Google data centres, we’ve managed to reduce the amount of energy we use for cooling by up to 40 percent. In any large scale energy-consuming environment, this would be a huge improvement. Given how sophisticated Google’s data centres are already, it’s a phenomenal step forward.
The implications are significant for Google’s data centres, given its potential to greatly improve energy efficiency and reduce emissions overall. This will also help other companies who run on Google’s cloud to improve their own energy efficiency. While Google is only one of many data centre operators in the world, many are not powered by renewable energy as we are. Every improvement in data centre efficiency reduces total emissions into our environment and with technology like DeepMind’s, we can use machine learning to consume less energy and help address one of the biggest challenges of all -- climate change.
Using a system of neural networks trained on different operating scenarios and parameters within our data centres, we created a more efficient and adaptive framework to understand data centre dynamics and optimize efficiency.
We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data centre -- data such as temperatures, power, pump speeds, setpoints, etc. -- and using it to train an ensemble of deep neural networks.
Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints."
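The quoted pipeline has two parts: an ensemble of neural networks trained on historical sensor readings to predict PUE, and separate predictive models used to check that a recommended action stays within operating constraints. The following is a minimal sketch of that idea, assuming tiny two-layer networks and fully synthetic data; the sensor features, targets, architecture, hyperparameters, and the 27 °C temperature limit are all invented for illustration and are not Google's published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor features: temperature, power, pump speed, setpoint.
n = 500
X = rng.normal(size=(n, 4))
# Hypothetical PUE target (building energy / IT energy, so >= 1).
pue = 1.1 + 0.05 * X[:, 0] - 0.03 * X[:, 2] + rng.normal(scale=0.01, size=n)

def train_net(X, y, hidden=8, epochs=500, lr=0.1, seed=0):
    """Train one tiny tanh network with full-batch gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    w2 = r.normal(scale=0.5, size=hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        err = (h @ w2 + b2) - y
        # Backpropagate mean-squared-error gradients.
        dh = np.outer(err, w2) * (1 - h**2)
        w2 -= lr * (h.T @ err) / len(y)
        b2 -= lr * err.mean()
        W1 -= lr * (X.T @ dh) / len(y)
        b1 -= lr * dh.mean(axis=0)
    return W1, b1, w2, b2

def predict(params, X):
    W1, b1, w2, b2 = params
    return np.tanh(X @ W1 + b1) @ w2 + b2

# Ensemble: several nets with different seeds; average their predictions.
ensemble = [train_net(X, pue, seed=s) for s in range(5)]
pred = np.mean([predict(p, X) for p in ensemble], axis=0)
rmse = np.sqrt(np.mean((pred - pue) ** 2))
print(f"ensemble RMSE on PUE: {rmse:.4f}")

# Constraint gate: a separate model predicts next-hour temperature, and a
# PUE-lowering recommendation is accepted only if the predicted temperature
# stays within operating limits. Target is centered for training stability.
temp = 24.0 + 1.5 * X[:, 3] + rng.normal(scale=0.1, size=n)
temp_model = train_net(X, temp - temp.mean(), seed=42)
candidate = X[0].copy()
candidate[3] -= 0.5  # hypothetical lower cooling setpoint
predicted_temp = temp.mean() + predict(temp_model, candidate[None, :])[0]
safe = predicted_temp <= 27.0  # hypothetical operating limit, degrees C
print(f"predicted temp {predicted_temp:.1f} C, accept action: {safe}")
```

Averaging independently seeded networks is a standard way to build an ensemble; the key design point from the quote is that the optimization model (PUE) never acts directly -- its recommendations are first simulated against the temperature and pressure predictors.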