Google recently put its DeepMind AI in charge of a few of its data centres, and it ended up saving money on its electricity bills. In 2016, DeepMind jointly developed an AI-powered recommendation system to improve the energy efficiency of Google's already highly optimized data centres. Because the system performs better as more data is gathered, the roughly 30 percent improvement tallied so far could continue to climb. “Our optimisation boundaries will also be expanded as the technology matures, for even greater reductions,” said DeepMind. Reinforcement learning (RL) excels in large, complex decision spaces where constraints and payoffs can be modelled, but where the only way to create and improve analytical solutions is to interact with the space itself. The approach is not limited to data centres: using a neural network trained on widely available weather forecasts and historical turbine data, we configured the DeepMind system to predict wind power output 36 hours ahead of actual generation.
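To give a sense of what such a forecasting model can look like in code, here is a minimal sketch using scikit-learn. The features, the synthetic turbine data, and the small feed-forward network are hypothetical stand-ins for the weather-forecast and historical-turbine inputs the article mentions; DeepMind's actual models are not described at this level of detail.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical historical dataset: each row pairs a weather forecast
# (wind speed, wind direction, air pressure) with the turbine output
# actually observed 36 hours later.
n_samples = 2000
wind_speed = rng.uniform(0, 25, n_samples)        # m/s
wind_direction = rng.uniform(0, 360, n_samples)   # degrees
pressure = rng.uniform(980, 1040, n_samples)      # hPa
X = np.column_stack([wind_speed, wind_direction, pressure])

# Toy stand-in for historical turbine measurements: power rises roughly
# with the cube of wind speed up to the turbine's rated speed, plus noise.
y_power_36h_ahead = 0.5 * np.clip(wind_speed, 0, 15) ** 3 + rng.normal(0, 50, n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y_power_36h_ahead, random_state=0
)

# A small feed-forward neural network as the function approximator,
# with feature scaling so the optimizer behaves well.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print(f"R^2 on held-out forecasts: {model.score(X_test, y_test):.2f}")
```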

In any large-scale energy-consuming environment, this would be a huge improvement. Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of total building energy usage to IT energy usage. It is time that AI (with all its Hollywood connotations) be seen in a new light: as a substantial technology that is fast living up to its efficiency potential. For more information on the role of AI-enabled software in energy storage, see the 2Q 2019 Navigant Research report.
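As a concrete illustration of that training target, the sketch below computes PUE from metered energy readings and averages it over a short future window. The function names and the example figures are hypothetical; they are only meant to show how the ratio and an "average future PUE" objective could be formed.

```python
def pue(total_building_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.
    A PUE of 1.0 would mean zero overhead from cooling, lighting, etc."""
    return total_building_kwh / it_equipment_kwh

def average_future_pue(total_kwh_series, it_kwh_series, horizon: int) -> float:
    """Average PUE over the next `horizon` readings -- the kind of target
    a neural network like the one described above could be trained to predict."""
    future = [
        pue(total, it)
        for total, it in zip(total_kwh_series[:horizon], it_kwh_series[:horizon])
    ]
    return sum(future) / len(future)

# Hypothetical hourly readings (kWh): whole building vs. IT equipment only.
total_kwh = [1200, 1180, 1250, 1220]
it_kwh    = [1000, 1005,  990, 1010]

print(round(pue(total_kwh[0], it_kwh[0]), 3))              # PUE for the first hour: 1.2
print(round(average_future_pue(total_kwh, it_kwh, 4), 3))  # average over a 4-hour window
```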

For example, RL is performing similar feats in power generation. RL is one of several AI techniques creating major efficiencies across the value chain, and those efficiencies will grow as software solutions become more sophisticated and widely adopted. AI's much-vaunted potential is being realised in the energy sector: it can help us tackle some of the world's most challenging physical problems, such as energy consumption, and in some industries it is certainly living up to that potential.

Large-scale commercial and industrial systems like data centres consume a lot of energy, and one of the primary sources of energy use in the data centre environment is cooling. Within the data centres, energy efficiency is measured with a standard metric called PUE (Power Usage Effectiveness), which captures how much energy the auxiliary equipment (cooling, power distribution and so on) absorbs on top of what the IT equipment itself uses, expressed as the ratio of total facility energy to IT energy.

The approach took the historical operating data already collected by sensors within the data centre and used it to train an ensemble of deep neural networks. For each potential action – and there are billions – the AI agent calculates its confidence that it will be a good one. Another safeguard is two-layer verification: optimal actions computed by the AI are vetted against an internal list of safety constraints defined by a data centre's operators, and those actions are then sent back to the data centre, where they are verified by the local control system before being implemented.

A typical day of testing – with the machine learning recommendations turned on and then off – showed that our machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE overhead after accounting for electrical losses and other non-cooling inefficiencies.

“The idea evolved out of feedback from our data centre operators, who had been using our AI recommendation system,” said a statement from the company. “They told us that although the system had taught them some new best practices – such as spreading the cooling load across more, rather than less, equipment – implementing the recommendations required too much operator effort and supervision. Naturally, they wanted to know whether we could achieve similar energy savings without manual implementation.” Google data centre operator Dan Fuenffinger explained, “We wanted to achieve energy savings with less operator overhead.” Having learned from the data centres that run Gmail, YouTube and all of Google's services, the system was able to improve their efficiency.

Google and DeepMind have also brought machine learning to wind farms, where Google states that early tests have increased wind power efficiency by 20 percent. The effort is merely the latest in a series of steps the company has taken toward sustainability in its data centres, and it will also help other companies who run on Google's cloud to operate more efficiently. Demis Hassabis, who leads around 200 computer scientists and neuroscientists at DeepMind, has stated that they are aiming for "solving intelligence, and then using that to solve everything else."
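To make the safety step concrete, here is a minimal sketch of how such a constraint-vetting layer might look in Python. Everything in it (the Action and SafetyConstraint types, the example setpoints and limits, and the stub that hands off to the local control system) is a hypothetical illustration rather than Google's actual implementation; it only shows the idea of rejecting AI-recommended actions that violate operator-defined constraints before the local control system performs its own checks.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical representation of one AI-recommended control action,
# e.g. a new setpoint for a chilled-water loop or a cooling fan.
@dataclass
class Action:
    component: str
    setpoint: float

# Layer 1: operator-defined safety constraints. Each constraint is a
# predicate that must hold for an action to be accepted.
@dataclass
class SafetyConstraint:
    description: str
    is_satisfied: Callable[[Action], bool]

OPERATOR_CONSTRAINTS = [
    SafetyConstraint(
        "Chilled-water setpoint must stay within 6-14 degrees C",
        lambda a: a.component != "chilled_water" or 6.0 <= a.setpoint <= 14.0,
    ),
    SafetyConstraint(
        "Fan speed must not exceed 90% of rated maximum",
        lambda a: a.component != "cooling_fan" or a.setpoint <= 0.9,
    ),
]

def vet_action(action: Action) -> bool:
    """Layer 1: reject any action that violates an operator constraint."""
    return all(c.is_satisfied(action) for c in OPERATOR_CONSTRAINTS)

def send_to_local_control(action: Action) -> None:
    """Layer 2 (stub): the data centre's local control system performs
    its own verification before the action is physically implemented."""
    print(f"Forwarding {action} to local control system for verification")

if __name__ == "__main__":
    recommended = [
        Action("chilled_water", 12.5),  # passes both example constraints
        Action("cooling_fan", 0.97),    # rejected: exceeds the 90% fan limit
    ]
    for action in recommended:
        if vet_action(action):
            send_to_local_control(action)
        else:
            print(f"Rejected {action}: failed operator safety constraints")
```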

At DeepMind and Google, we believe that if we can use AI as a tool to discover new knowledge, solutions will be easier to reach.

Speaking in March 2016, Joe Kava, who leads Google's data centre operations, said the key tenets of Google's culture were "ownership, sustainability, and innovation," also noting that the company had invested $2 billion in renewable energy. With DeepMind, Google has trained an AI that cuts the energy its data centres use for cooling by 40 percent.