AI sucks electricity from the grid. Experts warn of a worst-case scenario


AI development already consumes as much electricity as a small country, energy industry specialists warn. The surge in interest in and demand for artificial intelligence may be only the beginning of bigger problems.

AI tools are sprouting like mushrooms after rain, and with them the power consumed by artificial intelligence keeps climbing. The energy situation may already be alarming, and within just a few years it could deteriorate significantly.

AI consumes as much energy as a small country

The latest study comes from the French energy company Schneider Electric. According to its report, work on artificial intelligence currently draws an estimated 4.3 GW of power worldwide, on par with the electricity demand of some small countries.

However, this is just the beginning, because the technology is rapidly gaining popularity, and as adoption grows, so does the demand for electricity. The French analysts estimate that by 2028 AI's power demand will rise to 13.5-20 GW, a compound annual growth rate of 26-36 percent.
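As a rough sanity check (assuming the 4.3 GW figure is a 2023 baseline and the growth compounds over five years to 2028), the projected range follows directly from the stated growth rates:

```python
# Rough check of the projected 2028 range from the stated CAGR.
# Assumption: 4.3 GW is a 2023 baseline compounding for 5 years.
baseline_gw = 4.3
years = 5

low = baseline_gw * 1.26 ** years   # 26 percent annual growth
high = baseline_gw * 1.36 ** years  # 36 percent annual growth

print(f"2028 projection: {low:.1f}-{high:.1f} GW")  # roughly 13.7-20.0 GW
```

The result closely matches the 13.5-20 GW range cited in the report.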

How the market develops also matters. The two basic tasks in creating AI are model training and inference. Currently, energy consumption splits between them at roughly 20:80.

Huge amounts of electricity are used during training, because artificial intelligence learns by being fed millions of data samples, which are processed by so-called accelerators (e.g. graphics cards). Training can take anywhere from a few hours to a few months, but for a given model it is largely a one-off process.

Inference takes place every time a user interacts with AI, for example by asking a question in a chat or requesting a generated image. It is less energy-intensive per task, but it runs continuously and will keep growing as these tools gain users over the years. By 2028, the split above may shift to 15:85 in favor of inference.

AI is a big problem for data centers

Currently, tasks related to artificial intelligence account for approximately 8 percent of power consumption across data centers, which draw as much as 54 GW in total. In just over four years, however, total data center demand is expected to rise to as much as 90 GW, of which artificial intelligence systems will consume 15-20 percent.
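These shares line up with the earlier figures (assuming the 8 percent and 54 GW values describe the same present-day baseline as the 4.3 GW estimate):

```python
# Cross-check: AI's share of total data center power, now and in 2028.
total_now_gw = 54     # total data center demand today
ai_share_now = 0.08   # AI's share, about 8 percent
print(f"AI today: {total_now_gw * ai_share_now:.1f} GW")  # ~4.3 GW, matching the earlier estimate

total_2028_gw = 90          # projected total demand in 2028
for share in (0.15, 0.20):  # projected AI share: 15-20 percent
    print(f"AI in 2028 at {share:.0%}: {total_2028_gw * share:.1f} GW")  # 13.5 and 18.0 GW
```

The 2028 shares imply 13.5-18 GW for AI, consistent with the 13.5-20 GW projection above.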

Another issue is data center cooling. Temperature requirements further increase electricity consumption and also draw on natural resources, mainly water, since liquid cooling is more effective than traditional air cooling. Large AI clusters simply cannot be cooled with air alone; they run too hot.

Data center operators are already criticized for their heavy consumption of natural resources, and the growth of AI may quickly deepen today's problems, Schneider Electric warns. To escape this tightening loop, the specialists point to two key areas: significant modernization of the current infrastructure and a general improvement in the operational efficiency of data centers.
