Training and running AI already consumes as much electricity as a small country, energy industry experts warn. The surge of interest in and demand for artificial intelligence may be only the start of bigger problems.
AI tools are springing up like mushrooms after rain, and with them the power consumption of artificial intelligence keeps climbing. The energy situation may already be alarming, and within just a few years it could worsen significantly.
AI consumes as much energy as a small country
The latest study comes from the French energy company Schneider Electric. Its experts estimate that artificial intelligence workloads currently draw approximately 4.3 GW of power worldwide, comparable to the total consumption of a small country.
However, this is only the beginning, because the technology is rapidly gaining popularity, and as demand for AI grows, so does its demand for electricity. According to the French experts, by 2028 AI's electricity demand will rise to 13.5-20 GW, a compound annual growth rate of 26-36 percent.
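The growth rate quoted above can be sanity-checked against the raw figures. The sketch below is a minimal check, assuming roughly 4.3 GW today and a five-year horizon to the 13.5-20 GW projection; the exact base year is not stated in the article.

```python
# Rough sanity check of the article's figures (assumed values from the text):
# ~4.3 GW today, projected 13.5-20 GW by 2028 (taken here as five years out).
def cagr(start, end, years):
    """Compound annual growth rate, returned as a fraction."""
    return (end / start) ** (1 / years) - 1

low = cagr(4.3, 13.5, 5)   # lower bound of the projection
high = cagr(4.3, 20.0, 5)  # upper bound of the projection

print(f"{low:.0%} - {high:.0%}")  # prints "26% - 36%", matching the article
```

Run against those assumed endpoints, the arithmetic reproduces the 26-36 percent range the experts cite.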
How the market evolves is also critical. The two main AI workloads are model training and inference; currently, the power consumption ratio between them is roughly 20:80.
Training consumes a huge amount of electricity: the model learns by being fed millions of data samples, which are then processed by so-called accelerators (for example, graphics cards). Training can take anywhere from a few hours to several months, but it is essentially a one-off process.
Inference happens every time a user interacts with the AI, for example by asking a question in a chat or requesting an image. It is a less energy-intensive process, but it runs continuously and will keep growing as these tools gain popularity over the years. As early as 2028, the 20:80 split may shift to 15:85 in favor of inference.
AI is a big challenge for data centers
Currently, tasks related to artificial intelligence account for approximately 8 percent of energy consumption in an average data center, and data centers worldwide draw a whopping 54 GW in total. In just four years, however, that total is expected to grow to 90 GW, of which 15-20 percent will be consumed by artificial intelligence systems.
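The two projections in the article can be cross-checked against each other. The sketch below assumes the figures as given: a 90 GW total data center draw in four years, with AI taking a 15-20 percent share.

```python
# Cross-check (figures assumed from the text): if data centers draw 90 GW
# and AI accounts for 15-20 percent of that, the implied AI load should
# roughly line up with the separate 13.5-20 GW projection for 2028.
total_gw = 90.0
low_share, high_share = 0.15, 0.20

ai_low = total_gw * low_share    # 13.5 GW
ai_high = total_gw * high_share  # 18.0 GW

print(f"Implied AI load: {ai_low:.1f}-{ai_high:.1f} GW")
```

The implied 13.5-18 GW sits inside the 13.5-20 GW range quoted earlier, so the two sets of figures are broadly consistent.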
Another problem is data center cooling. Temperature requirements further increase energy consumption and also strain natural resources: liquid (water) cooling is increasingly used because it is more effective than classic air cooling. Large AI clusters simply cannot be cooled with air; they would run too hot.
Data center operators are already being criticized for their heavy consumption of natural resources, and advances in artificial intelligence could quickly make today's problems worse, warns Schneider Electric. To escape the slowly tightening loop, the experts point to two key directions: significant modernization of existing infrastructure and an overall increase in the efficiency of the centers.
Source: Wprost
I am George Brown, author at Daily News Hack. I mostly cover economy news.

