The development of artificial intelligence is driving huge electricity demand, with buyers worldwide competing for Chinese transformers.


Mars Financial News, April 13 — Transformer manufacturers across China have recently been operating at full capacity, with overseas orders pouring in; some companies' production schedules are booked through 2027. Data show that the power load of a single super-large AI data center now exceeds 1 gigawatt (1 billion watts), equivalent to the peak summer electricity demand of a medium-sized city. Moreover, large AI models are accelerating from the "training" phase into the "inference" phase, which means their electricity consumption is shifting from a one-time investment to continuous consumption. Professor Ding Zhaohui of the School of Electrical and Electronic Engineering at North China Electric Power University explained: "In the past, training a model was a one-time process. Now, industries across the board are using large models, so electricity consumption naturally rises. Energy use in the 'inference' phase of large models is becoming increasingly significant, and the electricity demand of AI data centers keeps growing." (Economic Daily)
