Last week, the nuclear power plant backed by a $1 billion investment from Bill Gates broke ground in Wyoming. On the 16th local time, he said he would invest further: "I have invested over a billion, and I will invest tens of billions more."
Bill Gates keeps investing in this nuclear plant largely because of the immense electricity demand that the development of artificial intelligence (AI) will bring. "The data centers we are going to build will increase the electrical load by up to 10%," he said. The rise of electric vehicles and home heating devices such as heat pumps has already pushed up electricity demand in the United States; in his words, the emergence of data centers is "adding insult to injury." Large technology companies are therefore researching how they can help supply more electricity to meet the explosive growth of AI demand.
According to data from the International Energy Agency (IEA), a single query to OpenAI's chatbot ChatGPT consumes 2.9 watt-hours, while a Google search requires only 0.3 watt-hours, about one tenth as much. By 2026, electricity consumption related to data centers, cryptocurrency, and AI could rise to between 620 and 1,050 terawatt-hours. For comparison, Germany's total electricity consumption in 2023 was 465 terawatt-hours, and Japan's total electricity demand was 870 terawatt-hours.
Xiao Fusheng, a partner at EY's Strategy and Transactions advisory practice, told First Financial Daily (Yicai) reporters that the large amount of electricity required by expanding AI computing power is an unavoidable issue at present, but not an unsolvable one. Going forward, progress can be made through technological breakthroughs, resource sharing, and the use of new energy sources.
Western Governments Feel the Pressure of AI Electricity Demand
According to IEA data, if all 9 billion searches on Google per day were transferred to ChatGPT, the annual electricity demand would increase by 10 terawatt-hours, equivalent to the electricity consumption of about 1.5 million EU residents.
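The IEA scenario above can be sanity-checked with a few lines of arithmetic, using only figures quoted in this article (2.9 Wh per ChatGPT query, 0.3 Wh per Google search, 9 billion searches per day):

```python
# Sanity check of the IEA scenario: every Google search replaced by a ChatGPT query.
WH_PER_CHATGPT_QUERY = 2.9   # watt-hours per ChatGPT query (IEA figure)
WH_PER_GOOGLE_SEARCH = 0.3   # watt-hours per Google search (IEA figure)
SEARCHES_PER_DAY = 9e9       # daily Google searches in the IEA scenario

# Extra energy per day if each search cost a ChatGPT query instead
extra_wh_per_day = SEARCHES_PER_DAY * (WH_PER_CHATGPT_QUERY - WH_PER_GOOGLE_SEARCH)

# Scale to a year and convert watt-hours to terawatt-hours (1 TWh = 1e12 Wh)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12

print(f"Energy ratio: a search uses ~1/{WH_PER_CHATGPT_QUERY / WH_PER_GOOGLE_SEARCH:.0f} "
      f"of a ChatGPT query")
print(f"Extra annual demand: {extra_twh_per_year:.1f} TWh")
```

The back-of-the-envelope result, roughly 8.5 TWh per year, lands in the same ballpark as the IEA's 10 TWh estimate; the gap presumably reflects rounding in the per-query figures.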
With the proliferation of AI infrastructure such as data centers, AI-related electricity demand will grow rapidly. According to research by investment bank Goldman Sachs, global data centers currently account for 1% to 2% of total electricity consumption, but by 2030 this share may rise to 3% to 4%, roughly double today's level.
Goldman Sachs' report states that in Europe, by 2030, the electricity demand of the region's data centers will be equivalent to the current total consumption of Portugal, Greece, and the Netherlands. An analysis by the Electric Power Research Institute (EPRI) in the United States found that by 2030, data centers will consume 4.6% to 9.1% of the electricity in the United States.
Earlier this month, the Energy, Climate, and Grid Security Subcommittee of the U.S. House Energy and Commerce Committee held a hearing titled "Powering AI." According to the minutes, the committee believes the issue should be treated as a matter of national security. The current U.S. power grid faces significant challenges in meeting growing demand, and its reliability and stability are a concern; the United States urgently needs strategic energy planning. Intermittent renewable energy alone cannot meet the high reliability requirements of data centers and AI technology, so attention must be paid to a diversified generation mix, including energy storage and stable, dispatchable generation resources. In addition, the United States needs to invest heavily in building and upgrading energy infrastructure to support the growing demands of AI and data centers.
In Europe, out of concern that the huge energy consumption of data centers will put too much pressure on national climate goals and the power grid, some countries have begun imposing more requirements on local data center construction. Among them, Germany passed a new Energy Efficiency Act last October that specifically regulates data centers' power usage efficiency, renewable energy supply, and use of waste heat. For instance, from 2024, 50% of their electricity consumption must be supplied by renewable sources, and from 2027 this ratio rises to 100%.
Last year, Equinix, one of the world's largest digital infrastructure companies, data center operator Vantage Data Centers, and edge connectivity company EdgeConneX all had planning permission for new data centers in Dublin rejected by Irish authorities. In fact, since 2022 the Irish grid operator EirGrid has said that, owing to a sharp rise in electricity demand, it will restrict the construction of new data centers, apply stricter approval procedures to new projects, and prioritize those with a smaller impact on the grid.
Local governments in and around Amsterdam, the capital of the Netherlands, have also suspended the issuance of construction permits for new data centers to alleviate the burden on the power grid and manage energy demand.
Tech Giants: Self-sufficiency in Energy Production
Tech giants like Microsoft, Google, and Amazon are at the forefront of AI development, and their cloud computing business units are supported by a global network of data centers, which also implies significant energy consumption.
Recently, U.S. Secretary of Energy Jennifer Granholm revealed that the Biden administration is in talks with major tech companies, asking them to invest in climate-friendly electricity production to meet their growing demands.
"If tech companies want to draw clean electricity from the grid, they should produce their own power," Granholm told the media, "We have been discussing with data companies. Large data companies have committed to achieving net-zero emissions and want a stable power supply generated by clean energy."
Xiao Fusheng stated that to meet the electricity demand and energy-saving pressures brought by AI, in terms of new energy utilization, clean energy sources such as solar and wind power can be used to power data centers, reducing dependence on fossil fuels and lowering carbon emissions.
Tech giants have also begun to position themselves. According to Amazon's official website, the company has more than 500 solar and wind projects worldwide, more than 100 of them added last year alone, making it the world's largest corporate buyer of renewable energy for the fourth consecutive year. Its portfolio is now sufficient to power 7.2 million American households annually. The company says that by 2025, all of its global electricity consumption, including that of Amazon Web Services data centers, will be powered by renewable energy.
Microsoft, on the other hand, is betting on nuclear energy. In May last year, the company signed a power purchase agreement with Helion, a U.S. private company focused on developing nuclear fusion technology. In January this year, Microsoft also hired a nuclear technology director responsible for developing atomic reactors to power its data centers. Microsoft's goal is that by 2025, all of its data centers and facilities will be 100% powered by new renewable energy generation that matches their annual electricity consumption.

Alphabet, the parent company of Google, announced its 24/7 Carbon-Free Energy (CFE) goal in 2020, aiming to power all of its business operations with carbon-free energy by 2030. Last year, Google opened its first data center in Inzai City, Chiba Prefecture, Japan, and in May of this year, the company announced two new solar power purchase agreements (PPAs) in Japan, supporting the construction of new solar projects and adding 60 megawatts of clean energy capacity to the Japanese power grid.
In addition to tackling the issue at the source of power generation, Xiao Fusheng told Yicai reporters that technology companies can also attempt to share resources needed for AI development. For instance, by maximizing the use of existing graphics processing units (GPUs) through a leasing model, companies can alleviate the shortage of computing power while reducing initial investment costs. If feasible, cloud computing platforms could also be utilized to provide AI computing power, reducing the need for businesses and research institutions to deploy local infrastructure and achieving resource sharing.
Xiao Fusheng believes that technological breakthroughs in AI hardware and software are another direction for energy saving. "Focus on optimizing AI models, improving chip efficiency and algorithm efficiency, as well as advancing data center hardware and software technology to reduce energy consumption," he said. He gave examples: technology companies can develop chips with higher performance per watt to boost computing power while cutting energy use; optimize data centers through modular construction and stronger transmission capabilities; and improve data transmission efficiency by increasing the bandwidth between chips.
In this regard, AI chip developer Nvidia has stated that the training performance of its latest Blackwell GPU is four times that of the previous generation Hopper, with an energy efficiency 25 times greater.
However, Xiao Fusheng indicated that these solutions are not easy to implement. "Technological breakthroughs require substantial support in terms of human resources, materials, and capital. There are challenges to address, such as the commercialization and continuous performance improvement of new chips, the need to ensure the reliability and cost-effectiveness of computing power leasing services, the establishment of effective resource-sharing mechanisms, and the need to overcome the intermittency and instability of renewable energy sources. Additionally, there is a requirement to achieve intelligence and efficiency in energy management, while also considering compatibility and transition issues with existing energy infrastructure." But he believes that "the development of new technologies always requires a process."