AI data centers have become a new battlefield in the global technology race.
Giant companies, competing constantly for the title of “the biggest of all,” have found a new metric to compete on: the gigawatts of electricity they consume.
This phenomenon, which experts have dubbed “Bragawatts,” raises serious questions about the ability of regional and global power grids to handle the dramatic load that artificial intelligence imposes.
Because of electricity costs, one company is currently struggling to finance its flagship AI campus in Louisiana, USA, which consumes hundreds of megawatts of power.
Another company has even more ambitious plans for a center in Michigan, which may exceed one gigawatt as part of a larger complex intended to surpass 8 gigawatts.
Research departments at international banks are attempting to compile comprehensive data on the electricity usage of AI data centers.
Estimates suggest that the total planned capacity for AI data centers worldwide already stands at about 46 gigawatts.
This represents an enormous capital investment of about 2.5 trillion dollars to build them.
Their total electricity demand can be estimated at about 55.2 gigawatts, assuming a standard power usage effectiveness (PUE) ratio of 1.2.
For comparison, this amount of electricity could power about 44.2 million homes in the United States, three times the total number of homes in California.
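As a rough sanity check of these figures, here is a minimal sketch in Python; the 46 gigawatts and the PUE of 1.2 come from the estimates above, while the average household draw of about 1.25 kilowatts is an assumption implied by the 44.2-million-home comparison:

```python
# Rough sanity check of the figures quoted above.
PLANNED_CAPACITY_GW = 46.0   # planned AI data-center capacity (from the text)
PUE = 1.2                    # power usage effectiveness ratio (from the text)
AVG_HOME_DRAW_KW = 1.25      # assumed average US household draw, implied
                             # by the text's own 44.2-million-home figure

total_demand_gw = PLANNED_CAPACITY_GW * PUE               # 46 * 1.2 = 55.2 GW
homes_powered = total_demand_gw * 1e6 / AVG_HOME_DRAW_KW  # 1 GW = 1e6 kW

print(f"Total grid demand: {total_demand_gw:.1f} GW")          # 55.2 GW
print(f"Homes equivalent: {homes_powered / 1e6:.1f} million")  # ~44.2 million
```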
In some countries, preliminary data indicates a need for an additional 500 megawatts (half a gigawatt) in the coming years for AI server farms.
This represents an increase of about 3% in peak electricity consumption, equivalent to the additional demand of about half a million people.
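Taken together, those numbers imply a national peak of roughly 16.7 gigawatts and a per-capita draw of about 1 kilowatt; this minimal check derives both values purely from the 500 MW, 3%, and half-million-person figures above:

```python
# Back-of-the-envelope check of the figures above.
ADDED_DEMAND_MW = 500        # planned addition for AI server farms
SHARE_OF_PEAK = 0.03         # stated ~3% of peak consumption
PEOPLE_EQUIVALENT = 500_000  # stated population equivalent

implied_peak_gw = ADDED_DEMAND_MW / SHARE_OF_PEAK / 1e3    # ~16.7 GW peak
per_person_kw = ADDED_DEMAND_MW * 1e3 / PEOPLE_EQUIVALENT  # ~1.0 kW per person

print(f"Implied national peak: {implied_peak_gw:.1f} GW")
print(f"Implied draw per person: {per_person_kw:.1f} kW")
```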
Various authorities warn that the state, already racing to meet the constantly growing electricity demand of its population, is not prepared for the needs of AI, and they call for a national strategy to support the efficient establishment of these centers.
AI centers are not merely scaled-up versions of traditional cloud data centers.
While normal cloud servers operate at relatively low and variable loads, training AI models requires thousands of graphics processing units (GPUs) operating in almost perfect coordination.
The power profile of an AI facility can jump from 30% to 100% within milliseconds, and existing power grids were not designed to handle such fluctuations efficiently.
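A minimal sketch helps show why synchronized training produces such swings; all numbers here (GPU count, per-GPU draw, idle fraction) are illustrative assumptions chosen to match the 30%-to-100% range above, not measured values:

```python
# Illustrative model of a synchronized training load profile.
GPU_COUNT = 10_000   # assumed size of the training cluster
GPU_PEAK_KW = 1.0    # assumed per-GPU draw at full load
IDLE_FRACTION = 0.3  # assumed draw while GPUs wait on communication

def facility_power_mw(in_compute_phase: bool) -> float:
    """Total facility draw depending on the current training phase."""
    per_gpu_kw = GPU_PEAK_KW if in_compute_phase else GPU_PEAK_KW * IDLE_FRACTION
    return GPU_COUNT * per_gpu_kw / 1e3  # kW -> MW

# Because all GPUs enter and leave the compute phase together, the whole
# facility steps between these two levels essentially at once:
print(facility_power_mw(True))   # 10.0 MW at 100% load
print(facility_power_mw(False))  # 3.0 MW at 30% load
```

Unlike ordinary cloud traffic, where millions of uncorrelated requests average out into a smooth load, these synchronized steps hit the grid as a single coordinated swing.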
To meet the demand, companies are adopting several strategies.
Some use the existing grid, although this strategy can create a mismatch between electricity supply capabilities and actual demand.
Other companies plan grid expansions and upgrades, investing the necessary capital themselves and taking advantage of tax incentives for such investments.
Others make the construction of their facilities conditional on local government participation in the investment.
Studies show that in the medium and long term, existing power grids will not withstand the load, and that the only solution is to build dedicated power plants next to data centers to supply their electricity.
