Data Centers May Hit Size Limits
It sure is fun asking ChatGPT to do something, like “write a request for proposals to economic development agencies for the location of a new office tower in a downtown area.” What we don’t think about, however, is the physical infrastructure (data centers) needed to produce that answer, and the U.S. may be facing a cap on how much of that space can be built.
The data centers that make generative AI products like ChatGPT possible will soon reach size limits, according to Microsoft Azure Chief Technology Officer Mark Russinovich, “necessitating a new method of connecting multiple data centers together for future generations of the technology,” Semafor reported.
The challenge is that AI models need to be trained inside a single building where the processors, such as Nvidia’s H100s, can be connected so they act as one computer.
As the article explained: “As Microsoft and its rivals compete to build the world’s most powerful AI models, several factors, including America’s aging energy grid, will create a de facto cap on the size of a single data center, which soon could consume multiple gigawatts of power, equivalent to hundreds of thousands of homes.”
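The homes-to-gigawatts comparison can be sanity-checked with a quick calculation. The per-home figure below is an assumption on our part (roughly the U.S. residential average of about 1.2 kW of continuous draw), not a number from the article, and real comparisons vary with the per-home estimate used.

```python
# Back-of-the-envelope check of the "hundreds of thousands of homes" claim.
# Assumption (not from the article): an average U.S. home draws roughly
# 1.2 kW of continuous power (~10,500 kWh per year).
AVG_HOME_KW = 1.2

def homes_equivalent(datacenter_gw: float) -> int:
    """Number of average homes matching a data center's continuous draw."""
    return int(datacenter_gw * 1_000_000 / AVG_HOME_KW)

for gw in (1, 2, 5):
    print(f"{gw} GW is roughly {homes_equivalent(gw):,} homes")
```

At this assumed per-home figure, a single gigawatt already corresponds to several hundred thousand homes, which is why a multi-gigawatt site strains a regional grid.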
What is the solution? The article reported that “while connecting two data centers is challenging, some people believe training an AI model might one day be possible with smaller computers spread out all over the world.”