One of the lesser-discussed impacts of the AI push is the sheer amount of energy required to power the vast systematic infrastructure behind these expansive models.
According to reports, the training process for OpenAI’s GPT-4, which ran on around 25,000 NVIDIA A100 GPUs, required up to 62,000 megawatt-hours of electricity. That’s equivalent to the energy needs of 1,000 U.S. households for over five years.
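As a rough back-of-envelope check on that comparison (a sketch only, assuming an average of roughly 10,500 kWh of electricity per U.S. household per year, a figure not taken from the report itself):

```python
# Rough check of the household comparison.
# Assumption: ~10,500 kWh of electricity per U.S. household per year
# (approximate national average, not a figure from the cited report).
GPT4_TRAINING_MWH = 62_000        # reported upper estimate for GPT-4 training
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average U.S. household usage
HOUSEHOLDS = 1_000

training_kwh = GPT4_TRAINING_MWH * 1_000  # convert MWh to kWh
years = training_kwh / (HOUSEHOLDS * HOUSEHOLD_KWH_PER_YEAR)
print(f"{years:.1f} years of electricity for {HOUSEHOLDS:,} households")
# -> roughly 5.9 years, consistent with the "over five years" comparison
```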
And that’s just one project. Meta’s new AI supercluster will include 350,000 NVIDIA H100 GPUs, while X and Google, among various others, are also building massive hardware projects to power their own models.
It’s an enormous resource burden, which will require significant investment to facilitate.
And it will also have an environmental impact.
To provide some perspective on this, the team at Visual Capitalist has put together an overview of Microsoft’s growing electricity needs as it continues to work with OpenAI on its projects.