Altman was responding to a post on X (formerly Twitter), which read, "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models."

Altman replied, "Tens of millions of dollars well spent, you never know."
When a company trains an AI model to do things like recognize images or understand language, it uses massive datasets and powerful hardware such as GPUs (graphics processing units), TPUs (tensor processing units), or other high-performance chips. This process consumes a substantial amount of electricity; training large models like GPT can require hundreds of megawatt-hours.
AI hardware also generates significant heat during training. Cooling systems, such as air conditioning or liquid cooling, can consume nearly as much electricity as the computing itself.
For context, training GPT-3 reportedly used about 1,287 megawatt-hours (MWh) of electricity, roughly enough to power 120 US homes for a year.
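A quick back-of-the-envelope check shows the two reported figures are mutually consistent (both are third-party estimates, not official OpenAI numbers):

```python
# Sanity check of the article's comparison: 1,287 MWh for GPT-3
# training vs. 120 US homes powered for a year. Both figures are
# reported estimates quoted in the article, not measured data.
gpt3_training_mwh = 1287
homes_powered_one_year = 120

mwh_per_home_per_year = gpt3_training_mwh / homes_powered_one_year
print(f"Implied annual use per home: {mwh_per_home_per_year:.2f} MWh")
# → Implied annual use per home: 10.72 MWh
```

US residential consumption averages roughly 10–11 MWh per household per year, so the implied figure of about 10.7 MWh puts the comparison in the right ballpark.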
Earlier this month, OpenAI released its latest reasoning models, o3 and o4-mini. These new "o-series" models are described as the company's most advanced yet. They can answer questions using all of ChatGPT's tools: web browsing, Python coding, and image analysis. With the addition of custom user tools, OpenAI is inching closer to its goal of enabling ChatGPT to complete tasks independently.