How Saying “Please” and “Thank You” to AI Could Add Up to Big Bucks
News Update April 21, 2025 10:24 PM

OpenAI CEO Sam Altman recently caused a stir with a surprising revelation about the company’s operational expenses: people being courteous to ChatGPT is, it turns out, an expensive affair for the firm.

The conversation started lightheartedly on X (formerly Twitter), when a user posted the question: “How much OpenAI lost on electricity bills because people kept saying ‘please’ and ‘thank you’ to their models?”

Altman’s response was laconic and witty: “Tens of millions of dollars well spent,” he replied, adding, “You never know.” Although the exchange may seem like a joke at first glance, it points to something quite serious about how AI language models work.

The Power Behind AI Chat

Each time someone sends ChatGPT a message, even a simple “please” or “thank you,” the model generates a full response. That inference is computationally expensive, and computation translates directly into electricity usage.

The reason behind this huge energy usage comes down to the sheer complexity of modern AI systems. According to research by Goldman Sachs, the higher power demands of AI technology are largely the result of “the growing computational intensity of AI workloads, especially those used in generative AI and large language models.”


These advanced models require enormous amounts of computing power and storage to generate human-like content in real time. Analysts therefore forecast that data center power consumption will rise sharply in the coming years, with AI expected to account for nearly 19% of total data center power usage by 2028.

Altman’s comment soon went viral, sparking a wave of discussion about the real-world cost of our daily interactions with AI systems. The web community responded with its customary mix of technical knowledge and humor.

One user humorously proposed a simple fix: “I think it can be easily fixed with client-side code with ‘you’re welcome’ lol.” Another offered a more technical observation: “Attention scores are cached, so it only computes the attention from ‘thank you’ to all previous tokens.”
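That second comment refers to the KV cache used in transformer inference: the keys and values of already-processed tokens are stored, so a short follow-up message like “thank you” only needs attention computed for its own few tokens against the cached context, rather than reprocessing the whole conversation. A minimal toy sketch of that idea (single attention head, pure Python, all sizes and values invented for illustration):

```python
import math
import random

random.seed(0)
D = 8  # toy embedding size

def rand_vec():
    return [random.gauss(0, 1) for _ in range(D)]

def attend(query, keys, values):
    """One attention step: score the new token's query against every cached
    key, softmax the scores, and return the weighted mix of cached values."""
    scores = [sum(k[i] * query[i] for i in range(D)) / math.sqrt(D) for k in keys]
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    total = sum(weights)
    weights = [w / total for w in weights]
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(D)]

# Pretend a long conversation has already been processed: its keys and
# values sit in the KV cache and are never recomputed.
cache_keys = [rand_vec() for _ in range(1000)]
cache_vals = [rand_vec() for _ in range(1000)]

# A short "thank you" message adds only a couple of new tokens; each one
# attends once over the cache instead of reprocessing the whole history.
for tok in [rand_vec(), rand_vec()]:
    _ = attend(tok, cache_keys, cache_vals)
    cache_keys.append(tok)  # new token joins the cache for later turns
    cache_vals.append(tok)

print(len(cache_keys))  # cache grew by the 2 new tokens: 1002
```

Even with caching, every reply still triggers fresh generation work, which is why the polite messages are not free.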

From Ghibli Dreams to Gigawatt Demands: The Price of AI Popularity

To put that in perspective, the same Goldman Sachs research estimates that each ChatGPT-4 query consumes approximately 2.9 watt-hours of electricity, about ten times the energy of an average Google search. Given that OpenAI processes more than a billion requests per day, the firm’s daily energy consumption comes to roughly 2.9 million kilowatt-hours.
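The back-of-the-envelope arithmetic behind that figure is straightforward; the per-query and request-volume numbers below are the article’s estimates, not measured values:

```python
WH_PER_QUERY = 2.9                # watt-hours per ChatGPT-4 query (Goldman Sachs estimate)
QUERIES_PER_DAY = 1_000_000_000   # "more than a billion requests per day"

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY  # total watt-hours per day
daily_kwh = daily_wh / 1_000               # 1 kWh = 1,000 Wh

print(f"{daily_kwh:,.0f} kWh per day")  # about 2.9 million kWh
```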

This comes as ChatGPT sees record popularity. The AI assistant recently reached 150 million weekly active users, its highest figure so far in 2025. Usage has been driven in part by viral trends such as the creation of Ghibli-inspired AI art.

All these expenses notwithstanding, OpenAI continues to advance its AI technology at breakneck speed. It recently released two new models: o3, which achieved a whopping 69.1% on the SWE-bench coding benchmark, and o4-mini, which came close behind with a 68.1% score.

The history of ChatGPT’s energy consumption is a fascinating reminder that our tiniest virtual exchanges have physical, real-world consequences. Every polite exchange with an AI assistant requires physical resources—servers humming in data centers, cooling systems working overtime, and electricity coursing through circuits.

So the next time you say “please” or “thank you” to ChatGPT, remember that your good manners literally come at a cost, though according to Altman, it’s money well spent on building good human-AI relationships.

As AI technology continues to advance and becomes an ever larger part of our daily lives, sustainability and resource usage will be growing concerns for companies like OpenAI and for the tech industry at large.

© 2025 LIDEA. All Rights Reserved.