Your ‘Please’ and ‘Thank You’ Are Costing Millions!
New Delhi. Many people who use AI chatbots include courtesies such as ‘Please’ and ‘Thank you’ in their prompts, but these pleasantries are proving surprisingly expensive for AI companies.
In a revelation that’s both amusing and alarming, OpenAI CEO Sam Altman has disclosed that simple expressions of courtesy, like saying "please" and "thank you" to ChatGPT, are quietly contributing to the company’s multi-million-dollar operational costs. The comment came in response to a user on X (formerly Twitter) who half-jokingly wondered how much money OpenAI has spent on users being polite. Altman quickly responded: “Tens of millions of dollars well spent. You never know.”
When Politeness Costs Real Money
Although polite phrases may seem trivial in a human conversation, for AI, they add computational complexity. Every extra word requires the AI model to interpret and generate contextually appropriate responses, thereby increasing the processing load. This, in turn, ramps up energy consumption in the data centers that power the technology. The cost isn’t just financial. It’s environmental. According to a report by Goldman Sachs, each ChatGPT query consumes approximately 2.9 watt-hours of electricity, roughly 10 times that of a typical Google search. With over a billion queries handled daily, OpenAI’s daily energy usage balloons to 2.9 million kilowatt-hours. That’s enough to power over 100,000 Indian homes for a day.
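The figures above can be sanity-checked with simple arithmetic. The sketch below assumes the reported 2.9 watt-hours per query and one billion queries per day; the per-home daily consumption figure is a hypothetical round number chosen only for illustration.

```python
# Back-of-the-envelope check of the reported energy figures.
# Assumed inputs: 2.9 Wh per ChatGPT query (Goldman Sachs figure)
# and ~1 billion queries per day.

WH_PER_QUERY = 2.9
QUERIES_PER_DAY = 1_000_000_000

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_kwh = daily_wh / 1_000  # watt-hours -> kilowatt-hours

# Household comparison: assume ~25 kWh/day per home
# (a hypothetical average, used only to scale the result).
HOME_KWH_PER_DAY = 25
homes_powered = daily_kwh / HOME_KWH_PER_DAY

print(f"{daily_kwh:,.0f} kWh per day")
print(f"~{homes_powered:,.0f} homes powered for a day")
```

Under these assumptions the math works out to 2.9 million kWh per day, which is indeed enough for well over 100,000 homes.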
The energy demands of AI are quickly emerging as a major global concern. A study by the Electric Power Research Institute (EPRI) projects that AI data centers could consume up to 9.1% of U.S. electricity by 2030. Meanwhile, the International Energy Agency (IEA) predicts that data centers will drive more than 20% of the growth in electricity demand across advanced economies in the coming years. Already, these facilities account for approximately 2% of global electricity consumption, a figure likely to climb as AI becomes more deeply embedded in everyday life, from personalized assistants and healthcare diagnostics to creative writing and coding.
Interestingly, a 2024 survey found that 67% of U.S. users are polite to their chatbots, with 55% doing so because “it’s the right thing to do.” About 12% admitted their politeness stems from concern about an AI uprising, a scenario that remains firmly in the realm of science fiction, according to most researchers.
While the idea of being nice to a machine might sound quaint, the environmental reality is stark. One study found that generating a 100-word AI email consumes about 0.14 kWh of electricity. Multiply that by the tens of thousands of long-form prompts generated every day, and the cumulative impact becomes staggering. Put another way: if you sent one AI-written email per week for a year, you’d consume roughly 7.5 kWh of electricity.
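The yearly-email figure follows directly from the per-email number. A minimal check, assuming the cited 0.14 kWh per 100-word email and one email per week:

```python
# Yearly electricity cost of one AI-written email per week,
# using the reported ~0.14 kWh per 100-word generated email.
KWH_PER_EMAIL = 0.14
EMAILS_PER_YEAR = 52  # one per week

yearly_kwh = KWH_PER_EMAIL * EMAILS_PER_YEAR
print(f"{yearly_kwh:.2f} kWh per year")
```

The raw arithmetic gives about 7.3 kWh; the widely reported figure rounds this up to roughly 7.5 kWh.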
As artificial intelligence becomes more central to how we live, work, and communicate, even our most innocent digital habits carry hidden costs. That simple “thank you” to your chatbot might feel like good manners, but it’s also a tiny hit to the planet. In the grand scheme, the issue isn’t about cutting back on courtesy. It’s about building the infrastructure and energy systems that can support a future where conversations with machines don’t come with an ecological price tag.


