• Altman rejects viral water claims but...

    From Mike Powell@1:2320/105 to All on Mon Feb 23 13:05:38 2026
    Sam Altman says ChatGPT water use claims are 'completely untrue' - but admits AI energy use is a concern

    By Graham Barlow

    Altman rejects viral water claims - but admits AI's energy footprint is
    only getting bigger

    - Sam Altman dismisses claims about ChatGPT's water usage as "totally fake"
    - Experts warn that scaling AI infrastructure is driving huge costs and increasing pressure on power, cooling, and resources
    - The real issue isn't efficiency - it's whether AI can grow at this scale without serious environmental impact

    Speaking at an event hosted by The Indian Express, OpenAI CEO Sam Altman dismissed claims that AI's water usage is high as "totally fake", but he
    did acknowledge that it had been an issue in the past when "we used to do evaporative cooling in data centers."

    "Now that we don't do that, you see these things on the internet like,
    'Don't use ChatGPT, it's 17 gallons of water for each query' or
    whatever," Altman said. "This is completely untrue, totally insane, no connection to reality."

    You can find this segment at around 27 minutes in the video of the event:

    Sam Altman Unfiltered: ChatGPT, AI Risks & What's Coming Next, 40 Questions
    in 60 Minutes:
    https://youtu.be/qH7thwrCluM

    Altman did concede that concerns around AI's overall energy consumption are "fair", noting that "the world is now using so much AI" and that "we
    need to move towards nuclear or wind and solar very quickly".

    AI-specific data centers already leave a larger and more complex footprint than traditional facilities, and several groups have raised concerns about their environmental impact - particularly around rising electricity demand, water usage, and the construction of new infrastructure. That build-out is also having knock-on effects, including increased demand for components like RAM, which is pushing up prices across the industry.

    IBM CEO Arvind Krishna has previously raised doubts about whether the current pace and scale of AI data center expansion are financially sustainable. He estimates that equipping a single 1GW site with compute hardware now costs close to $80 billion - and with plans for nearly 100GW of capacity dedicated to advanced AI training, the total potential spend could approach a staggering $8 trillion.
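    Krishna's projection can be sanity-checked with simple arithmetic. A minimal sketch, using only the two figures reported above (both are his estimates, not independent data):

```python
# Sanity check of the reported estimates (both figures are Krishna's, as quoted above)
cost_per_gw = 80e9     # ~$80 billion of compute hardware per 1GW site
planned_gw = 100       # ~100GW of planned AI-training capacity

total_spend = cost_per_gw * planned_gw
print(f"Projected total: ${total_spend / 1e12:.0f} trillion")  # → Projected total: $8 trillion
```

    The headline number is just the straight product, so it is only as sound as the per-gigawatt estimate behind it.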

    Meanwhile, AI's new wave of ultra-powerful accelerators is pushing data centers to breaking point, forcing a rethink of power, cooling, and connectivity. Hardware that felt cutting-edge just a few years ago can't keep up, as modern AI workloads demand a complete overhaul of everything from rack design to thermal strategy.

    As well as dismissing claims about ChatGPT's water usage, Altman also offered
    a more unusual defense of OpenAI's overall energy use. He argued that discussions around AI's energy consumption were "unfair" because they
    don't account for how much energy it takes to train humans to perform similar tasks.

    "It also takes a lot of energy to train a human."
    Sam Altman, CEO OpenAI

    "But it also takes a lot of energy to train a human," Altman said. "It
    takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce
    you."

    He continued: "If you ask ChatGPT a question, how much energy does it take
    once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way."

    I can see the argument Altman is making - that human intelligence also comes with an energy cost - but it feels reductive, and faintly cynical, to reduce the value of a human life to its energy consumption. More importantly, it sidesteps the real issue. The question isn't whether humans also use energy
    (of course they do!) but whether scaling AI to billions of daily queries introduces entirely new levels of demand that we haven't had to account for before. Comparing the lifetime energy cost of a human to the marginal cost of an AI response might be provocative, but it's not especially useful.
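    To see why the comparison is provocative but not especially useful, here is a back-of-envelope version of Altman's human-vs-query calculation. Every input is an assumption chosen for illustration: a 2,000 kcal/day diet, the 20 years Altman cites, and roughly 0.3 Wh per query (a commonly circulated rough estimate, not an official OpenAI figure):

```python
# All inputs are rough assumptions for illustration only
KCAL_PER_DAY = 2000      # typical adult dietary intake
J_PER_KCAL = 4184        # joules in one kilocalorie
YEARS = 20               # Altman's "20 years ... before you get smart"
WH_PER_QUERY = 0.3       # oft-cited rough estimate for one ChatGPT query

# Lifetime "training" energy of a human, in watt-hours
human_wh = KCAL_PER_DAY * J_PER_KCAL / 3600 * 365 * YEARS

print(f"Human: ~{human_wh / 1e6:.0f} MWh")
print(f"Equivalent queries: ~{human_wh / WH_PER_QUERY / 1e6:.0f} million")
```

    Even on these crude numbers, one human's dietary energy works out to tens of millions of queries - which is exactly the point: a per-query figure that looks tiny against one lifetime says nothing about aggregate demand once you multiply it by billions of queries a day.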

    What Altman's comments highlight is a growing tension at the heart of the AI boom. The technology may be getting smarter and more efficient, but the scale at which it's being deployed is growing even faster, raising fresh concerns about its long-term environmental impact, including pressure on global water supplies. The UN has already warned that the world has entered an "era of global water bankruptcy," underlining just how fragile those resources have become.

    Those questions aren't going away. As AI adoption accelerates, the real challenge won't just be how efficient the technology becomes, but whether it can scale sustainably at all.


    https://www.techradar.com/ai-platforms-assistants/sam-altman-says-chatgpt-water-use-claims-are-completely-untrue-but-admits-ai-energy-use-is-a-concern

    $$
    --- SBBSecho 3.28-Linux
    * Origin: Capitol City Online (1:2320/105)
  • From Kurt Weiske@1:218/700 to Mike Powell on Tue Feb 24 07:16:19 2026
    Mike Powell wrote to All <=-

    "Now that we don't do that, you see these things on the internet like, 'Don't use ChatGPT, it's 17 gallons of water for each query' or
    whatever," Altman said. "This is completely untrue, totally insane, no connection to reality."

    Is he complaining about the statement or the accuracy of the number of
    gallons used to cool their infrastructure? It would be more comforting
    if he were able to say that 100% of their data centers use closed-loop
    cooling systems - but he didn't.

    As well as dismissing claims about ChatGPT's water usage, Altman also offered a more unusual defense of OpenAI's overall energy use. He
    argued that discussions around AI's energy consumption were "unfair" because they don't account for how much energy it takes to train humans
    to perform similar tasks.

    "It also takes a lot of energy to train a human."
    Sam Altman, CEO OpenAI

    "But it also takes a lot of energy to train a human," Altman said. "It takes like 20 years of life and all of the food you eat during that
    time before you get smart. And not only that, it took the very
    widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you."


    Just stay in your pod and feed the matrix, it'll be OK.

    He continued: "If you ask ChatGPT a question, how much energy does it
    take once its model is trained to answer that question versus a human?
    And probably, AI has already caught up on an energy efficiency basis, measured that way."

    I can see the argument Altman is making - that human intelligence also comes with an energy cost - but it feels reductive, and faintly
    cynical, to reduce the value of a human life to its energy consumption.

    Or equating the value of an AI against that of a sentient human being.

    We're all going to die.



    --- MultiMail/Win v0.52
    * Origin: http://realitycheckbbs.org | tomorrow's retro tech (1:218/700)
  • From Mike Powell@1:2320/105 to Kurt Weiske on Tue Feb 24 11:10:14 2026
    Is he complaining about the statement or the accuracy of the number of
    gallons used to cool their infrastructure? It would be more comforting
    if he were able to say that 100% of their data centers use closed-loop
    cooling systems - but he didn't.

    I would like to think the latter, but I also got the impression he meant the former, i.e., that he doesn't like the statement.

    Just stay in your pod and feed the matrix, it'll be OK.

    Sort of sounds like it.

    Or equating the value of an AI against that of a sentient, human being.

    We're all going to die.

    Unfortunately, that could be sooner than some of us expected, at this rate.

    $$
    --- SBBSecho 3.28-Linux
    * Origin: Capitol City Online (1:2320/105)