The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so routing all of ChatGPT’s reported 2.5 billion daily requests through the model would push energy usage as high as 45 GWh per day.

A daily energy use of 45 GWh is enormous: spread over 24 hours, it works out to an average draw of nearly 1.9 GW. A typical modern nuclear reactor has an electrical output of 1 to 1.6 GW, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the equivalent of two to three nuclear reactors, an amount that could be enough to power a small country.
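The arithmetic behind those figures can be checked in a few lines (all inputs are from the article):

```python
# Back-of-the-envelope check of the article's energy figures.
WH_PER_QUERY = 18          # URI estimate, Wh per GPT-5 query
QUERIES_PER_DAY = 2.5e9    # ChatGPT's reported daily requests

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # Wh -> GWh
avg_power_gw = daily_gwh / 24                      # average draw over 24 h, in GW

print(f"{daily_gwh:.0f} GWh/day")          # 45 GWh/day
print(f"{avg_power_gw:.2f} GW average")    # 1.88 GW, i.e. roughly two 1-1.6 GW reactors
```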

  • jsomae@lemmy.ml · 6 hours ago

    For reference, this is roughly equivalent to playing a PS5 game for 4 minutes (based on their estimate) to 10 minutes (their upper bound).

    Calculation:

    source https://www.ecoenergygeek.com/ps5-power-consumption/

    Typical PS5 usage: 200 W

    TV: 27 W - 134 W → call it 60 W

    URI’s estimate: 18 Wh / 260 W → 4 minutes

    URI’s upper bound: 48 Wh / 260 W → 10 minutes
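    The commenter's PS5 comparison reproduces as follows (wattages are the ones they took from the linked ecoenergygeek page):

```python
# PS5 + TV playtime equivalent to one GPT-5 query's energy.
PS5_W = 200   # typical PS5 draw, per ecoenergygeek
TV_W = 60     # commenter's pick from the 27-134 W TV range
TOTAL_W = PS5_W + TV_W   # 260 W combined

def minutes_of_play(query_wh):
    """Minutes of PS5+TV play consuming the same energy as one query."""
    return query_wh / TOTAL_W * 60

print(f"{minutes_of_play(18):.1f} min")   # URI estimate (~4 minutes)
print(f"{minutes_of_play(48):.1f} min")   # URI upper bound (~11 minutes)
```

    Note the upper bound comes out closer to 11 minutes than 10; the comment rounds down.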

    • MangoCats@feddit.it · 4 hours ago

      I was just thinking: in the more affordable electric regions of the US, that’s about $5 worth of electricity per thousand requests. You’d tip a concierge $5 for most answers you get from ChatGPT (if they could provide them…), and the concierge would likely spend that $5 on a gallon and a half of gasoline, which generates a whole lot more CO2 than the mixed nuclear / hydro / solar electrical generation in those reasonably priced regions.
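      A quick check of that dollar figure (the electricity rates below are assumptions for illustration, not from the comment):

```python
# Cost of a thousand GPT-5 queries at 18 Wh each, at assumed retail rates.
WH_PER_QUERY = 18
QUERIES = 1000
kwh = WH_PER_QUERY * QUERIES / 1000   # 18 kWh per thousand queries

# $5 per thousand queries implies 5 / 18 ~ $0.28/kWh; at a cheaper rate
# like $0.10/kWh the same thousand queries cost closer to $1.80.
for rate in (0.10, 0.28):
    print(f"${kwh * rate:.2f} per {QUERIES} queries at ${rate:.2f}/kWh")
```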