Google has released detailed metrics on the energy and water consumption of its Gemini AI assistant, claiming significant efficiency gains. The company estimates that a median text prompt uses about 0.24 watt-hours (Wh) of electricity, equivalent to watching TV for less than nine seconds, and emits 0.03 grams of carbon dioxide equivalent (gCO₂e).
Source: cbsnews.com
It also consumes approximately 0.26 milliliters of water, or about five drops, per prompt.
Source: cbsnews.com
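Those equivalences are simple unit conversions. As a rough sanity check, the arithmetic can be reproduced in a few lines of Python; the 100-watt TV and 0.05 mL-per-drop reference values below are common approximations assumed here, not figures from Google's report.

# Back-of-envelope check of the reported equivalences; reference values are assumptions.
prompt_energy_wh = 0.24   # median Gemini text prompt, per Google
tv_power_w = 100          # assumed power draw of a typical television
tv_seconds = prompt_energy_wh / tv_power_w * 3600
print(f"TV-equivalent viewing time: {tv_seconds:.1f} s")  # about 8.6 s, i.e. under nine seconds

prompt_water_ml = 0.26    # water per prompt, per Google
drop_volume_ml = 0.05     # assumed volume of a single drop
print(f"Drops per prompt: {prompt_water_ml / drop_volume_ml:.1f}")  # about 5 drops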
These figures are part of a broader study that highlights Google's efforts to standardize environmental impact measurements for AI models. The company attributes its improvements to advances in hardware, optimized algorithms, and better data center operations, achieving a 33x reduction in electricity consumption per prompt between May 2024 and May 2025.
Source: carboncredits.com
However, experts caution that such metrics may obscure the broader environmental impact of AI scaling. "You only see the tip of the iceberg," says Alex de Vries-Gao, who notes that location-based carbon emissions metrics, which account for local energy grid mixes, are often missing from such reports.
Source: theverge.com
Critics argue that Google's methodology is incomplete. The study relies on market-based carbon emission calculations, which take renewable energy commitments into account, but excludes location-based measures that reflect actual grid conditions. "This is the ground truth," says Shaolei Ren, who co-authored a paper cited by Google. Location-based metrics typically show higher emissions and provide more accurate local impact assessments.
Source: theverge.com
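To make the distinction concrete, here is a minimal sketch of the two accounting approaches; the grid carbon intensity and renewable-coverage figures are illustrative assumptions, not values from Google's study or Ren's paper.

# Illustrative comparison of the two carbon accounting methods (all numbers invented).
def location_based_g(energy_kwh, grid_intensity_g_per_kwh):
    # Location-based: energy times the average carbon intensity of the local grid.
    return energy_kwh * grid_intensity_g_per_kwh

def market_based_g(energy_kwh, grid_intensity_g_per_kwh, renewable_fraction):
    # Market-based: energy matched by contracted renewables is counted as zero-carbon.
    return energy_kwh * (1 - renewable_fraction) * grid_intensity_g_per_kwh

print(location_based_g(1.0, 400))     # 400 gCO2e for 1 kWh on a 400 g/kWh grid
print(market_based_g(1.0, 400, 0.9))  # 40 gCO2e if 90% is covered by renewable contracts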
Additionally, Google's water usage estimates are criticized for not accounting for indirect consumption tied to data centers, which Ren's research shows can reach up to 50 milliliters per prompt.
Source: theverge.com
Despite these concerns, Google maintains that its findings represent progress. The company claims its calculations include energy used by idle machines and supporting infrastructure like cooling systems, going beyond previous studies.
Source: cbsnews.com
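One way to read that claim is as a sum over components rather than a single chip measurement. The sketch below shows the general shape of such an accounting; the component values and the overhead factor are invented for illustration and are chosen only so that they total roughly the reported 0.24 Wh.

# Hypothetical breakdown of a comprehensive per-prompt energy figure (all values invented).
active_accelerator_wh = 0.14   # chip energy while actively serving the prompt
host_overhead_wh = 0.04        # CPU and memory on the serving machine
idle_capacity_wh = 0.04        # share of provisioned-but-idle machines attributed to the prompt
it_energy_wh = active_accelerator_wh + host_overhead_wh + idle_capacity_wh

pue = 1.09                     # assumed data center overhead factor (cooling, power delivery)
print(f"Total: {it_energy_wh * pue:.2f} Wh")  # about 0.24 Wh with these assumed numbers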
However, the report acknowledges a paradox: efficiency gains may lead to increased total emissions as AI adoption grows. "Even though each prompt is cheaper in energy terms, the combined usage continues to increase," notes the study, citing the Jevons paradox.
Source: carboncredits.com
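The arithmetic behind that paradox is straightforward: a 33x drop in energy per prompt is outweighed whenever prompt volume grows by more than 33x. The prompt volumes below are hypothetical and chosen only to illustrate the effect.

# Illustration of the Jevons paradox with invented prompt volumes.
energy_before_wh = 0.24 * 33   # implied per-prompt energy before the reported 33x reduction
energy_after_wh = 0.24         # per-prompt energy after
prompts_before = 1_000_000     # hypothetical daily prompts then
prompts_after = 50_000_000     # hypothetical daily prompts after 50x adoption growth

print(energy_before_wh * prompts_before / 1000)  # ~7,920 kWh per day before
print(energy_after_wh * prompts_after / 1000)    # 12,000 kWh per day after: total still rises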
The broader implications of these metrics are significant. Google's data centers consumed 30.8 million megawatt-hours of electricity in 2024, more than double 2020 levels, and its total greenhouse gas emissions rose 51% since 2019, with AI as a key driver.
Source: carboncredits.com
While the company has invested in nuclear power and clean energy contracts to mitigate this, experts warn that without stricter regulations, the environmental toll could escalate. "The precious few numbers we have may shed a tiny sliver of light on where we stand right now," says researcher Luccioni, adding that future AI usage patterns, such as voice-driven agents or 24/7 operations, could drastically increase energy demands.
Source: technologyreview.com
Google's disclosure marks a step toward industry transparency but leaves critical questions unanswered. As AI becomes more integrated into daily life, balancing innovation with sustainability will require addressing both per-query efficiency and systemic energy consumption trends. The company's report sets a benchmark for measurement standards, but experts stress that true accountability requires comprehensive metrics and regulatory frameworks to curb emissions growth as adoption accelerates.