AI's electricity use is spiking so fast the industry could soon consume as much power as an entire country
AI chatbots like OpenAI's ChatGPT and Google's Bard consume an astronomical amount of electricity and water — or, more precisely, the massive data centers that power them do.

In a recent analysis published in the journal Joule, data scientist Alex de Vries of Vrije Universiteit Amsterdam in the Netherlands found that by 2027, these server farms could use anywhere between 85 and 134 terawatt-hours of energy per year. That's roughly on par with the annual electricity use of Argentina, the Netherlands, or Sweden, as the New York Times points out, or 0.5 percent of the entire globe's energy demand.

Sound familiar? The much-lampooned crypto industry spiked past similar power consumption thresholds in recent years.

It's a massive carbon footprint that experts say should force us to reconsider the huge investments being made in the AI space — not to mention the eye-wateringly resource-intensive way that tech giants like OpenAI and Google operate.