The proliferation of Artificial Intelligence (AI), especially generative AI, is creating a significant surge in global electricity demand. With AI models now serving millions of people daily, their energy consumption is becoming comparable to that of whole countries. For instance, training the BLOOM model was found to consume 433 MWh and training GPT-3 an estimated 1,287 MWh. These figures translate to enough energy to supply roughly 43,000 and 128,700 households, respectively, for a day. Furthermore, data centers accommodating AI operations demand considerable resources, consuming over 1% of the world's electricity along with enormous amounts of water.
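The household equivalence above is a straightforward unit conversion. A minimal sketch, assuming an average household uses about 10 kWh of electricity per day (an assumption consistent with the article's figures, not stated in it):

```python
# Back-of-envelope conversion: training energy (MWh) -> household-days.
# ASSUMPTION: a typical household uses ~10 kWh of electricity per day.
KWH_PER_HOUSEHOLD_PER_DAY = 10

def households_powered_for_a_day(training_mwh: float) -> int:
    """How many households the given energy could supply for one day."""
    kwh = training_mwh * 1_000  # 1 MWh = 1,000 kWh
    return round(kwh / KWH_PER_HOUSEHOLD_PER_DAY)

print(households_powered_for_a_day(433))   # BLOOM training  -> 43300
print(households_powered_for_a_day(1287))  # GPT-3 training  -> 128700
```

Under that per-household assumption, the two training runs come out to about 43,300 and 128,700 household-days, in line with the range quoted above.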
The International Energy Agency (IEA) warns of the escalating impact of data centers on global electricity consumption and is urgently calling for sustainable practices within the AI industry. Predictions suggest that by 2030, data centers in the US and China could consume as much electricity annually as is generated by 80 to 130 coal-fired power plants.
These high demands strain power grids, with some companies forced to delay the decommissioning of coal-fired plants to meet the power needs of new data centers and electric-vehicle battery factories. Such retention of fossil-based power facilities counters global ambitions for a sustainable, net-zero future.
As such, alternatives to today's energy-hungry AI models are being explored. Neuromorphic AI chips, which mimic synaptic functions, and 'organoids', bio-engineered mini brains, show promise in performing complex AI tasks while consuming far less power. An 'AI tax' is also under discussion, whereby entities profiting from AI advancements would contribute to offsetting their environmental impact. Nonetheless, the feasibility and efficacy of these solutions remain uncertain, as does the extent to which the burden will fall on individuals. The world is on a precipice, with growing AI power demands clashing with sustainability goals, and it is crucial that viable solutions are developed urgently.