The recent explosion of applications of artificial intelligence (AI) has led to many debates. The potential is huge, but the societal threats (from students cheating to machines replacing us at work) are just as significant. However, there’s another potential risk that’s been far less discussed: energy. As it turns out, the new generation of flashy AIs use much more energy than you’d think.
Everything we do online, from generating images to creating a text prompt, relies on information stored on servers, and those machines, stacked together in data centers, require a lot of energy to both run and maintain. Around the planet, data centers account for about 1% of global electricity use. And as we start using AI even more, that number will likely go up.
In a new study, which has not yet been peer-reviewed, a team of researchers from AI developer Hugging Face and Carnegie Mellon University looked at how much power artificial intelligence tools need to do a variety of tasks. They found that generating an image using a powerful AI model requires as much energy as fully recharging the average smartphone.
The researchers looked at the emissions linked with the 10 most popular AI tasks on the Hugging Face platform, such as creating an image or a text. For each task, they ran experiments on 88 different models and measured the power drawn with an energy-tracking tool. They also estimated the emissions generated by doing these tasks, using multiple models.
Creating images was found to be the most polluting activity. One thousand images generated with an AI model create emissions comparable to driving 6.5 kilometers in an average car, the researchers found. Meanwhile, creating text was the least polluting activity, with one thousand text generations producing emissions equivalent to driving about 0.0009 kilometers in a similar vehicle.
The team also found that the emissions of large generative models are much higher than those of smaller AI models created for specific tasks. This is because generative AI models try to do many things simultaneously instead of just one task. The researchers told MIT Technology Review that classifying movie reviews with a generative model consumes 30 times more energy than using a smaller model.
A recent study found that the AI industry could consume as much energy as a country the size of the Netherlands by 2027. This projection assumes certain parameters hold steady, such as the rate of growth of AI, the availability of chips, and servers working at full capacity. Under those conditions, AI would consume electricity in the range of 85-134 terawatt-hours (TWh) per year, the researchers found.
Overall, the new findings are a reminder that the AI industry’s carbon footprint will remain a significant problem, especially as the effects of the climate crisis intensify. The researchers told MIT Technology Review that they hope their study will encourage people to choose more specialized, less carbon-intensive models whenever possible.