
As AI integration soars on a global level, the amount of energy consumed by large language models is also exploding.
Everyone wants to be at the forefront of the AI revolution, but far fewer are volunteering to figure out how to source all that energy sustainably – and fewer still are willing to pay for its skyrocketing resource needs.
In fact, powering data centers has become such a tricky business that some AI moguls have been turning to illegal channels to keep the electricity flowing.
Last week, the Environmental Protection Agency (EPA) ruled that Elon Musk’s artificial intelligence company xAI has been illegally powering its massive Tennessee data centers with a fleet of methane gas turbines.
The turbines were operating without required air quality permits, but the company has been fighting the charges for a year and a half, claiming its operations were exempt.
The company had identified and exploited a loophole in the local county’s provisions, which allowed it to operate the generators without a permit as long as it moved the portable turbines once a year.
However, the EPA revised these rules to close the loophole and hold xAI accountable for its considerable emissions.
“At full capacity, xAI’s Colossus 1 data center uses 150 megawatts of electricity – enough power to run 100,000 homes – and the company plans to expand,” states a recent report from The Guardian.
The firm now operates 12 permitted turbines, but at one point xAI was running 35 unpermitted mobile generators. The ruling has been celebrated in Memphis, where community activists have been fighting the turbines for well over a year.
“Our communities, air, water, and land are not playgrounds for billionaires chasing another buck,” said Abre’ Conner, NAACP director of environmental and climate justice. The NAACP is behind a July lawsuit against xAI that helped bring the Clean Air Act violations to the EPA’s attention.
While this win is a great achievement for Memphis communities, a much larger problem persists.
Large language models’ electricity needs are mammoth and growing all the time, and finding the financial, physical, and energy resources to power them will remain a problem.
In many ways, the artificial intelligence sector is the Wild West – regulators are tripping over their own feet trying to keep up with the technology’s rapid spread, and xAI is likely not the only company hunting for loopholes and operating in legal and ethical gray areas.
Moreover, all that energy is really, really expensive, and Big Tech is doing everything it can to avoid footing the bill.
In California, Silicon Valley recently managed to block new rules that would have tightened regulation of data centers and protected ordinary ratepayers from carrying the financial burden of their growing energy use.
Instead of introducing forward-thinking but overdue legislation to control data centers’ impact on Californians, influential Big Tech representatives helped water the bill down to a law requiring regulators to write a report on the issue by 2027.
“The measure began as a plan to give data centers their own electricity rate, shielding households and small businesses from higher bills,” CalMatters recently reported. Paraphrasing critics, the article added: “It amounts to a ‘toothless’ measure, directing the utility regulator to study an issue it already has the authority to investigate.”
Top image via x.ai/colossus
Read rest at OilPrice