The AI revolution comes with a huge utility bill

Artificial Intelligence dominated headlines last year and continues to do so this year, albeit for varied reasons. Last year’s focus was on its capabilities, as organizations and individuals began to understand how AI could enhance their personal and professional lives.
Amid this excitement, doomsday predictions about AI’s potential to obliterate humanity also emerged (thanks, James Cameron and Cyberdyne Systems).
There were amusing yet absurd trends, such as the viral phenomenon of people across the United States turning themselves into action figures. While some of these trends were entertaining, most were not. The bottom line is that we are witnessing unprecedented growth, usage, and resource consumption comparable to the Industrial Revolution — perhaps we should call it the Intelligence Revolution.
This year, we already see the impact of millions of professionals harnessing AI’s power. AI is not magic; it relies on tangible technology and vast resources, including servers, electricity, water, memory, chips, and data centers. When 50 million people tap into massive computing power to transform themselves into action figures, resources become strained. This is our reality.
As resources become scarce, prices inevitably rise. I wish I had paid more attention in Economics class. In my time at UF, Eco101 was available on VHS, making the university an early adopter of virtual learning, as I was mostly virtual in 1992. Regardless, we will dive into the essential resources, starting with water. A massive data center can consume up to 5 million gallons a day.
According to the Lincoln Institute of Land Policy, Texas data centers consumed approximately 50 billion gallons of water last year. At a smaller scale, a 20- to 40-query ChatGPT conversation consumes the equivalent of a 16-ounce bottle of water. The scale of water usage, both small and large, is staggering.
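To connect the small number to the big ones, here is a rough back-of-the-envelope sketch in Python that uses only the figures cited above; the per-conversation bottle and the 50 million action-figure users are the column's illustrative round numbers, not independent measurements.

```python
# Rough, back-of-the-envelope water math using only the figures cited in this column.
# Assumptions (the column's illustrative round numbers, not measurements of my own):
#   - a 20- to 40-query ChatGPT conversation ~ one 16-ounce bottle of water
#   - 50 million people each running one action-figure conversation
#   - a massive data center can consume up to 5 million gallons of water per day
#   - Texas data centers consumed roughly 50 billion gallons last year

OUNCES_PER_GALLON = 128
gallons_per_conversation = 16 / OUNCES_PER_GALLON      # 0.125 gallons per conversation
conversations = 50_000_000
big_center_gallons_per_day = 5_000_000
texas_gallons_per_year = 50_000_000_000

craze_total = conversations * gallons_per_conversation
print(f"50M action-figure conversations: {craze_total:,.0f} gallons")
print(f"That is {craze_total / big_center_gallons_per_day:.2f} days of a massive data center's water use")
print(f"Texas data centers averaged about {texas_gallons_per_year / 365:,.0f} gallons per day last year")
```

By that rough math, a single viral craze works out to about 6.25 million gallons, more than a massive data center's entire daily draw.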
Next, let’s discuss electricity.
Data centers in the United States are estimated to consume 5% to 10% of the nation's electricity. In the spirit of an economics lesson, let's clarify what data centers are: large facilities housing rows of servers, data storage systems, and networking equipment, along with the power and cooling systems essential for their operation.
This infrastructure underpins companies providing digital services. Every time you send an email, stream a show, save a photo to “the cloud,” or engage with a chatbot, you’re interacting with a data center.
I expected Florida to be a leader in data centers (as we are in elections, football, and vacations), but it turns out Virginia (600+), Texas (300+), and California (300+) are the top states for data centers in the U.S. There are approximately 5,000 data centers nationwide, accounting for 40% of the global market, while Florida has approximately 100 operational centers (Data Center Map).
Note that many companies claim to have a “data center,” but simply having a server in a closet doesn’t qualify. However, a large data center may soon be coming to Florida, potentially boosting our ranking (Developer Plans Data Center).
Moving on to memory and chips (not to be confused with the CHiPs TV intro).
A memory chip is an integrated circuit containing millions or even billions of tiny transistors and capacitors, used to store, retrieve, and manage data in electronic devices. These chips provide either RAM (short-term) or Flash/ROM (long-term) memory, and they are essential for running software, storing files, and enabling the technology we use every day.
Data centers require substantial memory, particularly high-bandwidth memory (HBM) for AI chips from Nvidia, AMD, and Google, and that demand is currently straining global supply. Prices for computer memory are expected to rise by more than 50% in early 2026, and we are already witnessing this increase.
HBM chips are far more complex than the RAM used in consumer laptops and smartphones. Designed for high-bandwidth workloads, HBM is fabricated through an intricate process that stacks 12 to 16 memory layers into a single package, forming a "cube."
Every bit of HBM that a company like Micron manufactures comes at the expense of more conventional memory for other devices.
As you shop for new PCs, be prepared for price increases of around 25%. In my 20+ years in technology, I have seen a spike like this only once, when production fell during the pandemic. This situation is different, and its effects will be felt throughout the year (PCWorld).
Now let's address rare earth metals, an issue that seemed like a 2024 problem but hasn't gone away; it has just become more manageable. Rare earth elements (REEs) are a group of 17 metallic elements crucial for high-tech applications, including electric vehicles, wind turbines, and electronics. The core problem is that China accounts for more than half of the global supply.
In the United States, companies are seeking alternatives. For instance, Mosaic in Florida is exploring waste mining to reclaim REEs from the leftover materials of phosphate mining, and it is also investigating alternative uses for that material, including road construction. Although the process appears costly, we hope it will yield results as we explore options beyond importing.
In Tallahassee, I discussed AI with Eduardo Gonzalez Loumiet, a partner at Ruvos, a global software company operating data centers worldwide. He stated, “The question isn’t whether artificial intelligence is coming; it’s already here, and its use will only continue to accelerate. As a region anchored by education, government, and an increasingly innovative private sector, we must be honest about what that reality demands. Widespread AI adoption requires real investment in infrastructure — power, connectivity, data capacity, and resilience — so our institutions can use these tools responsibly and at scale.
“We already see this firsthand. At Ruvos, artificial intelligence is being applied in health care not only across Florida but also globally, supporting public health, clinical systems, and decision-making at unprecedented levels. If Tallahassee wants to remain competitive and effective as AI becomes foundational to how work gets done, we must invest now in the infrastructure that supports both innovation and long-term sustainability.”
Thank you, Eddie, for your insights and your service as Chair of the Tallahassee Chamber of Commerce.
AI may lead to short-term computational and business challenges, but it's not the apocalyptic scenario that John and Sarah Connor warned us about. While they predicted AI might destroy the world, what we are experiencing is a drain on our resources. We should consider slowing down the use of ChatGPT for action-figure creations and resist asking Copilot to compile all of our 2025 emails into a summary, as these actions feed the domino effect behind the digital mini-apocalypse we are experiencing today. A correction is possible, and balance can be restored.
We — all of us — must work towards this, remembering Sarah Connor’s words:
"There is no fate but what we make."
We’d better get to work to ensure we harness our human intelligence effectively in this digital intelligence revolution, leaving the action figures to Hasbro.