Adaire Fox-Martin understands the needs of Big Tech. Prior to becoming CEO of Equinix (No. 446 on the Fortune 500) last year, she held senior roles at Google, SAP and Oracle. Now, the Irish-born former teacher is driving the expansion of the world’s largest global data center network, with 273 data centers in 36 countries. Fox-Martin recently spoke with Fortune about what she learned in her first year in the job and where she wants to go from here.
This interview has been edited and condensed for clarity.
We last met when you were starting out in the role.
It’s been an incredible year of learning and realizing that this job doesn’t come with an instruction manual. You bring the experiences that you’ve had in the past to the decisions that you make for the company for the future. We’ve laid out the strategy and distilled it into 10 simple words. The first of those is “build bolder,” which is how we’re designing and constructing the infrastructure that underpins the digital economy.
The second part of our ten-word strategy is “solve smarter.” This is about how we abstract the complexity of networking and architecture, which is our secret sauce, and render that for our customers, making Equinix the Easy button. The third piece is to “serve better.” Most participants in the data center industry have five or six customers; we have more than 10,000 enterprise customers. So those are the three pillars.
What are the other four words?
Underpinning that, we have “run simpler,” which sounds easy to say and is very hard to do. You’re taking complexity out of your business, looking at systems and processes. And the last piece is our people piece, which is to “grow together,” growing our business with our customers, linking our employee success to our customer success.
Is that a big change?
Equinix has been a company in this segment for 27 years, so we’re one of the long-term players in this industry. And in the next five years, we’re planning to bring on as much capacity as we did in the last 27 years. That’s a big capital investment for us.
Where do you sit in the data-center ecosystem?
I think there’s a general trend to think of data centers as a homogeneous mass, a singular thing. But there are four distinct categories of data centers, and each one has its own nuances and characteristics. We exist in one of those categories. There’s the hyperscale category, the ones built by cloud-service providers, where you see massive investment. The second category is wholesale, where you’re usually building a facility to lease back to one tenant, maybe two, usually supporting (AI) training. The third is enterprise, where big companies like banks want to have their own data center infrastructure. And the fourth category is colocation, which is where Equinix sits.
And what are the advantages of that?
Think of us a little like an airport authority. It manages the runways and the facilities of the airport and gives you the ability to rent ticketing and other kinds of facilities in there. Then it manages the process of passenger engagement, so an airline comes in, like KLM, drops a passenger, and then magic happens in the background to move that passenger and their luggage to United to go on to California. We’re a little bit like the airport authority of the internet: a data package comes into Equinix and then moves on to its next destination. The difference between us and an airport authority is that the airlines will compete, whereas a lot of our customers colocate so they can collaborate.
What do you do in terms of AI workloads?
We do both training and inference. A pharmaceutical company would do their training privately at Equinix because in the pharma world much of their research and drug discovery processes have to go through private models for regulatory reasons or intellectual property protection. Training is like teaching the model, and inference is really putting what the model has learned to work.
What about the energy needs?
The different types of data centers have different characteristics when it comes to energy, who they’re serving, or how they’re supporting local economies and communities.
We’re smack bang in the middle of what I would describe as an energy super cycle. Data centers are one component of it, but so is the electrification of everything. You have the speed of an AI meeting the pace of utilities, and it’s a headfirst collision. We don’t think it’s an insurmountable challenge but it’s going to require collaboration, innovation and time.
How do you see it playing out?
Between now and 2028, it’s fair to say there is a power crunch. Anything that we’re delivering until 2028, we understand where our power will come from. From 2028 to 2032, you’ll see an innovation click into the power landscape, in the form of data centers and data center operators looking at how they can self-generate, how they can generate on site, how they can innovate with the grid, and give power back to the grid, how they can be flexible on and off the grid. You’ll see different aspects of innovation, including nuclear, looking at small modular reactors and how they can be utilized.
From 2032 on, the utilities have introduced some changes. In the past, you would go to a utility and say, ‘I want this much power here in this time frame’: just-in-time power provision. For someone like us, which doesn’t have the same power draw as a hyperscale data center, that was usually good enough. But utilities are looking at their power framework in the form of cluster studies, taking a group of requirements together in a cluster at the same time. You define the load that you’re going to ramp up to, and it will likely take the form of take-or-pay: if you said you’re going to use this much, you will pay for it, whether you use it or not.
It’s important that large energy users, like data centers, pay a premium for what they’re utilizing so that we don’t impact small ratepayers, small energy users, so there’s a lot happening around collaboration. We’ve got a 27-year history of that kind of collaboration with the utilities and so we’re very involved in a number of those processes.
Talk about the challenge of building these centers.
One is supply chain, the things that are needed to construct a data center, some of which have been subject to tariffs. In the short term, that’s not an issue but longer term, that may become something that we have to navigate our way through. And then there’s the workforce, the plumbers and mechanical engineers and welders who are maintaining our environments that keep the internet up. A lot of trade skills, construction skills and technical skills are necessary to create the data center.
Are the centers you’re building for these workloads any larger than the ones that you built in the past?
We do support our hyperscaler partners with the provision of data centers, through a vehicle called xScale, which is a joint venture. We have partners who fund our joint ventures, so we do participate in what I described as the wholesale economy by building what’s called a build-to-suit data center for a hyperscaler. So a Google would come to us and say, ‘Do you guys have power and land in location X? And would you build for us?’ We do that through a joint venture off our balance sheet because the capital-intensive nature of that is high. We own 25% of our Americas JV and we own 20% of our EMEA and our APAC JVs. We have 15 centers that are already operational around the globe.
What do you think is underappreciated about your business model?
I think the connectivity of Equinix is underappreciated. We have 270 data centers around the world, so we’re the world’s largest independent data center operator that’s still a public company. People see the physical manifestations of those centers, but the secret sauce is the connections that sit in every single one of those data centers. They take three forms. First is the ability to interconnect a company to another company. We have the trading hubs: 72% of the world’s trading platforms operate on Equinix. You have a trading hub and all their partners located closely to them that need to be literally connected so there’s no latency between the transactions. We have 492,000 deep interconnections between the companies that operate in our centers, between value chains.
The second piece of connectivity is to do with the clouds. They are an exceptionally important part of the technology landscape. Many customers store their data in clouds, and most customers store their data in more than one cloud. They spread the love. We have a 35% market share in native cloud on-ramps from our data centers. So you can pop into the cloud, get your data and bring it back.
And then the third piece is physically where we’re located. We’re not in the middle of the country. We are in cities, where human beings are with their devices. So many people refer to us as the metro edge, the city edge, the edge where people actually are. So we can connect the cloud, via the metro edge where humans are, to the far edge where devices might be utilized.
Do you think people appreciate the role that data centers play in their lives?
In many countries, we are designated as critical infrastructure, in certain states, too, but not at the federal level. When I think about moving home: water, gas, electricity, internet becomes that fourth utility. And 95% of internet traffic runs through the Equinix environment. If you were on a Zoom call this morning, if you did a stream from any of the major providers, ordered an Uber, purchased a train ticket, you were on a platform accessing Equinix at some point.
What are you seeing in terms of customer trends?
Many of our customers are moving from the proof-of-concept phase of AI into the real-world-application phase of AI. There’s a lot to grapple with in that. It isn’t just about taking a business process and putting AI over the top of it. There are a whole series of considerations around governance and the management of data that haven’t really played into the business picture yet that are very real, especially for industries that are highly regulated.
That’s why some have not even adopted that much AI.
Right. Even if they are frontrunners, now it’s kind of like coming back and saying, ‘Oh, how do we make sure that we’re auditable, traceable, accountable, all of the things that are good governance for business? If we’re going to deploy a technology that can automate so many things and take my human out of the loop, how do I report, manage, and maintain the governance framework of those processes in my business?’
We’re seeing a lot of pushback in local communities where these mega hyperscale data centers are being built. How are you staking your claim to say we’re not that, but this is still critical infrastructure we need?
You look at it through the lens of what are the good things that a data center can do for a local community. We engage very strongly with local communities when we are beginning a construction. You do bring jobs to the area, particularly in the construction phase, less so in the operation phase, because there isn’t a preponderance of humans across a data center. Second, you’re obviously going to pay tax in that location, and that has a knock-on benefit. Thirdly, we employ and source locally. I’m very excited about our apprenticeship scheme, where young women and men who maybe didn’t have a formal education path can become data-center technicians or critical facility engineers. And when there’s a build of a data center, there’s often an upgrade of the infrastructure around it, whether that’s the power capabilities, the roads and so on.
Are people asking more questions about water, energy?
For sure. And we recognize that these are extremely important parts of the life system of our planet. We were the first data center operator to begin reporting on our water usage. When you bring in power, you want to maximize the use of that energy in the deployment of workloads for customers, and not just in powering the data center itself. We measure our power and how effective we are in using power. The best way to save energy is to use less of it. That’s absolutely an industry standard now.
And water?
Water was never at the same level of investigation or scrutiny as power was. Now, there’s a measure of water-usage effectiveness and we were one of the first to report on that. It’s not as standardized as power and so we’re working in the industry to try and standardize that a little bit more.
In the longer term, data centers will more than likely be cooled by liquid cooling, as opposed to air or evaporative cooling. And liquid cooling, in terms of water use, is a closed-loop system. You’re reusing the same water over and over again to cool the chips. The technology itself will become a determinant of sustainability.
All the big tech companies are working to make these models smaller and more efficient. Eventually, they’re going to want to have many little data centers that are colocated. Do you think you’ll benefit from that?
We believe the inference target addressable market, combined with the network, is about $250 billion outside of what the clouds are doing. By 2029, the inference opportunity will be twice the size of training. And that’s why we’re setting ourselves up for this opportunity.
You can think about training as a centralized AI motion, whereas inference is very much a distributed motion. It will initiate on a device, or maybe through voice, or glass, or whatever the device is. And it will probably have an agent conducting the orchestra, instructing other agents to get data from more than one location. That’s why we’ve been very selective about where we build.
You came to this job from Google almost a year and a half ago. Where are you now versus what you were thinking when you came in?
I would say on a journey, not at the destination but heading in the right direction. I’m confident that we have such a unique combination of characteristics—the metro locations, the connectivity, the secret sauce—that we’re ready for prime time. I’m working through the dynamics of some of the negative feelings around data centers. The challenge around energy has been very real in Europe, in particular. There are countries that have issued a moratorium on data-center builds, like Ireland, my home country, until they can kind of take a breath and understand what they can do. These problems are absolutely addressable. They’re absolutely surmountable. It’s a time-based issue that’s going to require collaboration and innovation to solve.
What about the regulatory environment? That’s been in flux.
There is a lot of noise on a variety of topics. I’m just working to control the controllable, and carry on the path that we believe is the right one for us. For example, Equinix has some goals around our sustainability narrative. By 2030, we set a goal for ourselves that we would be neutral as it relates to the use of carbon. We’re still on that track. And we’ve set a science-based goal for 2040 to be net zero, and we will continue to innovate and work to do that.
It’s not just that we believe there is an opportunity for technology and innovation to coexist with good environmental stewardship; our customers are continuing to ask us for reports on how their usage at Equinix is impacting the things that we measure.
There’s a lot of “what” about AI. What will it do? But there’s also a “where” about AI, and we’re the where of AI. There are physical cables, even under the ocean, and cable trays and billions of wires. If you’re in California, you get to see the history of data centers. The internet will literally be above your head. We have three decades of data center history, from our very first one to our latest one. I never thought I would come into a company where we have 56 active construction projects all around the world.
In 2021, Mark Zuckerberg recast Facebook as Meta and declared the metaverse — a digital realm where people would work, socialize, and spend much of their lives — the company’s next great frontier. He framed it as the “successor to the mobile internet” and said Meta would be “metaverse-first.”
The hype wasn’t all him. Grayscale, the investment firm specializing in crypto, called the metaverse a “trillion-dollar revenue opportunity.” Barbados even opened an embassy in Decentraland, one of the worlds in the metaverse.
Five years later, that bet has become one of the most expensive misadventures in tech. Meta’s Reality Labs division has racked up more than $70 billion in losses since 2021, according to Bloomberg, burning through cash on blocky virtual environments, glitchy avatars, expensive headsets, and a user base of approximately 38 people as of 2022.
For many people, the problem is that the value proposition is unclear; the metaverse simply doesn’t yet deliver a must-have reason to ditch their phone or laptop. Despite years of investment, VR remains burdened by serious structural limitations, and for most users there’s simply not enough compelling content beyond niche gaming.
A 30% budget cut
Zuckerberg is now preparing to slash Reality Labs’ budget by as much as 30%, Bloomberg said. The cuts—which could translate to $4 billion to $6 billion in reduced spend—would hit everything from the Horizon Worlds virtual platform to the Quest hardware unit. Layoffs could come as early as January, though final decisions haven’t been made, according to Bloomberg.
The move follows a strategy meeting last month at Zuckerberg’s Hawaii compound, where he reviewed Meta’s 2026 budget and asked executives to find 10% cuts across the board, the report said. Reality Labs was told to go deeper. Competition in the broader VR market simply never took off the way Meta expected, one person said. The result: a division long viewed as a money sink is finally being reined in.
Wall Street cheered. Meta’s stock jumped more than 4% Thursday on the news, adding roughly $69 billion in market value.
“Smart move, just late,” Craig Huber of Huber Research told Reuters. Investors have been complaining for years that the metaverse effort was an expensive distraction, one that drained resources without producing meaningful revenue.
Metaverse out, AI in
Meta didn’t immediately respond to Fortune’s request for comment, but it insists it isn’t killing the metaverse outright. A spokesperson told the South China Morning Post that the company is “shifting some investment from Metaverse toward AI glasses and wearables,” pointing to momentum behind its Ray-Ban smart glasses, which Zuckerberg says have tripled in sales over the past year.
But there’s no avoiding the reality: AI is the new obsession, and the new money pit.
Meta expects to spend around $72 billion on AI this year, nearly matching everything it has lost on the metaverse since 2021. That includes massive outlays for data centers, model development, and new hardware. Investors are much more excited about AI burn than metaverse burn, but even they want clarity on how much Meta will ultimately be spending — and for how long.
Across tech, companies are evaluating anything that isn’t directly tied to AI. Apple is revamping its leadership structure, partially around AI concerns. Microsoft is rethinking the “economics of AI.” Amazon, Google, and Microsoft are pouring billions into cloud infrastructure to keep up with demand. Signs point to money-losing initiatives without a clear AI angle being on the chopping block, with Meta as a dramatic example.
On the company’s most recent earnings call, executives didn’t use the word “metaverse” once.
HHS billed the plan as a “first step” focused largely on making its work more efficient and coordinating AI adoption across divisions. But the 20-page document also teased some grander plans to promote AI innovation, including in the analysis of patient health data and in drug development.
“For too long, our Department has been bogged down by bureaucracy and busy-work,” Deputy HHS Secretary Jim O’Neill wrote in an introduction to the strategy. “It is time to tear down these barriers to progress and unite in our use of technology to Make America Healthy Again.”
The new strategy signals how leaders across the Trump administration have embraced AI innovation, encouraging employees across the federal workforce to use chatbots and AI assistants for their daily tasks. As generative AI technology made significant leaps under President Joe Biden’s administration, he issued an executive order to establish guardrails for its use. But when President Donald Trump came into office, he repealed that order, and his administration has sought to remove barriers to the use of AI across the federal government.
Experts said the administration’s willingness to modernize government operations presents both opportunities and risks. Some said that AI innovation within HHS demanded rigorous standards because it was dealing with sensitive data, and questioned whether those would be met under the leadership of Health Secretary Robert F. Kennedy Jr. Some in Kennedy’s own “Make America Healthy Again” movement have also voiced concerns about tech companies having access to people’s personal information.
Strategy encourages AI use across the department
HHS’s new plan calls for embracing a “try-first” culture to help staff become more productive and capable through the use of AI. Earlier this year, HHS made the popular AI chatbot ChatGPT available to every employee in the department.
The document identifies five key pillars for its AI strategy moving forward: creating a governance structure that manages risk; designing a suite of AI resources for use across the department; empowering employees to use AI tools; funding programs to set standards for the use of AI in research and development; and incorporating AI in public health and patient care.
It says HHS divisions are already working on promoting the use of AI “to deliver personalized, context-aware health guidance to patients by securely accessing and interpreting their medical records in real time.” Some in Kennedy’s Make America Healthy Again movement have expressed concerns about the use of AI tools to analyze health data and say they aren’t comfortable with the U.S. health department working with big tech companies to access people’s personal information.
Experts question how the department will ensure sensitive medical data is protected
Oren Etzioni, an artificial intelligence expert who founded a nonprofit to fight political deepfakes, said HHS’s enthusiasm for using AI in health care was worth celebrating but warned that speed shouldn’t come at the expense of safety.
“The HHS strategy lays out ambitious goals — centralized data infrastructure, rapid deployment of AI tools, and an AI-enabled workforce — but ambition brings risk when dealing with the most sensitive data Americans have: their health information,” he said.
Etzioni said the strategy’s call for “gold standard science,” risk assessments and transparency in AI development appear to be positive signs. But he said he doubted whether HHS could meet those standards under the leadership of Kennedy, who he said has often flouted rigor and scientific principles.
Darrell West, senior fellow in the Brookings Institution’s Center for Technology Innovation, noted the document promises to strengthen risk management but doesn’t include detailed information about how that will be done.
“There are a lot of unanswered questions about how sensitive medical information will be handled and the way data will be shared,” he said. “There are clear safeguards in place for individual records, but not as many protections for aggregated information being analyzed by AI tools. I would like to understand how officials plan to balance the use of medical information to improve operations with privacy protections that safeguard people’s personal information.”
Still, West said, if done carefully, “this could become a transformative example of a modernized agency that performs at a much higher level than before.”
The strategy says HHS had 271 active or planned AI implementations in the 2024 fiscal year, a number it projects will increase by 70% in 2025.
Big Tech’s AI arms race is fueling a massive investment surge in data centers, with construction labor commanding a premium.
Despite some concerns of an AI bubble, data center hyperscalers like Google, Amazon, and Meta continue to invest heavily in AI infrastructure. In effect, construction workers’ salaries are being inflated to satisfy a seemingly insatiable AI demand, experts tell Fortune.
In 2026 alone, upwards of $100 billion could be invested by tech companies into the data center buildout in the U.S., Raul Martynek, the CEO of DataBank, a company that contracts with tech giants to construct data centers, told Fortune.
In November, Bank of America estimated global hyperscale spending is rising 67% in 2025 and another 31% in 2026, totaling a massive $611 billion investment for the AI buildout in just two years.
Given the high demand, construction workers are experiencing a pay bump for data center projects.
Construction projects generally operate on tight margins, with clients being very cost-conscious, Fraser Patterson, CEO of Skillit, an AI-powered hiring platform for construction workers, told Fortune.
But some of the top 50 contractors by size in the country have seen their revenue double in a 12-month period based on data center construction, which is allowing them to pay their workers more, according to Patterson.
“Because of the huge demand and the nature of this construction work, which is fueling the arms race of AI… the budgets are not as tight,” he said. “I would say they’re a little more frothy.”
On Skillit, the average salary for construction projects that aren’t building data centers is $62,000, or $29.80 an hour, Patterson said. The workers who use the platform span 40 different trades, from heavy equipment operators to electricians, with an average of eight years of experience.
But when it comes to data centers, the same workers make an average salary of $81,800, or $39.33 per hour, Patterson said, an increase of just under 32% on average.
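The roughly 32% premium can be checked against the salary figures themselves; the sketch below uses the article’s two averages, while the 2,080-hour work year used to convert salaries to hourly rates is an assumption, not something the article states:

```python
# Average salaries reported via Skillit, per Patterson
BASE_SALARY = 62_000   # non-data-center construction work
DC_SALARY = 81_800     # data center construction work

# Assumption: standard full-time year of 40 hours/week * 52 weeks
HOURS_PER_YEAR = 2_080

# Relative pay bump for data center projects
premium = (DC_SALARY - BASE_SALARY) / BASE_SALARY

print(f"Implied hourly rates: ${BASE_SALARY / HOURS_PER_YEAR:.2f} -> ${DC_SALARY / HOURS_PER_YEAR:.2f}")
print(f"Pay premium: {premium:.1%}")
```

The computed premium comes out to 31.9%, consistent with the “just under 32%” figure, and the implied hourly rates line up with the per-hour numbers quoted above.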
Some construction workers are even hitting the six-figure mark after their salaries rose for data center projects, according to The Wall Street Journal. And the data center boom doesn’t show any signs it’s slowing down anytime soon.
Tech companies like Google, Amazon, and Microsoft operate 522 data centers and are developing 411 more, according to The Wall Street Journal, citing data from Synergy Research Group.
Patterson said construction workers are being paid more to work on building data centers in part due to condensed project timelines, which require complex coordination of machinery and skilled labor.
Projects that would usually take a couple of years to finish are being completed—in some instances—as quickly as six months, he said.
It is unclear how long the data center boom might last, but Patterson said it has in part convinced a growing number of Gen Z workers and recent college grads to choose construction trades as their career path.
“AI is creating a lot of job anxiety around knowledge workers,” Patterson said. “Construction work is, by definition, very hard to automate.”
“I think you’re starting to see a change in the labor market,” he added.