Business
The CEO of the world’s largest data center company predicts what will drive the business forward
Published 7 days ago
By Jace Porter
Adaire Fox-Martin understands the needs of Big Tech. Prior to becoming CEO of Equinix (No. 446 on the Fortune 500) last year, she held senior roles at Google, SAP and Oracle. Now, the Irish-born former teacher is driving the expansion of the world’s largest global data center network, with more than 273 data centers in 36 countries. Fox-Martin recently spoke with Fortune about what she learned in her first year in the job and where she wants to go from here.
This interview has been edited and condensed for clarity.
We last met when you were starting out in the role.
It’s been an incredible year of learning and realizing that this job doesn’t come with an instruction manual. You bring the experiences that you’ve had in the past to the decisions that you make for the company for the future. We’ve laid out the strategy and distilled it into 10 simple words. The first of those is “build bolder,” which is how we’re designing and constructing the infrastructure that underpins the digital economy.
The second part of our ten-word strategy is “solve smarter.” This is about how we abstract the complexity of networking and architecture, which is our secret sauce, and render that for our customers, making Equinix the Easy button. The third piece is to “serve better.” Most participants in the data center industry have five or six customers; we have more than 10,000 enterprise customers. So those are the three pillars.
What are the other four words?
Underpinning that, we have “run simpler,” which sounds easy to say and is very hard to do. You’re taking complexity out of your business, looking at systems and processes. And the last piece is our people piece, which is to “grow together,” growing our business with our customers, linking our employee success to our customer success.
Is that a big change?
Equinix has been a company in this segment for 27 years, so we’re one of the long-term players in this industry. And in the next five years, we’re planning to bring on as much capacity as we did in the last 27 years. That’s a big capital investment for us.
Where do you sit in the data-center ecosystem?
I think there’s a general trend to think of data centers as a homogeneous mass of a singular thing. But there are four distinct categories of data centers, and each one has its own nuance and characteristics. We exist in one of those categories. There’s the hyperscale category, the ones built by cloud-service providers, where you see massive investment. The second category is wholesale, where you’re usually building a facility to lease back to one tenant, maybe two, usually supporting (AI) training. The third is enterprise, where big companies like banks want to have their own data center infrastructure. And the fourth category is colocation, which is where Equinix sits.
And what are the advantages of that?
Think of us a little like an airport authority. It manages the runways and the facilities of the airport and gives you the ability to rent ticketing and other kinds of facilities in there. Then it manages the process of passenger engagement, so an airline comes in, like KLM, drops a passenger, and then magic happens in the background to move that passenger and their luggage to United to go on to California. We’re a little bit like the airport authority of the internet: a data package comes into Equinix and then moves on to its next destination. The difference between us and an airport authority is that the airlines will compete, whereas a lot of our customers colocate so they can collaborate.
What do you do in terms of AI workloads?
We do both training and inference. A pharmaceutical company would do their training privately at Equinix because in the pharma world much of their research and drug discovery processes have to go through private models for regulatory reasons or intellectual property protection. Training is like teaching the model, and inference is really putting what the model has learned to work.
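To make that distinction concrete, here is a minimal sketch using scikit-learn as a generic stand-in (the library and toy data are illustrative assumptions, not what Equinix customers actually run): fit() is the training step and predict() is the inference step.

```python
# Illustrative only: training vs. inference with a generic scikit-learn model.
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([[0.0], [1.0], [2.0], [3.0]])  # toy features
y_train = np.array([0, 0, 1, 1])                  # toy labels

model = LogisticRegression()
model.fit(X_train, y_train)    # training: the model learns from labeled examples

print(model.predict([[2.5]]))  # inference: applying what was learned to new input -> [1]
```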
What about the energy needs?
The different types of data centers have different characteristics when it comes to energy, who they’re serving, or how they’re supporting local economies and communities.
We’re smack bang in the middle of what I would describe as an energy super cycle. Data centers are one component of it, but so is the electrification of everything. You have the speed of AI meeting the pace of utilities, and it’s a headfirst collision. We don’t think it’s an insurmountable challenge, but it’s going to require collaboration, innovation, and time.
How do you see it playing out?
Between now and 2028, it’s fair to say there is a power crunch. Anything that we’re delivering until 2028, we understand where our power will come from. From 2028 to 2032, you’ll see innovation kick into the power landscape, in the form of data centers and data center operators looking at how they can self-generate, how they can generate on site, how they can innovate with the grid and give power back to the grid, how they can be flexible on and off the grid. You’ll see different aspects of innovation, including nuclear, looking at small modular reactors and how they can be utilized.
From 2032 on, the utilities have introduced some changes. In the past, you would go to a utility and say, ‘I want this much here at this time’: just-in-time power provision. For someone like us, who doesn’t have the same power draw as a hyperscale data center, that was usually good enough. But utilities are looking at their power framework in the form of cluster studies, taking a group of requirements together in a cluster at the same time. You define the load that you’re going to ramp up to, and it will likely take the form of take-or-pay: if you said you’re going to use this much, you will pay for it, whether you use it or not.
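To illustrate how a take-or-pay commitment works in practice, here is a minimal sketch; the committed load, price, and usage figures are made-up assumptions, not terms from Equinix or any utility.

```python
# Hypothetical take-or-pay billing: the operator commits to a load and pays for
# that commitment whether or not it is fully used. All figures are illustrative
# assumptions, not real contract terms.

def take_or_pay_bill(committed_mwh: float, used_mwh: float, price_per_mwh: float) -> float:
    """Bill the greater of actual usage or the contracted commitment."""
    return max(committed_mwh, used_mwh) * price_per_mwh

committed = 10_000   # assumed monthly commitment, MWh
price = 70.0         # assumed price, $/MWh

print(take_or_pay_bill(committed, used_mwh=8_000, price_per_mwh=price))   # 700000.0: pays for unused headroom
print(take_or_pay_bill(committed, used_mwh=11_500, price_per_mwh=price))  # 805000.0: pays for what it actually drew
```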
It’s important that large energy users, like data centers, pay a premium for what they’re utilizing so that we don’t impact small ratepayers, small energy users, so there’s a lot happening around collaboration. We’ve got a 27-year history of that kind of collaboration with the utilities and so we’re very involved in a number of those processes.
Talk about the challenge of building these centers.
One is supply chain, the things that are needed to construct a data center, some of which have been subject to tariffs. In the short term, that’s not an issue but longer term, that may become something that we have to navigate our way through. And then there’s the workforce, the plumbers and mechanical engineers and welders who are maintaining our environments that keep the internet up. A lot of trade skills, construction skills and technical skills are necessary to create the data center.
Are the centers you’re building for these workloads any larger than the ones that you built in the past?
We do support our hyperscaler partners with the provision of data centers, through a vehicle called xScale, which is a joint venture. We have partners who fund our joint ventures, so we do participate in what I described as the wholesale economy by building what’s called a build-to-suit data center for a hyperscaler. So a Google would come to us and say, ‘Do you guys have power and land in location X? And would you build for us?’ So we do that through a joint venture off our balance sheet because the capital-intensive nature of that is high. We own 25% of our America JV and we own 20% of our EMEA and our APAC JV. We have 15 centers that are already operational around the globe.
What do you think is underappreciated about your business model?
I think the connectivity of Equinix is underappreciated. We have 270 data centers around the world, so we’re the world’s largest independent data center operator that’s still a public company. People see the physical manifestations of those centers, but the secret sauce is the connections that sit in every single one of those data centers. They take three forms. First is the ability to interconnect a company to another company. We have the trading hubs: 72% of the world’s trading platforms operate on Equinix. You have a trading hub and all their partners located close to them that need to be literally connected so there’s no latency between the transactions. We have 492,000 deep interconnections between the companies that operate in our centers, between value chains.
The second piece of connectivity is to do with the clouds. They are an exceptionally important part of the technology landscape. Many customers store their data in clouds, and most customers store their data in more than one cloud. They spread the love. We have a 35% market share in native cloud on-ramps from our data centers. So you can pop into the cloud, get your data and bring it back.
And then the third piece is physically where we’re located. We’re not in the middle of the country. We are in cities, where human beings are with their devices. So many people refer to us as the metro edge, the city edge, the edge where people actually are. So we can connect the cloud, via the metro edge where humans are, to the far edge where devices might be utilized.
Do you think people appreciate the role that data centers play in their lives?
In many countries, we are designated as critical infrastructure, in certain states, too, but not at the federal level. When I think about moving home, it’s water, gas, electricity, and then the internet becomes that fourth utility. And 95% of internet traffic runs through the Equinix environment. If you were on a Zoom call this morning, if you did a stream from any of the major providers, ordered an Uber, purchased a train ticket, you were on a platform accessing Equinix at some point.
“95% of internet traffic runs through the Equinix environment.” (Adaire Fox-Martin, CEO, Equinix)
What are you seeing in terms of customer trends?
Many of our customers are moving from the proof-of-concept phase of AI into the real-world-application phase of AI. There’s a lot to grapple with in that. It isn’t just about taking a business process and putting AI over the top of it. There are a whole series of considerations around governance and the management of data that haven’t really played into the business picture yet that are very real, especially for industries that are highly regulated.
That’s why some have not even adopted that much AI.
Right. Even if they are frontrunners, now it’s kind of like coming back and saying, ‘Oh, how do we make sure that we’re auditable, traceable, accountable, all of the things that are good governance for business? If we’re going to deploy a technology that can automate so many things and take my human out of the loop, how do I report, manage, and maintain the governance framework of those processes in my business?’
We’re seeing a lot of pushback in local communities where these mega hyperscale data centers are being built. How are you staking your claim to say we’re not that, but this is still critical infrastructure we need?
You look at it through the lens of what are the good things that a data center can do for a local community. We engage very strongly with local communities when we are beginning a construction. You do bring jobs to the area, particularly in the construction phase, less so in the operation phase because there isn’t a preponderance of humans across a data center. Second, you’re obviously going to pay tax in that location and that has a knock-on benefit. Thirdly, we employ and source locally. I’m very excited about our apprenticeship scheme, where young women and men who maybe didn’t have a formal education path can become data-center technicians or critical facility engineers. And when there’s a build of a data center, there’s often an upgrade of the infrastructure around it, whether that’s the power capabilities, the roads and so on.
Are people asking more questions about water, energy?
For sure. And we recognize that these are extremely important parts of the life system of our planet. We were the first data center operator to begin reporting on our water usage. When you bring in power, you want to maximize the use of that energy in the deployment of workloads for customers and not just in powering the data center itself. We measure our power and how effective we are in using it. The best way to save energy is to use less of it. That’s absolutely an industry standard now.
And water?
Water was never at the same level of investigation or scrutiny as power was. Now, there’s a measure of water-usage effectiveness and we were one of the first to report on that. It’s not as standardized as power and so we’re working in the industry to try and standardize that a little bit more.
In the longer term, data centers will more than likely be cooled by liquid cooling, as opposed to air or evaporative cooling. And liquid cooling, in terms of water use, is a closed-loop system. You’re reusing the same water over and over again to cool the chips. The technology itself will become a determinant of sustainability.
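For readers unfamiliar with the efficiency metrics referenced above, power usage effectiveness (PUE) and water usage effectiveness (WUE) are the standard industry ratios; here is a minimal sketch with assumed figures, not Equinix’s actual numbers.

```python
# Standard data center efficiency ratios. The sample inputs below are
# assumptions for illustration, not Equinix's reported figures.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: 1.0 would mean every watt reaches IT gear."""
    return total_facility_kwh / it_equipment_kwh

def wue(site_water_liters: float, it_equipment_kwh: float) -> float:
    """Water usage effectiveness, in liters of water per kWh of IT energy."""
    return site_water_liters / it_equipment_kwh

# Example: 1.3 GWh drawn by the facility, 1.0 GWh of it reaching IT equipment,
# and 1.8 million liters of water consumed over the same period.
print(round(pue(1_300_000, 1_000_000), 2))  # 1.3
print(round(wue(1_800_000, 1_000_000), 2))  # 1.8 (L/kWh)
```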
All the big tech companies are working to make these models smaller and more efficient. Eventually, they’re going to want to have many little data centers that are colocated. Do you think you’ll benefit from that?
We believe the inference target addressable market, combined with the network, is about $250 billion outside of what the clouds are doing. By 2029, the inference opportunity will be twice the size of training. And that’s why we’re setting ourselves up for this opportunity.
You can think about training as a centralized AI motion, whereas inference is very much a distributed motion. It will initiate on a device, or maybe through voice, or glass, or whatever the device is. And it will probably have an agent conducting the orchestra, instructing other agents to get data from more than one location. That’s why we’ve been very selective about where we’ve built.
You came to this job from Google almost a year and a half ago. Where are you now versus what you were thinking when you came in?
I would say on a journey, not at the destination but heading in the right direction. I’m confident that we have such a unique combination of characteristics—the metro locations, the connectivity, the secret sauce—that we’re ready for prime time. I’m working through the dynamics of some of the negative feelings around data centers. The challenge around energy has been very real in Europe, in particular. There are countries that have just issued a moratorium on data-center builds, like Ireland, my home country, until they can kind of take a breath and understand what they can do. These problems are absolutely addressable. They’re absolutely surmountable. It’s a time-based issue that’s going to require collaboration and innovation to solve.
What about the regulatory environment? That’s been in flux.
There is a lot of noise on a variety of topics. I’m just working to control the controllable, and carry on along the path that we believe is the right one for us. For example, Equinix has some goals around our sustainability narrative. We set a goal for ourselves to be carbon neutral by 2030. We’re still on that track. And we’ve set a science-based goal for 2040 to be net zero, and we will continue to innovate and work to do that.
It’s not just that we believe there is an opportunity for technology and innovation to coexist with good environmental stewardship. Our customers are continuing to ask us for reports on how their usage at Equinix is impacting the things that we measure.
There’s a lot of ‘what’ about AI: What will it do? But there’s also a ‘where’ about AI, and we’re the ‘where’ of AI. There are physical cables, even under the ocean, and cable trays and billions of wires. If you’re in California, you get to see the history of data centers. The internet will literally be above your head. We have three decades of data center history, from our very first one to our latest one. I never thought I would come into a company where we have 56 active construction projects all around the world.
Business
Senate Dems’ plan to fix Obamacare premiums adds nearly $300 billion to deficit, CRFB says
Published 16 minutes ago on December 5, 2025
By Jace Porter
The Committee for a Responsible Federal Budget (CRFB) is a nonpartisan watchdog that regularly estimates how much the U.S. Congress is adding to the $38 trillion national debt.
With enhanced Affordable Care Act (ACA) subsidies due to expire within days, some Senate Democrats are scrambling to protect millions of Americans from getting the unpleasant holiday gift of spiking health insurance premiums. The CRFB says there’s just one problem with the plan: It’s not funded.
“With the national debt as large as the economy and interest payments costing $1 trillion annually, it is absurd to suggest adding hundreds of billions more to the debt,” CRFB President Maya MacGuineas wrote in a statement on Friday afternoon.
The proposal, backed by members of the Senate Democratic caucus, would fully extend the enhanced ACA subsidies for three years, from 2026 through 2028, with no additional income limits on who can qualify. Those subsidies, originally boosted during the pandemic and later renewed, were designed to lower premiums and prevent coverage losses for middle‑ and lower‑income households purchasing insurance on the ACA exchanges.
CRFB estimated that even this three‑year extension alone would add roughly $300 billion to federal deficits over the next decade, largely because the federal government would continue to shoulder a larger share of premium costs while enrollment and subsidy amounts remain elevated. If Congress ultimately moves to make the enhanced subsidies permanent—as many advocates have urged—the total cost could swell to nearly $550 billion in additional borrowing over the next decade.
Reversing recent guardrails
MacGuineas called the Senate bill “far worse than even a debt-financed extension” as it would roll back several “program integrity” measures that were enacted as part of a 2025 reconciliation law and were intended to tighten oversight of ACA subsidies. On top of that, it would be funded by borrowing even more. “This is a bad idea made worse,” MacGuineas added.
The watchdog group’s central critique is that the new Senate plan does not attempt to offset its costs through spending cuts or new revenue and, in their view, goes beyond a simple extension by expanding the underlying subsidy structure.
The legislation would permanently repeal restrictions that eliminated subsidies for certain groups enrolling during special enrollment periods and would scrap rules requiring full repayment of excess advance subsidies and stricter verification of eligibility and tax reconciliation. The bill would also nullify portions of a 2025 federal regulation that loosened limits on the actuarial value of exchange plans and altered how subsidies are calculated, effectively reshaping how generous plans can be and how federal support is determined. CRFB warned these reversals would increase costs further while weakening safeguards designed to reduce misuse and error in the subsidy system.
MacGuineas said that any subsidy extension should be paired with broader reforms to curb health spending and reduce overall borrowing. In her view, lawmakers are missing a chance to redesign ACA support in a way that lowers premiums while also improving the long‑term budget outlook.
The debate over ACA subsidies recently contributed to a government funding standoff, and CRFB argued that the new Senate bill reflects a political compromise that prioritizes short‑term relief over long‑term fiscal responsibility.
“After a pointless government shutdown over this issue, it is beyond disappointing that this is the preferred solution to such an important issue,” MacGuineas wrote.
The off-year elections cast the government shutdown and cost-of-living arguments in a different light. Democrats made stunning gains and almost flipped a deep-red district in Tennessee as politicians from the far left and center coalesced around “affordability.”
Senate Minority Leader Chuck Schumer is reportedly smelling blood in the water and doubling down on the theme heading into the pivotal midterm elections of 2026. President Donald Trump is scheduled to visit Pennsylvania soon to discuss pocketbook anxieties. But he is repeating predecessor Joe Biden’s habit of dismissing inflation, despite widespread evidence to the contrary.
“We fixed inflation, and we fixed almost everything,” Trump said in a Tuesday cabinet meeting, in which he also dismissed affordability as a “hoax” pushed by Democrats.
Lawmakers on both sides of the aisle now face a politically fraught choice: allow premiums to jump sharply—including in swing states like Pennsylvania where ACA enrollees face double‑digit increases—or pass an expensive subsidy extension that would, as CRFB calculates, explode the deficit without addressing underlying health care costs.
Business
Netflix–Warner Bros. deal sets up $72 billion antitrust test
Published 47 minutes ago on December 5, 2025
By Jace Porter
Netflix Inc. has won the heated takeover battle for Warner Bros. Discovery Inc. Now it must convince global antitrust regulators that the deal won’t give it an illegal advantage in the streaming market.
The $72 billion tie-up joins the world’s dominant paid streaming service with one of Hollywood’s most iconic movie studios. It would reshape the market for online video content by combining the No. 1 streaming player with the No. 4 service, HBO Max, and its blockbuster hits such as Game of Thrones, Friends, and the DC Universe franchise of comic-book characters.
That could raise red flags for global antitrust regulators over concerns that Netflix would have too much control over the streaming market. The company faces a lengthy Justice Department review and a possible US lawsuit seeking to block the deal if it doesn’t adopt some remedies to get it cleared, analysts said.
“Netflix will have an uphill climb unless it agrees to divest HBO Max as well as additional behavioral commitments — particularly on licensing content,” said Bloomberg Intelligence analyst Jennifer Rie. “The streaming overlap is significant,” she added, saying the argument that “the market should be viewed more broadly is a tough one to win.”
By choosing Netflix, Warner Bros. has jilted another bidder, Paramount Skydance Corp., a move that risks touching off a political battle in Washington. Paramount is backed by the world’s second-richest man, Larry Ellison, and his son, David Ellison, and the company has touted their longstanding close ties to President Donald Trump. Their acquisition of Paramount, which closed in August, has won public praise from Trump.
Comcast Corp. also made a bid for Warner Bros., looking to merge it with its NBCUniversal division.
The Justice Department’s antitrust division, which would review the transaction in the US, could argue that the deal is illegal on its face because the combined market share would put Netflix well over a 30% threshold.
The White House, the Justice Department and Comcast didn’t immediately respond to requests for comment.
US lawmakers from both parties, including Republican Representative Darrell Issa and Democratic Senator Elizabeth Warren, have already faulted the transaction — which would create a global streaming giant with 450 million users — as harmful to consumers.
“This deal looks like an anti-monopoly nightmare,” Warren said after the Netflix announcement. Utah Senator Mike Lee, a Republican, said in a social media post earlier this week that a Warner Bros.-Netflix tie-up would raise more serious competition questions “than any transaction I’ve seen in about a decade.”
European Union regulators are also likely to subject the Netflix proposal to an intensive review amid pressure from legislators. In the UK, the deal had drawn scrutiny even before the announcement, with House of Lords member Baroness Luciana Berger pressing the government on how the transaction would impact competition and consumer prices.
The combined company could raise prices and broadly impact “culture, film, cinemas and theater releases,” said Andreas Schwab, a leading member of the European Parliament on competition issues, after the announcement.
Paramount has sought to frame the Netflix deal as a non-starter. “The simple truth is that a deal with Netflix as the buyer likely will never close, due to antitrust and regulatory challenges in the United States and in most jurisdictions abroad,” Paramount’s antitrust lawyers wrote to their counterparts at Warner Bros. on Dec. 1.
Appealing directly to Trump could help Netflix avoid intense antitrust scrutiny, New Street Research’s Blair Levin wrote in a note on Friday. Levin said it’s possible that Trump could come to see the benefit of switching from a pro-Paramount position to a pro-Netflix position. “And if he does so, we believe the DOJ will follow suit,” Levin wrote.
Netflix co-Chief Executive Officer Ted Sarandos had dinner with Trump at the president’s Mar-a-Lago resort in Florida last December, a move other CEOs made after the election in order to win over the administration. In a call with investors Friday morning, Sarandos said that he’s “highly confident in the regulatory process,” contending the deal favors consumers, workers and innovation.
“Our plans here are to work really closely with all the appropriate governments and regulators, but really confident that we’re going to get all the necessary approvals that we need,” he said.
Netflix will likely argue to regulators that other video services such as Google’s YouTube and ByteDance Ltd.’s TikTok should be included in any analysis of the market, which would dramatically shrink the company’s perceived dominance.
The US Federal Communications Commission, which regulates the transfer of broadcast-TV licenses, isn’t expected to play a role in the deal, as neither company holds such licenses. Warner Bros. plans to spin off its cable TV division, which includes channels such as CNN, TBS and TNT, before the sale.
Even if antitrust reviews just focus on streaming, Netflix believes it will ultimately prevail, pointing to Amazon.com Inc.’s Prime and Walt Disney Co. as other major competitors, according to people familiar with the company’s thinking.
Netflix is expected to argue that more than 75% of HBO Max subscribers already subscribe to Netflix, making them complementary offerings rather than competitors, said the people, who asked not to be named discussing confidential deliberations. The company is expected to make the case that reducing its content costs through owning Warner Bros., eliminating redundant back-end technology and bundling Netflix with Max will yield lower prices.
Business
The rise of AI reasoning models comes with a big energy tradeoff
Published 1 hour ago on December 5, 2025
By Jace Porter
Nearly all leading artificial intelligence developers are focused on building AI models that mimic the way humans reason, but new research shows these cutting-edge systems can be far more energy intensive, adding to concerns about AI’s strain on power grids.
AI reasoning models used 30 times more power on average to respond to 1,000 written prompts than alternatives without this reasoning capability or which had it disabled, according to a study released Thursday. The work was carried out by the AI Energy Score project, led by Hugging Face research scientist Sasha Luccioni and Salesforce Inc. head of AI sustainability Boris Gamazaychikov.
The researchers evaluated 40 open, freely available AI models, including software from OpenAI, Alphabet Inc.’s Google and Microsoft Corp. Some models were found to have a much wider disparity in energy consumption, including one from Chinese upstart DeepSeek. A slimmed-down version of DeepSeek’s R1 model used just 50 watt hours to respond to the prompts when reasoning was turned off, or about as much energy as is needed to run a 50-watt lightbulb for an hour. With the reasoning feature enabled, the same model required 7,626 watt hours to complete the tasks.
The soaring energy needs of AI have increasingly come under scrutiny. As tech companies race to build more and bigger data centers to support AI, industry watchers have raised concerns about straining power grids and raising energy costs for consumers. A Bloomberg investigation in September found that wholesale electricity prices rose as much as 267% over the past five years in areas near data centers. There are also environmental drawbacks, as Microsoft, Google and Amazon.com Inc. have previously acknowledged the data center buildout could complicate their long-term climate objectives.
More than a year ago, OpenAI released its first reasoning model, called o1. Where its prior software replied almost instantly to queries, o1 spent more time computing an answer before responding. Many other AI companies have since released similar systems, with the goal of solving more complex multistep problems for fields like science, math and coding.
Though reasoning systems have quickly become the industry norm for carrying out more complicated tasks, there has been little research into their energy demands. Much of the increase in power consumption is due to reasoning models generating much more text when responding, the researchers said.
The new report aims to better understand how AI energy needs are evolving, Luccioni said. She also hopes it helps people better understand that there are different types of AI models suited to different actions. Not every query requires tapping the most computationally intensive AI reasoning systems.
“We should be smarter about the way that we use AI,” Luccioni said. “Choosing the right model for the right task is important.”
To test the difference in power use, the researchers ran all the models on the same computer hardware. They used the same prompts for each, ranging from simple questions — such as asking which team won the Super Bowl in a particular year — to more complex math problems. They also used a software tool called CodeCarbon to track how much energy was being consumed in real time.
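CodeCarbon is an open-source Python package; a minimal sketch of what real-time tracking around a batch of prompts might look like follows (the run_model function is a hypothetical placeholder, and the study’s exact harness isn’t described here).

```python
# Minimal sketch of energy/emissions tracking with CodeCarbon (pip install codecarbon).
# run_model() is a hypothetical stand-in for whatever inference call is being benchmarked.
from codecarbon import EmissionsTracker

def run_model(prompt: str) -> str:
    return f"answer to: {prompt}"  # placeholder for an actual model inference call

prompts = ["Which team won the Super Bowl in 2014?"] * 1000  # sample workload

tracker = EmissionsTracker(project_name="reasoning-benchmark")
tracker.start()
for p in prompts:
    run_model(p)
emissions_kg = tracker.stop()  # estimated kg CO2eq; details are also written to emissions.csv

print(f"Estimated emissions for the batch: {emissions_kg:.6f} kg CO2eq")
```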
The results varied considerably. The researchers found one of Microsoft’s Phi 4 reasoning models used 9,462 watt hours with reasoning turned on, compared with about 18 watt hours with it off. OpenAI’s largest gpt-oss model, meanwhile, had a less stark difference. It used 8,504 watt hours with reasoning on the most computationally intensive “high” setting and 5,313 watt hours with the setting turned down to “low.”
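Putting the figures cited above side by side, a quick back-of-the-envelope calculation shows the reasoning-on versus reasoning-off multipliers (the Phi 4 “off” figure is reported only as “about 18 watt hours”).

```python
# Energy figures (watt hours) as reported in the study, reasoning on vs. off/low.
reported = {
    "DeepSeek R1 (slimmed-down)":    (7_626, 50),
    "Microsoft Phi 4 reasoning":     (9_462, 18),
    "OpenAI gpt-oss (high vs. low)": (8_504, 5_313),
}

for model, (reasoning_on, reasoning_off) in reported.items():
    print(f"{model}: {reasoning_on / reasoning_off:.0f}x more energy")
# DeepSeek R1 (slimmed-down): 153x more energy
# Microsoft Phi 4 reasoning: 526x more energy
# OpenAI gpt-oss (high vs. low): 2x more energy
```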
OpenAI, Microsoft, Google and DeepSeek did not immediately respond to a request for comment.
Google released internal research in August that estimated the median text prompt for its Gemini AI service used 0.24 watt-hours of energy, roughly equal to watching TV for less than nine seconds. Google said that figure was “substantially lower than many public estimates.”
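As a rough sanity check on that comparison, assuming a television drawing about 100 watts (an assumption; Google doesn’t state the wattage):

```python
# Nine seconds of a ~100 W television is roughly one median Gemini prompt.
tv_watts = 100   # assumed TV power draw
seconds = 9
watt_hours = tv_watts * seconds / 3600
print(round(watt_hours, 2))  # 0.25 Wh, close to the reported 0.24 Wh per prompt
```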
Much of the discussion about AI power consumption has focused on large-scale facilities set up to train artificial intelligence systems. Increasingly, however, tech firms are shifting more resources to inference, or the process of running AI systems after they’ve been trained. The push toward reasoning models is a big piece of that as these systems are more reliant on inference.
Recently, some tech leaders have acknowledged that AI’s power draw needs to be reckoned with. Microsoft CEO Satya Nadella said the industry must earn the “social permission to consume energy” for AI data centers in a November interview. To do that, he argued tech must use AI to do good and foster broad economic growth.