The CEO of the world’s largest data center company predicts what will drive the business forward

Adaire Fox-Martin understands the needs of Big Tech. Prior to becoming CEO of Equinix (No. 446 on the Fortune 500) last year, she held senior roles at Google, SAP and Oracle. Now, the Irish-born former teacher is driving the expansion of the world’s largest global data center network, with more than 273 data centers in 36 countries. Fox-Martin recently spoke with Fortune about what she learned in her first year in the job and where she wants to go from here. 

This interview has been edited and condensed for clarity. 

We last met when you were starting out in the role.

It’s been an incredible year of learning and realizing that this job doesn’t come with an instruction manual. You bring the experiences that you’ve had in the past to the decisions that you make for the company’s future. We’ve laid out the strategy and distilled it into 10 simple words. The first of those is “build bolder,” which is how we’re designing and constructing the infrastructure that underpins the digital economy.

The second part of our ten-word strategy is “solve smarter.” This is about how we abstract the complexity of networking and architecture, which is our secret sauce, and render that for our customers, making Equinix the Easy button. The third piece is to “serve better.” Most participants in the data center industry have five or six customers; we have more than 10,000 enterprise customers. So those are the three pillars. 

What are the other four words?

Underpinning that, we have “run simpler,” which sounds easy to say and is very hard to do. You’re taking complexity out of your business, looking at systems and processes. And the last piece is our people piece, which is to “grow together,” growing our business with our customers, linking our employee success to our customer success. 

Is that a big change?

Equinix has been a company in this segment for 27 years, so we’re one of the long-term players in this industry. And in the next five years, we’re planning to bring on as much capacity as we did in the last 27 years. That’s a big capital investment for us. 

Where do you sit in the data-center ecosystem?

I think there’s a general tendency to think of data centers as a homogeneous mass, a singular thing. But there are four distinct categories of data centers, and each one has its own nuances and characteristics. We exist in one of those categories. There’s the hyperscale category, the ones built by cloud-service providers, where you see massive investment. The second category is wholesale, where you’re usually building a facility to lease back to one tenant, maybe two, usually supporting AI training. The third is enterprise, where big companies like banks want to have their own data center infrastructure. And the fourth category is colocation, which is where Equinix sits.

And what are the advantages of that? 

Think of us a little like an airport authority. It manages the runways and the facilities of the airport and gives you the ability to rent ticketing and other kinds of facilities there. Then it manages the process of passenger engagement: an airline comes in, like KLM, drops off a passenger, and then magic happens in the background to move that passenger and their luggage to United to go on to California. We’re a little bit like the airport authority of the internet: a data packet comes into Equinix and then moves on to its next destination. The difference between us and an airport authority is that the airlines compete, whereas a lot of our customers colocate so they can collaborate.

What do you do in terms of AI workloads? 

We do both training and inference. A pharmaceutical company would do their training privately at Equinix because in the pharma world much of their research and drug discovery processes have to go through private models for regulatory reasons or intellectual property protection. Training is like teaching the model, and inference is really putting what the model has learned to work.

What about the energy needs?

The different types of data centers have different characteristics when it comes to energy, who they’re serving, and how they’re supporting local economies and communities.

We’re smack bang in the middle of what I would describe as an energy super cycle. Data centers are one component of it, but so is the electrification of everything. You have the speed of an AI meeting the pace of utilities, and it’s a headfirst collision. We don’t think it’s an insurmountable challenge but it’s going to require collaboration, innovation and time. 

How do you see it playing out?

Between now and 2028, it’s fair to say there is a power crunch. For anything that we’re delivering until 2028, we understand where our power will come from. From 2028 to 2032, you’ll see innovation kick in across the power landscape, in the form of data centers and data center operators looking at how they can self-generate, how they can generate on site, how they can innovate with the grid and give power back to it, and how they can be flexible on and off the grid. You’ll see different aspects of innovation, including nuclear, looking at small modular reactors and how they can be utilized.

From 2032 on, the utilities will have introduced some changes. In the past, you would go to a utility and say, ‘I want this much power, here, at this time’: just-in-time power provision. For someone like us, which doesn’t have the same power draw as a hyperscale data center, that was usually good enough. But utilities are now looking at their power framework in the form of cluster studies, taking a group of requirements together in a cluster at the same time. You define the load that you’re going to ramp up to, and it will likely take the form of take-or-pay: if you said you’re going to use this much, you will pay for it, whether you use it or not.

It’s important that large energy users, like data centers, pay a premium for what they’re utilizing so that we don’t impact small ratepayers, small energy users, so there’s a lot happening around collaboration. We’ve got a 27-year history of that kind of collaboration with the utilities and so we’re very involved in a number of those processes. 

Talk about the challenge of building these centers.

One is supply chain: the things that are needed to construct a data center, some of which have been subject to tariffs. In the short term, that’s not an issue, but longer term it may become something that we have to navigate our way through. And then there’s the workforce: the plumbers, mechanical engineers, and welders who maintain the environments that keep the internet up. A lot of trade skills, construction skills, and technical skills are necessary to create a data center.

Are the centers you’re building for these workloads any larger than the ones that you built in the past? 

We do support our hyperscaler partners with the provision of data centers through a vehicle called xScale, which is a joint venture. We have partners who fund our joint ventures, so we do participate in what I described as the wholesale economy by building what’s called a build-to-suit data center for a hyperscaler. So a Google would come to us and say, ‘Do you have power and land in location X? And would you build for us?’ We do that through a joint venture off our balance sheet because the capital intensity of that is high. We own 25% of our Americas JV and 20% of our EMEA and APAC JVs. We have 15 centers that are already operational around the globe.

What do you think is underappreciated about your business model?

I think the connectivity of Equinix is underappreciated. We have 270 data centers around the world, so we’re the world’s largest independent data center operator that’s still a public company. People see the physical manifestations of those centers, but the secret sauce is the connections that sit in every single one of those data centers. They take three forms. First is the ability to interconnect one company to another. We have the trading hubs: 72% of the world’s trading platforms operate on Equinix. You have a trading hub and all their partners located close to them that need to be literally connected so there’s no latency between the transactions. We have 492,000 interconnections between the companies that operate in our centers, across value chains.

The second piece of connectivity is to do with the clouds. They are an exceptionally important part of the technology landscape. Many customers store their data in clouds, and most customers store their data in more than one cloud; they spread the love. We have a 35% market share in native cloud on-ramps from our data centers. So you can pop into the cloud, get your data, and bring it back.

And then the third piece is physically where we’re located. We’re not in the middle of the country. We are in cities, where human beings are with their devices. So many people refer to us as the metro edge, the city edge, the edge where people actually are. So we can connect the cloud, via the metro edge where humans are, to the far edge where devices might be utilized. 

Do you think people appreciate the role that data centers play in their lives?

In many countries, we are designated as critical infrastructure, and in certain states too, but not at the federal level. When I think about moving home, there’s water, gas, electricity, and then the internet becomes that fourth utility. And 95% of internet traffic runs through the Equinix environment. If you were on a Zoom call this morning, if you streamed from any of the major providers, ordered an Uber, or purchased a train ticket, you were on a platform accessing Equinix at some point.


What are you seeing in terms of customer trends? 

Many of our customers are moving from the proof-of-concept phase of AI into the real-world-application phase of AI. There’s a lot to grapple with in that. It isn’t just about taking a business process and putting AI over the top of it. There are a whole series of considerations around governance and the management of data that haven’t really played into the business picture yet that are very real, especially for industries that are highly regulated. 

That’s why some have not even adopted that much AI. 

Right. Even if they are frontrunners, now it’s kind of like coming back and saying, ‘Oh, how do we make sure that we’re auditable, traceable, accountable, all of the things that are good governance for business? If we’re going to deploy a technology that can automate so many things and take my human out of the loop, how do I report, manage, and maintain the governance framework of those processes in my business?’

We’re seeing a lot of pushback in local communities where these mega hyperscale data centers are being built.  How are you staking your claim to say we’re not that, but this is still critical infrastructure we need?

You look at it through the lens of what good things a data center can do for a local community. We engage very strongly with local communities when we are beginning a construction. You do bring jobs to the area, particularly in the construction phase, less so in the operational phase, because there isn’t a preponderance of humans across a data center. Second, you’re obviously going to pay tax in that location, and that has a knock-on benefit. Thirdly, we employ and source locally. I’m very excited about our apprenticeship scheme, where young women and men who maybe didn’t have a formal education path can become data-center technicians or critical-facility engineers. And when there’s a build of a data center, there’s often an upgrade of the infrastructure around it, whether that’s power capabilities, roads and so on.

Are people asking more questions about water, energy? 

For sure. And we recognize that these are extremely important parts of the life system of our planet. We were the first data center operator to begin reporting on our water usage. When you bring in power, you want to maximize the use of that energy in the deployment of workloads for customers, not just in powering the data center itself. We measure our power and how effective we are in using it. The best way to save energy is to use less of it. That’s absolutely an industry standard now.

And water?

Water was never at the same level of investigation or scrutiny as power was. Now, there’s a measure of water-usage effectiveness and we were one of the first to report on that. It’s not as standardized as power and so we’re working in the industry to try and standardize that a little bit more. 

In the longer term, data centers will more than likely be cooled by liquid cooling, as opposed to air or evaporative cooling. And liquid cooling, in terms of water use, is a closed-loop system: you’re reusing the same water over and over again to cool the chips. The technology itself will become a determinant of sustainability.

All the big tech companies are working to make these models smaller and more efficient. Eventually, they’re going to want to have many little data centers that are colocated. Do you think you’ll benefit from that? 

We believe the inference target addressable market, combined with the network, is about $250 billion outside of what the clouds are doing. By 2029, the inference opportunity will be twice the size of training. And that’s why we’re setting ourselves up for this opportunity. 

You can think about training as a centralized AI operation, whereas inference is very much a distributed one. It will initiate on a device, or maybe through voice, or glass, or whatever the device is. And it will probably have an agent conducting the orchestra, instructing other agents to get data from more than one location. That’s why we’ve been very selective about where we built.

You came to this job from Google almost a year and a half ago. Where are you now versus what you were thinking when you came in? 

I would say on a journey, not at the destination but heading in the right direction. I’m confident that we have such a unique combination of characteristics—the metro locations, the connectivity, the secret sauce—that we’re ready for prime time. I’m working through the dynamics of some of the negative feelings around data centers. The challenge around energy has been very real in Europe in particular. There are countries that have issued a moratorium on data-center builds, like Ireland, my home country, until they can take a breath and understand what they can do. These problems are absolutely addressable. They’re absolutely surmountable. It’s a time-based issue that’s going to require collaboration and innovation to solve.

What about the regulatory environment? That’s been in flux.

There is a lot of noise on a variety of topics. I’m just working to control the controllable and carry on the path that we believe is the right one for us. For example, Equinix has some goals around our sustainability narrative. We set a goal for ourselves that by 2030 we would be carbon neutral. We’re still on that track. And we’ve set a science-based goal to be net zero by 2040, and we will continue to innovate and work to do that.

It’s not just that we believe there is an opportunity for technology and innovation to coexist with good environmental stewardship. Our customers are continuing to ask us for reports on how their usage at Equinix is impacting the things that we measure.

There’s a lot of ‘what’ about AI: What will it do? But there’s also a ‘where’ about AI, and we’re the where of AI. There are physical cables, even under the ocean, and cable trays and billions of wires. If you’re in California, you get to see the history of data centers; the internet will literally be above your head. We have three decades of data center history, from our very first one to our latest one. I never thought I would come into a company where we have 56 active construction projects all around the world.



The rise of AI reasoning models comes with a big energy tradeoff

Nearly all leading artificial intelligence developers are focused on building AI models that mimic the way humans reason, but new research shows these cutting-edge systems can be far more energy intensive, adding to concerns about AI’s strain on power grids.

AI reasoning models used 30 times more power on average to respond to 1,000 written prompts than alternatives without this reasoning capability or which had it disabled, according to a study released Thursday. The work was carried out by the AI Energy Score project, led by Hugging Face research scientist Sasha Luccioni and Salesforce Inc. head of AI sustainability Boris Gamazaychikov.

The researchers evaluated 40 open, freely available AI models, including software from OpenAI, Alphabet Inc.’s Google and Microsoft Corp. Some models were found to have a much wider disparity in energy consumption, including one from Chinese upstart DeepSeek. A slimmed-down version of DeepSeek’s R1 model used just 50 watt hours to respond to the prompts when reasoning was turned off, or about as much energy as is needed to run a 50-watt lightbulb for an hour. With the reasoning feature enabled, the same model required 7,626 watt hours to complete the tasks.
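The size of that gap can be sanity-checked with a few lines of arithmetic; this is a sketch using only the figures quoted above, with the lightbulb conversion resting on the definition that a 50-watt bulb running for one hour consumes 50 watt hours:

```python
# Energy figures quoted in the study (watt-hours across 1,000 prompts).
reasoning_off_wh = 50
reasoning_on_wh = 7626

# How many times more energy the reasoning mode consumed.
ratio = reasoning_on_wh / reasoning_off_wh
print(f"Reasoning mode used about {ratio:.0f}x more energy")  # ~153x

# The lightbulb comparison: 50 Wh is one bulb-hour for a 50 W bulb,
# so reasoning-on is roughly 153 bulb-hours of electricity.
bulb_hours = reasoning_on_wh / 50
print(f"Equivalent to a 50 W bulb running for {bulb_hours:.0f} hours")
```

So enabling reasoning on this model multiplied energy use by roughly 150, consistent with the study's broader 30x average across all models.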

The soaring energy needs of AI have increasingly come under scrutiny. As tech companies race to build more and bigger data centers to support AI, industry watchers have raised concerns about straining power grids and raising energy costs for consumers. A Bloomberg investigation in September found that wholesale electricity prices rose as much as 267% over the past five years in areas near data centers. There are also environmental drawbacks, as Microsoft, Google and Amazon.com Inc. have previously acknowledged the data center buildout could complicate their long-term climate objectives.

More than a year ago, OpenAI released its first reasoning model, called o1. Where its prior software replied almost instantly to queries, o1 spent more time computing an answer before responding. Many other AI companies have since released similar systems, with the goal of solving more complex multistep problems for fields like science, math and coding.

Though reasoning systems have quickly become the industry norm for carrying out more complicated tasks, there has been little research into their energy demands. Much of the increase in power consumption is due to reasoning models generating much more text when responding, the researchers said. 

The new report aims to better understand how AI energy needs are evolving, Luccioni said. She also hopes it helps people better understand that there are different types of AI models suited to different actions. Not every query requires tapping the most computationally intensive AI reasoning systems.

“We should be smarter about the way that we use AI,” Luccioni said. “Choosing the right model for the right task is important.”

To test the difference in power use, the researchers ran all the models on the same computer hardware. They used the same prompts for each, ranging from simple questions — such as asking which team won the Super Bowl in a particular year — to more complex math problems. They also used a software tool called CodeCarbon to track how much energy was being consumed in real time.

The results varied considerably. The researchers found one of Microsoft’s Phi 4 reasoning models used 9,462 watt hours with reasoning turned on, compared with about 18 watt hours with it off. OpenAI’s largest gpt-oss model, meanwhile, had a less stark difference. It used 8,504 watt hours with reasoning on the most computationally intensive “high” setting and 5,313 watt hours with the setting turned down to “low.” 
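The spread in those results is easier to see as ratios; this is a quick back-of-the-envelope pass over the reported watt-hour totals (all figures are for the study's 1,000-prompt test set):

```python
# Watt-hour totals reported in the study over 1,000 prompts.
phi4_on, phi4_off = 9462, 18           # Microsoft Phi 4 reasoning model
gpt_oss_high, gpt_oss_low = 8504, 5313  # OpenAI gpt-oss, high vs. low setting

# Phi 4 shows an extreme swing; gpt-oss a comparatively mild one.
print(f"Phi 4: {phi4_on / phi4_off:.0f}x more energy with reasoning on")  # ~526x
print(f"gpt-oss: {gpt_oss_high / gpt_oss_low:.1f}x from low to high")     # ~1.6x

# Per-prompt view: still under 10 Wh per written prompt for Phi 4.
print(f"Phi 4 per prompt (reasoning on): {phi4_on / 1000:.2f} Wh")
```

The same raw numbers thus translate into a roughly 500-fold increase for one model and under a doubling for another, which is why the study reports a wide disparity rather than a single multiplier.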

OpenAI, Microsoft, Google and DeepSeek did not immediately respond to a request for comment.

Google released internal research in August that estimated the median text prompt for its Gemini AI service used 0.24 watt-hours of energy, roughly equal to watching TV for less than nine seconds. Google said that figure was “substantially lower than many public estimates.” 
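Google's TV comparison checks out arithmetically if you assume a television drawing on the order of 100 watts; that wattage is our assumption, not a figure from Google's report:

```python
# Google's reported figure: 0.24 watt-hours per median Gemini text prompt.
prompt_wh = 0.24
energy_joules = prompt_wh * 3600  # 1 Wh = 3,600 joules

# Assumed TV power draw (~100 W); the report doesn't state the wattage
# behind its "less than nine seconds" comparison.
tv_watts = 100
seconds_of_tv = energy_joules / tv_watts
print(f"{seconds_of_tv:.2f} seconds of TV")  # 8.64 seconds, i.e. under nine
```

Under that assumption, 0.24 Wh works out to about 8.6 seconds of viewing, matching the "less than nine seconds" framing.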

Much of the discussion about AI power consumption has focused on large-scale facilities set up to train artificial intelligence systems. Increasingly, however, tech firms are shifting more resources to inference, or the process of running AI systems after they’ve been trained. The push toward reasoning models is a big piece of that as these systems are more reliant on inference.

Recently, some tech leaders have acknowledged that AI’s power draw needs to be reckoned with. Microsoft CEO Satya Nadella said the industry must earn the “social permission to consume energy” for AI data centers in a November interview. To do that, he argued tech must use AI to do good and foster broad economic growth.




SpaceX to offer insider shares at record-setting valuation

SpaceX is preparing to sell insider shares in a transaction that would value Elon Musk’s rocket and satellite maker above OpenAI’s record-setting $500 billion valuation, people familiar with the matter said.

One of the people briefed on the deal said that the share price under discussion is higher than $400 apiece, which would value SpaceX at between $750 billion and $800 billion, though the details could change. 

The company’s latest tender offer was discussed by its board of directors on Thursday at SpaceX’s Starbase hub in Texas. If confirmed, it would make SpaceX once again the world’s most valuable closely held company, vaulting past the previous record of $500 billion that ChatGPT owner OpenAI set in October.

Preliminary scenarios included per-share prices that would have pushed SpaceX’s value to roughly $560 billion or higher, the people said. The details of the deal could change before it closes, a third person said.

A representative for SpaceX didn’t immediately respond to a request for comment. 

The latest figure would be a substantial increase from the $212 a share set in July, when the company raised money and sold shares at a valuation of $400 billion.

The Wall Street Journal and Financial Times, citing unnamed people familiar with the matter, earlier reported that a deal would value SpaceX at $800 billion.

News of SpaceX’s valuation sent shares of EchoStar Corp., a satellite TV and wireless company, up as much as 18%. Last month, EchoStar had agreed to sell spectrum licenses to SpaceX for $2.6 billion, adding to an earlier agreement to sell about $17 billion in wireless spectrum to Musk’s company.


The world’s most prolific rocket launcher, SpaceX dominates the space industry with its Falcon 9 rocket that launches satellites and people to orbit.

SpaceX is also the industry leader in providing internet services from low-Earth orbit through Starlink, a system of more than 9,000 satellites that is far ahead of competitors including Amazon.com Inc.’s Amazon Leo.

SpaceX executives have repeatedly floated the idea of spinning off SpaceX’s Starlink business into a separate, publicly traded company — a concept President Gwynne Shotwell first suggested in 2020. 

However, Musk cast doubt on the prospect publicly over the years and Chief Financial Officer Bret Johnsen said in 2024 that a Starlink IPO would be something that would take place more likely “in the years to come.”

The Information, citing people familiar with the discussions, separately reported on Friday that SpaceX has told investors and financial institution representatives that it is aiming for an initial public offering for the entire company in the second half of next year.

A so-called tender or secondary offering, through which employees and some early shareholders can sell shares, provides investors in closely held companies such as SpaceX a way to generate liquidity.

SpaceX is working to develop its new Starship vehicle, advertised as the most powerful rocket ever developed, to loft huge numbers of Starlink satellites as well as carry cargo and people to the moon and, eventually, Mars.




U.S. consumers are so strained they put more than $1B on BNPL during Black Friday and Cyber Monday

Financially strained and cautious customers leaned heavily on buy now, pay later (BNPL) services over the holiday weekend.

Cyber Monday alone generated $1.03 billion (a 4.2% increase year over year) in online BNPL sales, with most transactions happening on mobile devices, per Adobe Analytics. Overall, consumers spent $14.25 billion online on Cyber Monday. To put that into perspective, BNPL made up more than 7.2% of total online sales that day.
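That share figure follows directly from the two totals quoted above; a minimal check:

```python
# Adobe Analytics figures for Cyber Monday (billions of dollars).
bnpl_sales_b = 1.03
total_online_b = 14.25

# BNPL's slice of all online spending that day.
share_pct = bnpl_sales_b / total_online_b * 100
print(f"BNPL share of Cyber Monday online sales: {share_pct:.1f}%")  # 7.2%
```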

As for Black Friday, eMarketer reported $747.5 million in online sales using BNPL services, with platforms like PayPal seeing a 23% uptick in BNPL transactions.

Likewise, digital financial services company Zip reported 1.6 million transactions across 280,000 of its locations over the Black Friday and Cyber Monday weekend. Millennials accounted for a sizable chunk (51%) of BNPL purchases, followed by Gen Z, Gen X, and baby boomers, per Zip.

The Adobe data showed that people using BNPL were most likely to spend on categories such as electronics, apparel, toys, and furniture, which is consistent with previous years. This trend also tracks with Zip’s findings that shoppers were primarily investing in tech, electronics, and fashion when using its services.

And while some may be surprised that shoppers are taking on more debt via BNPL (in this economy?!), analysts had already projected a strong shopping weekend. A Deloitte survey forecast that consumers would spend about $650 million over the Black Friday–Cyber Monday stretch—a 15% jump from 2023.

“US retailers leaned heavily on discounts this holiday season to drive online demand,” Vivek Pandya, lead analyst at Adobe Digital Insights, said in a statement. “Competitive and persistent deals throughout Cyber Week pushed consumers to shop earlier, creating an environment where Black Friday now challenges the dominance of Cyber Monday.”

This report was originally published by Retail Brew.


