Nvidia’s China-based rival posts 4,300% revenue jump as the chipmaker’s earnings show no H20 chip sales to the country

Cambricon, a China-based semiconductor firm, posted record profit in the first half of the year, along with revenue that surged roughly 4,300%.

The earnings, released late Tuesday, illustrate Nvidia’s growing local competition in China, where the government and market are looking for alternative chipmakers to gain traction in the region. Nvidia’s business in China has been tied up in U.S. export restrictions and geopolitical tensions, and the tech behemoth recorded no H20 chip sales to China in the second quarter, per its earnings release on Wednesday.

Cambricon’s first-half revenue surged to 2.88 billion Chinese yuan ($402.7 million), the company reported this week. The Chinese upstart, created by two “genius brothers,” is partially state-owned and headquartered in Beijing. The company’s stock is now China’s most expensive, overtaking liquor company Kweichow Moutai. Still, the whopping revenue growth is a far cry from Nvidia’s scale: the U.S. chipmaker reported $46.7 billion in revenue in the second quarter alone.

But experts tell Fortune that Cambricon’s growth reflects a larger push to create local Nvidia rivals in China, especially as the tech giant contends with tightened export restrictions under the Trump administration.

“Nvidia apparently has better overall offerings in terms of the hardware in China, but because of the export controls, right now they cannot sell, basically, to China,” Ray Wang, research director of semiconductors, supply chain, and emerging tech at The Futurum Group, told Fortune. “They leave a big market void for a Chinese competitor to fulfill.”

Wang said large Chinese tech companies like Huawei and SMIC are “catching up rapidly” to Nvidia in terms of both product quality and production capacity.

“That’s a serious concern for both Nvidia and the U.S. government’s agenda in terms of… dominating AI globally,” he said.

Export tensions with China

Earlier this year, the U.S. imposed stricter export controls on China, at one point banning sales of H20 chips, which are less powerful than Nvidia’s flagship AI chips, to the country. The ban was lifted in July, but while it was in place it gave Chinese companies time to invest in their own innovation.

“The problem with banning [H20 chips] is you’re effectively handing the AI market and training over to companies like Huawei or Cambricon or… other local players,” Stacy Rasgon, senior analyst of U.S. semiconductors and semiconductor capital equipment at Bernstein Research, told Fortune.

Rasgon pointed out that, in Cambricon’s case, the roughly 44-fold revenue increase to $402.7 million in the first half of this year means the company went from “tiny to small.” He said he’s less focused on the percentage growth than the reason behind it.

“There’s a big push in China for self-sufficiency,” Rasgon said.

Cambricon’s record profit was helped by a wave of demand for Chinese chips after Beijing encouraged using local technology, citing security concerns and uncertainty over the Trump administration’s export curbs. The most recent catalyst for Cambricon’s surge came from AI startup DeepSeek, which said last week its latest model is optimized for locally made chips.

Last week, the Chinese government told its tech companies to stop using Nvidia’s H20 chips after U.S. Commerce Secretary Howard Lutnick told CNBC that China would only receive the company’s “fourth best” chip, adding fuel to the fire.

“You want to sell the Chinese enough that they get addicted to the American technology stack,” Lutnick added.

Despite technological advances by Nvidia’s rivals amid geopolitical tensions, demand for its H20 chips remains, even in the face of the regulatory hurdles the company is navigating.

In its second-quarter earnings, Nvidia reported no H20 sales to China-based customers. On its earnings call Wednesday, Chief Financial Officer Colette Kress estimated $2 billion to $5 billion in H20 revenue this quarter should geopolitical issues subside. Nvidia did not include any H20 revenue in its third-quarter guidance of $54 billion, plus or minus 2%, which tops analysts’ expectations of $53.14 billion.

“It was inevitable there would be more entrants into this market,” Sebastien Naji, a research analyst at William Blair, told Fortune. “Near-term, I think the risks on the regulatory front are more impactful than increased competition.”

Nvidia previously warned that, if not for the U.S. chip export restrictions, its top-line guidance for the July quarter would have been $8 billion higher.

“I think the stock does not have that priced in, in terms of if that revenue were to go away,” Scott Bickley, an advisory fellow at Info-Tech Research Group, told Fortune before Nvidia’s earnings call on Wednesday.

Kress also said during the earnings call that over the past few weeks, a “select number” of China-based customers have received licenses for H20 chips, though no chips have been shipped under those licenses. She added that the U.S. government and Nvidia haven’t finalized a recent agreement that would require the chipmaker to share 15% of the revenue it makes from H20 chip sales to China.

How China’s chips stack up to Nvidia’s

There are already some Chinese products that outperform Nvidia’s H20, analyst Rasgon said. He said he expects greater competition in the local market to only catalyze chip innovation in China.

“Nvidia is never going to be allowed, probably, to sell better parts in China,” Rasgon said. “So for the Chinese, it takes time, but they’re going to work on improving their own stuff. And over time, maybe that gap closes.”

Nvidia CEO Jensen Huang has long complained about U.S. export controls, saying they will only galvanize local players to innovate in the chipmaker’s absence. 

“The China market, I’ve estimated to be about $50 billion of opportunity for us this year if we were able to address it with competitive products,” Huang said during the second-quarter earnings call.

Nvidia is not only looking to resume H20 chip sales in China; the company also wants to expand its product line there by introducing the high-performance Blackwell chip, should the U.S. agree.

“We continue to advocate for the U.S. government to approve Blackwell for China,” Kress said during the earnings call. The company aims to “win the support of every developer” in highly competitive markets, she added, so Nvidia technology can be the world’s gold standard.

“You kind of need a Blackwell chip [in China], even though it’s going to be performance-laden in nature, relative to everything else in the market,” Angelo Zino, SVP and technology equity analyst at CFRA, told Fortune.

While Zino said the H20 “probably isn’t going to give you enough to offset or get back the revenue” the company had a couple of quarters ago, introducing a Blackwell chip in China just might.




The rise of AI reasoning models comes with a big energy tradeoff

Nearly all leading artificial intelligence developers are focused on building AI models that mimic the way humans reason, but new research shows these cutting-edge systems can be far more energy intensive, adding to concerns about AI’s strain on power grids.

AI reasoning models used 30 times more energy on average to respond to 1,000 written prompts than alternatives that lack this reasoning capability or had it disabled, according to a study released Thursday. The work was carried out by the AI Energy Score project, led by Hugging Face research scientist Sasha Luccioni and Salesforce Inc. head of AI sustainability Boris Gamazaychikov.

The researchers evaluated 40 open, freely available AI models, including software from OpenAI, Alphabet Inc.’s Google and Microsoft Corp. Some models were found to have a much wider disparity in energy consumption, including one from Chinese upstart DeepSeek. A slimmed-down version of DeepSeek’s R1 model used just 50 watt hours to respond to the prompts when reasoning was turned off, roughly the energy a 50-watt lightbulb uses in an hour. With the reasoning feature enabled, the same model required 7,626 watt hours to complete the tasks.

The soaring energy needs of AI have increasingly come under scrutiny. As tech companies race to build more and bigger data centers to support AI, industry watchers have raised concerns about straining power grids and raising energy costs for consumers. A Bloomberg investigation in September found that wholesale electricity prices rose as much as 267% over the past five years in areas near data centers. There are also environmental drawbacks: Microsoft, Google and Amazon.com Inc. have previously acknowledged that the data center buildout could complicate their long-term climate objectives.

More than a year ago, OpenAI released its first reasoning model, called o1. Where its prior software replied almost instantly to queries, o1 spent more time computing an answer before responding. Many other AI companies have since released similar systems, with the goal of solving more complex multistep problems for fields like science, math and coding.

Though reasoning systems have quickly become the industry norm for carrying out more complicated tasks, there has been little research into their energy demands. Much of the increase in power consumption is due to reasoning models generating much more text when responding, the researchers said. 

The new report aims to better understand how AI energy needs are evolving, Luccioni said. She also hopes it helps people better understand that there are different types of AI models suited to different actions. Not every query requires tapping the most computationally intensive AI reasoning systems.

“We should be smarter about the way that we use AI,” Luccioni said. “Choosing the right model for the right task is important.”

To test the difference in power use, the researchers ran all the models on the same computer hardware. They used the same prompts for each, ranging from simple questions — such as asking which team won the Super Bowl in a particular year — to more complex math problems. They also used a software tool called CodeCarbon to track how much energy was being consumed in real time.
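As a rough illustration of the measurement pattern the researchers describe, the sketch below wraps model inference in a CodeCarbon tracker and reads off the estimates afterward. It is a minimal example under stated assumptions, not the study’s actual harness; the model, prompts, and generation settings here are purely illustrative.

```python
# Minimal sketch of energy tracking around inference, assuming the
# codecarbon and transformers packages are installed; not the study's setup.
from codecarbon import EmissionsTracker
from transformers import pipeline

# Placeholder model and prompts, chosen for illustration only.
generator = pipeline("text-generation", model="gpt2")
prompts = [
    "Which team won the Super Bowl in 2015?",
    "What is 17 multiplied by 243?",
]

tracker = EmissionsTracker(project_name="reasoning-energy-demo")
tracker.start()
for prompt in prompts:
    generator(prompt, max_new_tokens=128)
emissions_kg = tracker.stop()  # estimated emissions in kg of CO2-equivalent

# CodeCarbon also writes an emissions.csv log that includes the energy
# consumed (in kWh), the figure comparable to the watt-hour numbers above.
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```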

The results varied considerably. The researchers found one of Microsoft’s Phi 4 reasoning models used 9,462 watt hours with reasoning turned on, compared with about 18 watt hours with it off. OpenAI’s largest gpt-oss model, meanwhile, had a less stark difference. It used 8,504 watt hours with reasoning on the most computationally intensive “high” setting and 5,313 watt hours with the setting turned down to “low.” 

OpenAI, Microsoft, Google and DeepSeek did not immediately respond to a request for comment.

Google released internal research in August that estimated the median text prompt for its Gemini AI service used 0.24 watt-hours of energy, roughly equal to watching TV for less than nine seconds. Google said that figure was “substantially lower than many public estimates.” 

Much of the discussion about AI power consumption has focused on large-scale facilities set up to train artificial intelligence systems. Increasingly, however, tech firms are shifting more resources to inference, or the process of running AI systems after they’ve been trained. The push toward reasoning models is a big piece of that as these systems are more reliant on inference.

Recently, some tech leaders have acknowledged that AI’s power draw needs to be reckoned with. In a November interview, Microsoft CEO Satya Nadella said the industry must earn the “social permission to consume energy” for AI data centers. To do that, he argued, tech must use AI to do good and foster broad economic growth.




SpaceX to offer insider shares at record-setting valuation

SpaceX is preparing to sell insider shares in a transaction that would value Elon Musk’s rocket and satellite maker above OpenAI’s record-setting $500 billion, people familiar with the matter said.

One of the people briefed on the deal said that the share price under discussion is higher than $400 apiece, which would value SpaceX at between $750 billion and $800 billion, though the details could change. 

The company’s latest tender offer was discussed by its board of directors on Thursday at SpaceX’s Starbase hub in Texas. If confirmed, it would make SpaceX once again the world’s most valuable closely held company, vaulting past the previous record of $500 billion that ChatGPT owner OpenAI set in October.

Preliminary scenarios included per-share prices that would have pushed SpaceX’s value to roughly $560 billion or higher, the people said. The details of the deal could change before it closes, a third person said.

A representative for SpaceX didn’t immediately respond to a request for comment. 

The latest figure would be a substantial increase from the $212 a share set in July, when the company raised money and sold shares at a valuation of $400 billion.

The Wall Street Journal and Financial Times, citing unnamed people familiar with the matter, earlier reported that a deal would value SpaceX at $800 billion.

News of SpaceX’s valuation sent shares of EchoStar Corp., a satellite TV and wireless company, up as much as 18%. Last month, EchoStar agreed to sell spectrum licenses to SpaceX for $2.6 billion, adding to an earlier agreement to sell about $17 billion in wireless spectrum to Musk’s company.


The world’s most prolific rocket launcher, SpaceX dominates the space industry with its Falcon 9 rocket that launches satellites and people to orbit.

SpaceX is also the industry leader in providing internet services from low-Earth orbit through Starlink, a system of more than 9,000 satellites that is far ahead of competitors including Amazon.com Inc.’s Amazon Leo.

SpaceX executives have repeatedly floated the idea of spinning off SpaceX’s Starlink business into a separate, publicly traded company, a concept SpaceX President Gwynne Shotwell first suggested in 2020.

However, Musk has publicly cast doubt on the prospect over the years, and Chief Financial Officer Bret Johnsen said in 2024 that a Starlink IPO would more likely take place “in the years to come.”

The Information, citing people familiar with the discussions, separately reported on Friday that SpaceX has told investors and financial institution representatives that it is aiming for an initial public offering for the entire company in the second half of next year.

A so-called tender or secondary offering, through which employees and some early shareholders can sell shares, provides investors in closely held companies such as SpaceX a way to generate liquidity.

SpaceX is working to develop its new Starship vehicle, advertised as the most powerful rocket ever developed, to loft huge numbers of Starlink satellites as well as carry cargo and people to the moon and, eventually, Mars.




U.S. consumers are so strained they put more than $1B on BNPL during Black Friday and Cyber Monday

Financially strained and cautious customers leaned heavily on buy now, pay later (BNPL) services over the holiday weekend.

Cyber Monday alone generated $1.03 billion (a 4.2% increase YoY) in online BNPL sales, with most transactions happening on mobile devices, per Adobe Analytics. Overall, consumers spent $14.25 billion online on Cyber Monday, meaning BNPL accounted for more than 7.2% of total online sales that day.

As for Black Friday, eMarketer reported $747.5 million in online sales using BNPL services, with platforms like PayPal seeing a 23% uptick in BNPL transactions.

Likewise, digital financial services company Zip reported 1.6 million transactions across 280,000 of its locations over the Black Friday and Cyber Monday weekend. Millennials accounted for the largest share of those BNPL purchases (51%), followed by Gen Z, Gen X, and baby boomers, per Zip.

The Adobe data showed that people using BNPL were most likely to spend on categories such as electronics, apparel, toys, and furniture, which is consistent with previous years. This trend also tracks with Zip’s findings that shoppers were primarily investing in tech, electronics, and fashion when using its services.

And while some may be surprised that shoppers are taking on more debt via BNPL (in this economy?!), analysts had already projected a strong shopping weekend. A Deloitte survey forecast that consumers would spend about $650 per shopper over the Black Friday–Cyber Monday stretch, a 15% jump from 2023.

“US retailers leaned heavily on discounts this holiday season to drive online demand,” Vivek Pandya, lead analyst at Adobe Digital Insights, said in a statement. “Competitive and persistent deals throughout Cyber Week pushed consumers to shop earlier, creating an environment where Black Friday now challenges the dominance of Cyber Monday.”

This report was originally published by Retail Brew.


