Nvidia CEO Says Power-Saving Optical Chip Tech Will Need to Wait for Wider Use 

The stage is seen after a keynote session at the SAP Center in San Jose, California, on March 18, 2025. (AFP)


A promising new chip technology that aims to cut energy usage is not yet reliable enough for use in Nvidia's flagship graphics processing units (GPUs), Nvidia's CEO Jensen Huang said Tuesday.

Co-packaged optics, as the emerging technology is called, uses beams of laser light to send information between chips over fiber optic cables, making connections faster and more energy-efficient than those made with traditional copper cables.

During a keynote address at Nvidia's annual developer conference, held Tuesday at a packed hockey stadium in San Jose, California, Huang said his company would use co-packaged optics in two new networking chips that sit in switches atop its servers. The technology, he said, would make those chips three and a half times more energy efficient than their predecessors.

The switch chips will come out later this year and into 2026 in a small but significant step toward advancing the technology.

But Huang told a group of journalists after his speech that while Nvidia had examined using the technology more widely in its flagship GPU chips, it had no current plans to do so, because traditional copper connections were "orders of magnitude" more reliable than today's co-packaged optical connections.

"That's not worth it," Huang said of using optical connections directly between GPUs. "We keep playing with that equation. Copper is far better."

Huang said that he was focused on providing a reliable product roadmap that Nvidia's customers, such as OpenAI and Oracle, could prepare for.

"In a couple years, several hundred billion dollars of AI infrastructure is going to get laid down, and so you've got the budget approved. You got the power approved. You got the land built," Huang said. "What are you willing to scale up to several hundred billion dollars right now?"

Silicon Valley entrepreneurs and investors have pinned their hopes on the optics technology, which they believe will be central to building ever-larger computers for AI systems. Huang said on Tuesday that such machines would still be necessary even after advances by companies like DeepSeek, because AI systems will need more computing power to think through their answers.

Startups such as Ayar Labs, Lightmatter and Celestial AI have raised hundreds of millions of dollars in venture capital - some of it from Nvidia itself - to try to put co-packaged optical connections directly onto AI chips. Lightmatter and Celestial AI are both targeting public offerings.

Copper connections are cheap and fast, but can only carry data a few meters at most. While that might seem trivial, it has had a huge impact on Nvidia's product lineup over the past half decade.

Nvidia's current flagship product contains 72 of its chips in a single server, consuming 120 kilowatts of electricity and generating so much heat that it requires a liquid cooling system similar to that of a car engine. The flagship server unveiled on Tuesday for release in 2027 will pack hundreds of its Vera Rubin Ultra chips into a single rack and will consume 600 kilowatts of power.

Cramming more than double the number of chips into the same space over two years will require massive feats of engineering from Nvidia and its partners. Those feats are driven by the fact that AI computing work requires moving a lot of data back and forth between chips, and Nvidia is trying to keep as many chips as it can within the relatively short reach of copper connections.

Mark Wade, the CEO of Ayar Labs, which has received venture backing from Nvidia, said the chip industry was still navigating how to manufacture co-packaged optics at lower costs and with higher reliability. While the transition may not come until 2028 or beyond, Wade said, the chip industry will have little choice but to ditch copper if it wants to keep building bigger and bigger servers.

"Just look at the power consumption going up and up on racks with electrical connections," Wade told Reuters in an interview on the sidelines of Nvidia's conference. "Optics is the only technology that gets you off of that train."



Nvidia Boss Insists 'Huge' Investment in OpenAI on Track

Nvidia CEO Jensen Huang insists the US tech giant is going to make 'a huge investment in OpenAI'. Patrick T. Fallon / AFP/File


Nvidia chief executive Jensen Huang has insisted the US tech giant will make a "huge" investment in OpenAI and dismissed as "nonsense" reports that he is unhappy with the generative AI star.

Huang made the remarks late Saturday in Taipei after the Wall Street Journal reported that Nvidia's plan to invest up to $100 billion in OpenAI had been put on ice, said AFP.

Nvidia announced the plan in September to invest up to $100 billion in OpenAI, building infrastructure for next-generation artificial intelligence.

The Wall Street Journal, citing unnamed sources, said some people inside Nvidia had expressed doubts about the deal and that the two sides were rethinking the partnership.

"That's complete nonsense. We are going to make a huge investment in OpenAI," Huang told journalists, when asked about reports that he was unhappy with OpenAI.

Huang insisted that Nvidia was going ahead with its investment in OpenAI, describing it as "one of the most consequential companies of our time".

"Sam is closing the round, and we will absolutely be involved in the round," Huang said, referring to OpenAI chief executive Sam Altman.

"We will invest a great deal of money, probably the largest investment we've ever made."

Nvidia has come to dominate spending on the processors needed for training and operating the large language models (LLMs) behind chatbots like OpenAI's ChatGPT or Google Gemini.

Sales of its graphics processing units (GPUs) -- originally developed for 3D gaming -- powered the company's market cap to over $5 trillion in October, although the figure has since fallen back by more than $600 billion.

LLM developers like OpenAI are directing much of the mammoth investment they have received into Nvidia's products, rushing to build GPU-stuffed data centers to serve an anticipated flood of demand for AI services.


Meta Shares Skyrocket, Microsoft Slides on Wall Street after Earnings

A Microsoft logo is seen a day after Microsoft Corp's $26.2 billion purchase of LinkedIn Corp, in Los Angeles, California, US, June 14, 2016. REUTERS/Lucy Nicholson


Shares in Meta jumped 10 percent at the opening on Wall Street on Thursday, a day after the social media giant posted better-than-expected earnings as the company invests heavily in artificial intelligence.

Microsoft, whose earnings disappointed analysts, saw its share price tumble by 10 percent, with investors showing concern over the return on the software giant's AI spending.


Samsung Logs Best-ever Profit on AI Chip Demand

South Korean tech giant Samsung Electronics posted record quarterly profits on Thursday, riding strong market demand for its artificial intelligence chips. Jung Yeon-je / AFP/File


South Korean tech giant Samsung Electronics posted record quarterly profits Thursday, riding massive market demand for the memory chips that power artificial intelligence.

A global frenzy to build AI data centers and develop the fast-evolving technology has sent orders for advanced high-bandwidth memory chips soaring.

That is also pushing up prices for less flashy chips used in consumer electronics, threatening higher prices for phones, laptops and other devices worldwide.

In the quarter to December 2025, Samsung said it saw "its highest-ever quarterly consolidated revenue at KRW 93.8 trillion (US$65.5 billion)", a quarter-on-quarter increase of nine percent.

"Operating profit was also an all-time high, at KRW 20.1 trillion," the company said.

The dazzling earnings came a day after a key competitor, South Korean chip giant SK hynix, said operating profit had doubled last year to a record high, also buoyed by the AI boom.

The South Korean government has pledged to become one of the top three AI powers, behind the United States and China, with Samsung and SK hynix among the leading producers of high-performance memory.

Samsung said Thursday it expects "AI and server demand to continue increasing, leading to more opportunities for structural growth".

Annual revenue stood at 333.6 trillion won, while operating profit came in at 43.6 trillion won. Sales for the division that oversees its semiconductor business rose 33 percent quarter-on-quarter.

The company pointed to a $33.2 billion investment in chip production facilities -- pledging to continue spending in "transitioning to advanced manufacturing processes and upgrading existing production lines to meet rising demand".

- 'Clearly back' -

Major electronics manufacturers and industry analysts have warned that chipmakers focusing on AI sales will cause higher retail prices for consumer products across the board.

This week US chip firm Micron said it was building a $24 billion plant in Singapore in response to AI-driven demand that has caused a global shortage of memory components.

SK hynix announced Wednesday that its operating profit had doubled last year to a record 47.2 trillion won.

The company's shares have surged some 220 percent over the past six months, while Samsung Electronics has risen about 130 percent, part of a huge global tech rally fueled by optimism over AI.

Both companies are on the cusp of producing next-generation high-bandwidth "HBM4" chips for AI data centers, with Samsung reportedly due to start making them in February.

American chip giant Nvidia -- now the world's most valuable company -- is expected to be one of Samsung's customers for HBM4 chips.

But Nvidia has reportedly allocated around 70 percent of its HBM4 demand to SK hynix for 2026, up from the market's previous estimate of 50 percent.

"Samsung is clearly back and we are expecting them to show a significant turnaround with HBM4 for Nvidia's new products -- helping them move past last year's quality issues," Hwang Min-seong, research director at market analysis firm Counterpoint, told AFP.

But SK hynix still "maintains a market lead in both quality and supply" of a number of key components, including the dynamic random access memory (DRAM) chips used in AI servers, he said.

SK hynix also said this week that it will set up an "AI solutions firm" in the United States, committing $10 billion and weighing investments in US companies.