Nvidia CEO Says Power-Saving Optical Chip Tech Will Need to Wait for Wider Use 

The stage is seen after a keynote session at the SAP Center in San Jose, California, on March 18, 2025. (AFP)

A promising new chip technology that aims to cut energy usage is not yet reliable enough for use in Nvidia's flagship graphics processing units (GPUs), Nvidia's CEO Jensen Huang said Tuesday.

Co-packaged optics, as the emerging technology is called, uses beams of laser light to send information over fiber optic cables between chips, making connections faster and more energy-efficient than those made through traditional copper cables.

During a keynote address at Nvidia's annual developer conference, held in a packed hockey stadium in San Jose, California, on Tuesday, Huang said his company would use co-packaged optics in two new networking chips that sit in switches atop its servers. He said the technology would make the chips three and a half times more energy efficient than their predecessors.

The switch chips will come out later this year and into 2026 in a small but significant step toward advancing the technology.

But Huang told a group of journalists after his speech that while Nvidia had examined using the technology more widely in its flagship GPU chips, it had no current plans to do so, because traditional copper connections were "orders of magnitude" more reliable than today's co-packaged optical connections.

"That's not worth it," Huang said of using optical connections directly between GPUs. "We keep playing with that equation. Copper is far better."

Huang said that he was focused on providing a reliable product roadmap that Nvidia's customers, such as OpenAI and Oracle, could prepare for.

"In a couple years, several hundred billion dollars of AI infrastructure is going to get laid down, and so you've got the budget approved. You got the power approved. You got the land built," Huang said. "What are you willing to scale up to several hundred billion dollars right now?"

Silicon Valley entrepreneurs and investors have pinned their hopes on the optics technology, which they believe will be central to building ever-larger computers for AI systems. Huang said on Tuesday that such systems would still be necessary even after advances by companies like DeepSeek, because AI systems will need more computing power to think through their answers.

Startups such as Ayar Labs, Lightmatter and Celestial AI have raised hundreds of millions of dollars in venture capital - some of it from Nvidia itself - to try to put co-packaged optical connections directly onto AI chips. Lightmatter and Celestial AI are both targeting public offerings.

Copper connections are cheap and fast, but can only carry data a few meters at most. While that might seem trivial, it has had a huge impact on Nvidia's product lineup over the past half decade.

Nvidia's current flagship product contains 72 of its chips in a single server, consuming 120 kilowatts of electricity and generating so much heat that it requires a liquid cooling system similar to that of a car engine. The flagship server unveiled on Tuesday for release in 2027 will pack hundreds of its Vera Rubin Ultra chips into a single rack and will consume 600 kilowatts of power.

Cramming more than double the number of chips into the same space over two years will require massive feats of engineering from Nvidia and its partners. Those feats are driven by the fact that AI computing work requires moving a lot of data back and forth between chips, and Nvidia is trying to keep as many chips as it can within the relatively short reach of copper connections.

Mark Wade, the CEO of Ayar Labs, which has received venture backing from Nvidia, said the chip industry was still navigating how to manufacture co-packaged optics at lower costs and with higher reliability. While the transition may not come until 2028 or beyond, Wade said, the chip industry will have little choice but to ditch copper if it wants to keep building bigger and bigger servers.

"Just look at the power consumption going up and up on racks with electrical connections," Wade told Reuters in an interview on the sidelines of Nvidia's conference. "Optics is the only technology that gets you off of that train."



Reddit Sues AI Giant Anthropic Over Content Use

Dario Amodei, co-founder and CEO of Anthropic. JULIEN DE ROSA / AFP

Social media outlet Reddit filed a lawsuit Wednesday against artificial intelligence company Anthropic, accusing the startup of illegally scraping millions of user comments to train its Claude chatbot without permission or compensation.

The lawsuit in a California state court represents the latest front in the growing battle between content providers and AI companies over the use of data to train increasingly sophisticated language models that power the generative AI revolution.

Anthropic, valued at $61.5 billion and heavily backed by Amazon, was founded in 2021 by former executives from OpenAI, the creator of ChatGPT.

The company, known for its Claude chatbot and AI models, positions itself as focused on AI safety and responsible development.

"This case is about the two faces of Anthropic: the public face that attempts to ingratiate itself into the consumer's consciousness with claims of righteousness and respect for boundaries and the law, and the private face that ignores any rules that interfere with its attempts to further line its pockets," the suit said.

According to the complaint, Anthropic has been training its models on Reddit content since at least December 2021, with CEO Dario Amodei co-authoring research papers that specifically identified high-quality content for data training.

The lawsuit alleges that despite Anthropic's public claims that it had blocked its bots from accessing Reddit, the company's automated systems continued to hit Reddit's servers more than 100,000 times in subsequent months.

Reddit is seeking monetary damages and a court injunction to force Anthropic to comply with its user agreement terms. The company has requested a jury trial.

In an email to AFP, Anthropic said, "We disagree with Reddit's claims and will defend ourselves vigorously."

Reddit has entered into licensing agreements with other AI giants including Google and OpenAI, which allow those companies to use Reddit content under terms that protect user privacy and provide compensation to the platform.

Those deals have helped lift Reddit's share price since it went public in 2024.

Reddit shares closed up more than six percent on Wednesday following news of the lawsuit.

Musicians, book authors, visual artists and news publications have sued various AI companies that used their data without permission or payment.

AI companies generally defend their practices by claiming fair use, arguing that training AI on large datasets fundamentally changes the original content and is necessary for innovation.

Though most of these lawsuits are still in early stages, their outcomes could have a profound effect on the shape of the AI industry.