Nvidia Supplier SK Hynix Says HBM Chips Almost Sold Out for 2025 

Employees walk past identification systems bearing the logos of SK Hynix at its headquarters in Seongnam, South Korea, April 25, 2016. (Reuters) 

South Korea's SK Hynix said on Thursday that its high-bandwidth memory (HBM) chips used in AI chipsets were sold out for this year and almost sold out for 2025 as businesses aggressively expand artificial intelligence services.

The Nvidia supplier and the world's second-largest memory chipmaker will begin sending samples of its latest HBM chip, the 12-layer HBM3E, in May and will begin mass producing it in the third quarter.

"The HBM market is expected to continue to grow as data and (AI) model sizes increase," Chief Executive Officer Kwak Noh-Jung told a news conference. "Annual demand growth is expected to be about 60% in the mid-to long-term."

SK Hynix, which competes in HBM with US rival Micron and domestic behemoth Samsung Electronics, was until March the sole supplier of HBM chips to Nvidia, according to analysts, who add that major AI chip purchasers are keen to diversify their suppliers to better maintain operating margins. Nvidia commands some 80% of the AI chip market.

Micron has also said its HBM chips were sold out for 2024 and that the majority of its 2025 supply was already allocated. It plans to provide samples of its 12-layer HBM3E chips to customers in March.

"As AI functions and performance are being upgraded faster than expected, customer demand for ultra-high-performance chips such as the 12-layer chips appear to be increasing faster than for 8-layer HBM3Es," said Jeff Kim, head of research at KB Securities.

Samsung Electronics, which plans to produce its HBM3E 12-layer chips in the second quarter, said this week that this year's shipments of HBM chips are expected to increase more than threefold and that it has completed supply discussions with customers. It did not elaborate further.

Last month, SK Hynix announced a $3.87 billion plan to build an advanced chip packaging plant with an HBM chip line in the US state of Indiana, as well as a 5.3 trillion won ($3.9 billion) investment in a new DRAM chip factory at home focused on HBMs.

Kwak said investment in HBM differs from past patterns in the memory chip industry in that capacity is being expanded only after demand has been confirmed.

By 2028, chips made for AI, such as HBM and high-capacity DRAM modules, are expected to account for 61% of all memory volume by value, up from about 5% in 2023, SK Hynix's head of AI infrastructure Justin Kim said.

Last week, SK Hynix said in a post-earnings conference call that there may be a shortage of regular memory chips for smartphones, personal computers and network servers by the year's end if demand for tech devices exceeds expectations.



Nvidia CEO Says Global Cooperation in Tech will Continue under Trump Administration

Nvidia CEO Jensen Huang poses for a photo after receiving an honorary doctorate in engineering from the Hong Kong University of Science and Technology, in Hong Kong on November 23, 2024. (Photo by Holmes CHAN / AFP)

Nvidia CEO Jensen Huang said on Saturday that global cooperation in technology will continue even if the incoming US administration imposes stricter export controls on advanced computing products.
US President-elect Donald Trump, in his first term in office, imposed restrictions on the sale of US technology to China, citing national security, a policy continued under President Joe Biden. The curbs forced Nvidia, the world's leading maker of chips used for artificial intelligence applications, to change its product lineup in China.
"Open science in global collaboration, cooperation across math and science has been around for a very long time. It is the foundation of social advancement and scientific advancement," Huang told media during a visit to Hong Kong.
Cooperation is "going to continue. I don't know what's going to happen in the new administration, but whatever happens, we'll balance simultaneously compliance with laws and policies, continue to advance our technology and support and serve customers all over the world."
The head of the world's most valuable company was speaking in the financial hub after receiving an honorary doctorate in engineering from the Hong Kong University of Science and Technology, Reuters reported.
During the visit, Huang participated in a fireside chat with the university's Council Chairman Harry Shum in front of an audience of students and academics.
Asked about the huge energy requirements of graphics processing units, the chips behind artificial intelligence, Huang said, "If the world uses more energy to power the AI factories of the world, we are a better world when that happens".
Huang said "the goal of AI is not for training, the goal of AI is for inference". He said AI can discover, for instance, new ways to store carbon dioxide in reservoirs, new wind turbine designs and new materials for storing electricity.
He said people should start thinking about placing AI supercomputers slightly off the power grid, in locations away from population centers, and letting them run on sustainable energy.
"My hopes and dreams is that in the end, what we all see is that using energy for intelligence is the best use of energy we can imagine," Huang said.
Earlier on Saturday, in a speech after receiving the honorary degree, Huang told graduates that "the age of AI has started", calling it "a new computing era that will impact every industry and every field of science."