AMD Launches New AI Chips to Take on Leader Nvidia 

Lisa Su, chairwoman and CEO of Advanced Micro Devices (AMD), delivers the opening keynote speech at Computex 2024, Taiwan's premier tech expo, in Taipei on June 3, 2024. (AFP)

Advanced Micro Devices unveiled its latest artificial intelligence processors on Monday and detailed its plan to develop AI chips over the next two years in a bid to challenge industry leader Nvidia.

At the Computex technology trade show in Taipei, AMD CEO Lisa Su introduced the MI325X accelerator, which is set to be made available in the fourth quarter of 2024.

The race to develop generative artificial intelligence programs has led to towering demand for the advanced chips used in AI data centers able to support these complex applications.

Santa Clara, California-based AMD has been vying to compete against Nvidia, which currently dominates the lucrative market for AI semiconductors with roughly 80% market share.

Since last year, Nvidia has made it clear to investors that it plans to shorten its release cycle to annually, and now AMD has followed suit.

"AI is clearly our number one priority as a company and we have really harnessed all of the development capability within the company to do that," Su told reporters.

"This annual cadence is something that is there because the market requires newer products and newer capabilities... Every year we have the next big thing such that we always have the most competitive portfolio."

AMD also introduced an upcoming series of chips titled MI350, which is expected to be available in 2025 and will be based on new chip architecture.

Compared to the currently available MI300 series of AI chips, AMD said it expects the MI350 to perform 35 times better in inference - the process of computing generative AI responses. Additionally, AMD revealed the MI400 series, which will arrive in 2026 and will be based on an architecture called "Next".

Investors who have poured billions of dollars into Wall Street's picks-and-shovels trade have been seeking longer-term updates from chip firms, as they evaluate the longevity of the booming genAI rally, which so far has shown no signs of slowing down.

AMD's shares have more than doubled since the start of 2023. This surge still pales in comparison to the more than seven-fold rise in Nvidia's shares in the same time period.

AMD is aiming at an AI chip product cycle of one year. Similarly, Nvidia said it plans to release a new family of AI chips every year.

AMD's Su said in April that the company expects AI chip sales of roughly $4 billion for 2024, an increase of $500 million from its prior estimate.

At the Computex event, AMD also said its latest generation of central processing units (CPUs) will likely be available in the second half of 2024.

While businesses generally prioritize spending on AI chips in data centers, some of AMD's CPUs are used in conjunction with graphics processing units (GPUs), though the ratio is skewed in favor of GPUs.

AMD detailed architecture for its new neural processing units (NPUs), which are dedicated to handling on-device AI tasks in AI PCs.

Chipmakers have been banking on added AI capabilities to drive growth in the PC market as it emerges from a years-long slump.

PC providers such as HP and Lenovo will release devices that include AMD's AI PC chips. AMD said its processors exceed Microsoft's Copilot+ PC requirements.



AI Chatbots Must Learn to Say 'Help!' Says Microsoft Exec

A Microsoft logo is seen in Los Angeles, California US November 7, 2017. (Reuters)

Generative AI tools will save companies lots of time and money, promises Vik Singh, a Microsoft vice president, even if the models must learn to admit when they just don't know what to do.
"Just to be really frank, the thing that's really missing today is that a model doesn't raise its hand and say 'Hey, I'm not sure, I need help,'" Singh told AFP in an interview.
Since last year, Microsoft, Google and their competitors have been rapidly deploying generative AI applications like ChatGPT, which produce all kinds of content on demand and give users the illusion of omniscience.
But despite progress, they still "hallucinate," or invent answers.
This is an important problem for the Copilot executive to solve: Singh's corporate customers can't afford for their AI systems to go off the rails, even occasionally.
Marc Benioff, CEO of Salesforce, this week said he saw many of his customers increasingly frustrated with the meanderings of Microsoft's Copilot.
Singh insisted that "really smart people" were trying to find ways for a chatbot to admit "when it doesn't know the right answer and to ask for help."
'Real savings'
A more humble model would be no less useful, in Singh's opinion. Even if the model has to turn to a human in 50 percent of cases, that still saves "tons of money."
At one Microsoft client, "every time a new request comes in, they spend $8 to have a customer service rep answer it, so there are real savings to be had, and it's also a better experience for the customer because they get a faster response."
Singh arrived at Microsoft in January and this summer took over as head of the teams developing "Copilot," Microsoft's AI assistant that specializes in sales, accounting and online services.
These applications have the gargantuan task of bringing in revenue and justifying the massive investments in generative AI.
At the height of the AI frenzy, start-ups driving the technology were promising systems so advanced that they would "uplift humanity," in the words of Sam Altman, head of OpenAI, which is mainly funded by Microsoft.
But for the time being, the new technology is mainly used to boost productivity, and hopefully profits.
According to Microsoft, Copilot can do research for salespeople, freeing up time to call customers. Lumen, a telecom company, "saves around $50 million a year" doing this, said Singh.
Singh's teams are working on integrating Copilot directly into the tech giant's software and making it more autonomous.
"Let's say I'm a sales rep and I have a customer call," suggested the executive. Two weeks later, the model can "nudge the rep to go follow up, or better, just go and automatically send the email on the rep's behalf because it's been approved to do so."
'First inning'
In other words, before finding a solution to global warming, AI is expected to rid humanity of boring, repetitive chores.
"We're in the first inning," Singh said. "A lot of these things are productivity based, but they obviously have huge benefits."
Will all these productivity gains translate into job losses?
Leaders of large firms, such as K Krithivasan, boss of Indian IT giant TCS, have declared that generative AI will all but wipe out call centers.
But Singh, like many Silicon Valley executives, is counting on technology to make humans more creative and even create new jobs.
He pointed to his experience at Yahoo in 2008, when a dozen editors chose the articles for the home page.
"We came up with the idea of using AI to optimize this process, and some people asked 'Oh my God, what's going to happen to the employees?'" said Singh.
The automated system made it possible to renew content more quickly, thereby increasing the number of clicks on links but also the need for new articles.
"In the end," said the executive, "we had to recruit more editors."