Asia Tech Wire (June 3) -- Advanced Micro Devices Inc. (AMD) is accelerating the rollout of new artificial intelligence processors in a bid to undercut Nvidia Corp.'s dominance in the lucrative market.
The MI325X will be available in the fourth quarter, AMD Chief Executive Officer Lisa Su said June 3 during the Computex 2024 opening keynote.
The chip is the successor to the MI300 and will feature more memory and faster data throughput.
The MI350 will follow in 2025, with the MI400 arriving a year later.
AMD is shifting to a roughly annual release cadence for its new AI chips, in sync with Nvidia's schedule.
Nvidia CEO Jensen Huang had laid out his company's product plans the night before in his own keynote.
AMD said the MI350 is expected to deliver 35 times the inference performance (the process of generating responses from AI models) of the MI300 series chips currently on the market.
The MI400 series will be based on an architecture called Next.
AMD is one of the few significant makers of AI chips besides Nvidia. Companies across the industry are pouring money into new AI training systems, and most of that spending currently goes to Nvidia's chips.
Su said demand for AMD's existing MI300 chips remains strong, and that the new products will be competitive with Nvidia's offerings.
Industry observers see this roadmap as potentially AMD's biggest step yet in its effort to catch up with Nvidia.
In April, Su said the company expects AI chip sales of about $4 billion in 2024, $500 million more than its earlier forecast but still dwarfed by Nvidia's.
Nvidia's data center division alone is expected to generate more than $100 billion in annual sales, more than the combined annual revenue of AMD and Intel, according to analyst estimates.
AMD's stock price has more than doubled since the beginning of 2023, but that gain pales next to the more than sevenfold rise in Nvidia's shares over the same period.
During the keynote, AMD also discussed its third-generation Ryzen AI processor for the consumer market, code-named Strix Point, which will go on sale in July.
Tailored for laptops, the processor combines RDNA 3.5 mobile graphics, an XDNA 2 neural processing unit for accelerating AI tasks and the latest Zen 5 CPU cores.
During the presentation, Su invited several AMD partners on stage, including Enrique Lores, CEO of HP Inc.; Luca Rossi, executive vice president of Lenovo; and Jonney Shih, chairman of ASUSTeK Computer, to discuss upcoming laptops powered by AMD's new Ryzen processors.
AMD said the latest Ryzen processors outperform Qualcomm's Snapdragon X Elite in AI tasks; the Qualcomm chip has drawn attention as the centerpiece of Microsoft's new Copilot+ PCs.
Pavan Davuluri, Microsoft's head of Windows, also joined Su on stage, saying his team has been working with AMD "since day one" on the Copilot+ PC project.
"On-device AI really for us means faster response times, better privacy and cost," Davuluri said, "But that means running models that have billions of parameters in them on PC hardware. Compared to traditional PCs even from just a few years ago, we're talking 20 times the performance and up to 100 times the efficiency for AI workloads."
Additionally, AMD showed off new gaming processors for laptops and desktops. "This is the fastest consumer CPU in the world," Su said, holding up AMD's Ryzen 9 9950X chip. The 16-core processor runs at up to 5.7GHz in boost mode.
Nvidia's next-generation AI chip platform, called Rubin, will launch in 2026, Jensen Huang said Sunday in his keynote address on the eve of Computex 2024.
The Rubin family of chips will include new graphics processing units (GPUs) and central processing units (CPUs), as well as networking chips, Huang said.
Nvidia's new CPU is reported to be named Vera, and the new graphics chips that drive AI applications will use next-generation high-bandwidth memory produced by SK Hynix, Micron, Samsung and others.
Huang said Nvidia plans to release a new AI chip every year, a step up from its previous release schedule of roughly every two years.
Nvidia currently holds about 80% of the global AI chip market and is both the biggest driver and the biggest beneficiary of the AI boom.
Notably, Huang and Su are also known to be distant relatives.
According to a family tree circulating online, Huang is Su's first cousin once removed: Huang's grandfather, who was also Su's great-grandfather, had at least 12 children.
In fact, both Huang and Su were born in Tainan, Taiwan. Huang immigrated with his family to Oregon when he was four years old; Su's family immigrated to the U.S. when she was three.
In the end, the two Taiwan-born executives went on to build legendary careers in the American tech industry.