Taking on Nvidia! AMD CEO Lisa Su's latest remarks: We are now at the beginning of a major AI cycle

AMD will launch the MI325X this year, the MI350 series in 2025, and the MI400 series in 2026

After Nvidia CEO Jensen Huang announced the iteration schedule for his company's latest AI chips on the evening of June 2, AMD was not to be outdone.
On the morning of June 3, in a keynote ahead of the opening of COMPUTEX, AMD CEO Lisa Su laid out an iteration roadmap of her own.
AMD is the world's second-largest GPU maker and one of the major CPU manufacturers, trailing only Nvidia in GPUs and competing with Intel in CPUs.
Building on this position, AMD is also developing combined CPU+GPU products.
Su said AMD will launch the MI325X in the fourth quarter of this year, equipped with HBM3E (high-bandwidth memory) for larger memory capacity and higher compute performance.
The MI350 series and MI400 series will be launched in the next two years.
Among them, the MI300X and MI325X use the CDNA3 architecture, the MI350 series will adopt CDNA4, and the MI400 series will move to the next-generation CDNA architecture.
By industry standards, this cadence matches the roadmap Nvidia has announced.
"Demand for artificial intelligence is accelerating. We are at the beginning of a decade-long AI cycle," Su said, noting that AMD launched the MI300X accelerator last year and will introduce a new product series every year.
Specifically, the MI325X arriving this year carries 288GB of high-speed HBM3E memory, with memory bandwidth reaching 6TB per second.
Su said a single server equipped with eight MI325X accelerators can run large models with up to 1 trillion parameters, twice the model size a server built on Nvidia's H200 can support.
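As a rough sanity check on that figure, the back-of-the-envelope sketch below compares the server's pooled HBM3E capacity with the weight-only footprint of a 1-trillion-parameter model; the 2-bytes-per-parameter (FP16) assumption and the omission of activations, KV cache, and runtime overhead are ours, not AMD's.

```python
# Back-of-the-envelope check (assumptions ours, not AMD's):
# weights stored as FP16 (2 bytes/parameter); activations, KV cache,
# and framework overhead are ignored.

HBM_PER_MI325X_GB = 288            # HBM3E per accelerator, per AMD
ACCELERATORS_PER_SERVER = 8

pooled_hbm_gb = HBM_PER_MI325X_GB * ACCELERATORS_PER_SERVER   # 2304 GB

params = 1_000_000_000_000         # 1 trillion parameters
bytes_per_param = 2                # FP16 assumption
weights_gb = params * bytes_per_param / 1e9                   # ~2000 GB

print(f"Pooled HBM3E per server: {pooled_hbm_gb} GB")
print(f"FP16 weights for a 1T-parameter model: {weights_gb:.0f} GB")
print(f"Weights fit in pooled memory: {weights_gb <= pooled_hbm_gb}")
```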
Su said the CDNA4 architecture AMD will introduce in 2025 will bring the largest generational leap in AI performance in the company's history.
The MI350 series will use an advanced 3nm process and support the FP4 (4-bit floating-point) and FP6 (6-bit floating-point) data types.
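To illustrate why these lower-precision data types matter, the simplified sketch below compares the weight-only memory footprint of a hypothetical 70-billion-parameter model at different precisions; it ignores quantization metadata and mixed-precision layers, so the numbers are illustrative only.

```python
# Weight-only footprint of a hypothetical 70B-parameter model at
# different precisions. Simplified: quantization metadata (scales,
# zero-points) and mixed-precision layers are ignored.

BITS_PER_PARAM = {"FP16": 16, "FP8": 8, "FP6": 6, "FP4": 4}
params = 70_000_000_000  # hypothetical 70B-parameter model

for fmt, bits in BITS_PER_PARAM.items():
    footprint_gb = params * bits / 8 / 1e9
    print(f"{fmt:>4}: {footprint_gb:5.1f} GB")
```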
Looking back, when AMD launched CDNA3 its AI performance was 8 times that of the previous generation; CDNA4 is expected to deliver 35 times the AI performance of CDNA3.
Su said the MI350 will offer 1.5 times the memory of the B200 and 1.2 times its performance.
The H200 and B200 are both Nvidia AI chips, announced in 2023 and 2024 respectively.
The B200 is built on the Blackwell architecture; Nvidia pairs two B200 GPUs with one Grace CPU on a GB200 board, using interconnect technology to boost performance.
According to Jensen Huang's remarks on the evening of June 2, Nvidia will also move to annual updates: Blackwell Ultra will launch in 2025, the new Rubin architecture in 2026, and Rubin Ultra in 2027.
AMD, which also updates every year, will compete directly with Nvidia.
The processors in most data centers are now more than five years old, and many enterprises want to refresh their data center computing infrastructure and add AI capabilities.
Many enterprise customers also want to handle both general-purpose computing and AI computing without adding a GPU.
"AMD is the only company that can provide a complete set of CPU, GPU, and networking solutions for data centers," Su said.
Su said AMD will launch its fifth-generation EPYC data center CPU, codenamed Turin.
The processor is based on the Zen 5 architecture and will launch in the second half of this year; the flagship part has 192 Zen 5 cores and 384 threads.
Su noted that Turin's performance advantage is particularly pronounced when running smaller large language models.
For desktops, Su also unveiled the AMD Ryzen 9000 series processors, built on the Zen 5 architecture. The first batch of products, the Ryzen 9 9950X, Ryzen 9 9900X, Ryzen 7 9700X, and Ryzen 5 9600X, will go on sale in July.
For notebooks, AMD launched the Ryzen AI 300 series, codenamed Strix Point.
According to Su, the Ryzen AI 300 series uses the Zen 5 architecture and can run AI workloads locally.
The Ryzen AI 300 series is equipped with an XDNA AI NPU (neural processing unit) delivering up to 50 TOPS of compute.
Compared with other new x86 and ARM CPUs in the industry, the series offers higher performance in single-threaded responsiveness, content creation, and multitasking.
Earlier, after arriving in Taipei, Jensen Huang invited the heads of supply chain partners including Hon Hai Group, Quanta Computer, Asus, and Wistron to dinner.
AMD likewise showed off its circle of partners at its launch event.
Su invited executives from Microsoft, HP, Asus, and Lenovo on stage to discuss their companies' collaboration with AMD and their AI applications.
A Microsoft representative said on stage that Copilot+ PCs (Copilot is Microsoft's AI assistant) can deliver AI services both locally on the PC and in the cloud, with faster response times and lower costs, but each Copilot+ PC device must support at least 40 TOPS of compute.
Notably, in the face of Nvidia's strength in AI, AMD and other technology manufacturers have recently been working to strengthen their say in the industry.
Eight technology giants, including Google, Meta, AMD, Intel, Broadcom, Cisco, and HP, recently announced the formation of a new industry organization, the UALink Promoter Group, to develop industry standards guiding the interconnection of AI accelerator chips in data centers.
"In a very short time, the technology industry has embraced the challenges revealed by AI and HPC. In the pursuit of efficiency and performance gains, accelerators, and GPU interconnects in particular, require a holistic perspective," said J Metz, chair of the Ultra Ethernet Consortium.
