
AMD reveals new AI chip challenging Nvidia’s dominance


With its CDNA architecture and 192 GB of memory, AMD's MI300X can accommodate larger AI models.

On June 13, Advanced Micro Devices (AMD) revealed new details about an artificial intelligence (AI) chip that could challenge market leader Nvidia.

The California-based AMD said its most advanced graphics processing unit (GPU) for AI, the MI300X, will begin rolling out to customers in the third quarter of 2023, with mass production ramping up in the fourth quarter.

AMD’s announcement represents the most significant challenge yet to Nvidia, which currently dominates the AI chip market with over 80% market share. GPUs are the chips firms such as OpenAI use to build cutting-edge AI programs like ChatGPT. Their parallel processing capabilities allow them to handle large amounts of data simultaneously, making them well suited to the high-speed, data-intensive computation that AI workloads demand.

AMD said its latest MI300X chip and the CDNA architecture behind it were developed specifically for large language models and other advanced AI workloads. With a maximum memory capacity of 192 gigabytes, the MI300X can accommodate even larger AI models than rival chips such as Nvidia’s H100, which supports a maximum of 120 GB of memory.

AMD CEO Lisa Su presenting the company’s AI accelerator chip at an event in San Francisco. Source: YouTube

AMD’s Infinity Architecture combines eight MI300X accelerators into a single system, mirroring similar systems from Nvidia and Google that integrate eight or more GPUs for AI applications.

During a presentation to investors and analysts in San Francisco, AMD CEO Lisa Su highlighted that AI represents the company’s “most significant and strategically important long-term growth opportunity.”

“We think about the data center AI accelerator [market] growing from something like $30 billion this year, at over 50% compound annual growth rate, to over $150 billion in 2027.”

If developers and server manufacturers adopt AMD’s “accelerator” AI chips as alternatives to Nvidia’s products, it could open up a significant untapped market for the chipmaker. AMD, best known for its conventional computer processors, stands to benefit from such a shift in demand.

Related: AI startup by ex-Meta and Google researchers raises $113M in seed funding

Although AMD did not reveal specific pricing details, the move could put downward pressure on prices for Nvidia’s GPUs, such as the H100, which can carry price tags of $30,000 or more. Lower GPU prices could, in turn, help reduce the overall cost of running resource-intensive generative AI applications.

Magazine: Is AI a nuke-level threat? Why AI fields all advance at once, dumb pic puns
