Surging demand for Nvidia GPUs to run ChatGPT has driven up the price of graphics memory, or DRAM, from companies including Samsung and SK Hynix. The explosion of ChatGPT’s audience, and OpenAI’s growing appetite for Nvidia GPUs, looks set to upend the GPU market.
- Nvidia’s CEO likens ChatGPT to the iPhone moment for AI
- ChatGPT is the main cause of this sudden price increase
Over the past few weeks, Nvidia GPUs have become the most popular choice for powering AI tools like ChatGPT, and the company’s CEO, Jensen Huang, has called the chatbot one of the biggest innovations in the computing landscape. Now, makers of DRAM graphics memory are starting to raise the price of the high-bandwidth memory, or HBM, solutions used to power Nvidia’s AI GPUs.

A report from the South Korean outlet BusinessKorea shows that DRAM makers such as SK Hynix and Samsung have raised the prices of their memory solutions, including HBM. Nvidia is said to have asked SK Hynix to increase its HBM3 production capacity, but other vendors, such as Intel, are also looking to integrate HBM3 into their next-generation products, which means SK Hynix may not be able to keep up with demand. As a result, the price of HBM memory, especially the latest generation, HBM3, has increased by up to five times.
The emergence of ChatGPT, an artificial intelligence (AI) chatbot, gives Korean memory-semiconductor manufacturers an opportunity to create new business. ChatGPT is trained on vast amounts of data using large-scale AI models and answers questions naturally. DRAM data-processing speed is crucial for a better and faster ChatGPT service, and Korean companies produce the high-performance DRAM essential for this task.
Nvidia, the world’s largest graphics-processor company, has asked SK Hynix to supply its latest product, HBM3 chips. Intel, the world’s No. 1 server-CPU maker, is also working hard to sell products equipped with SK Hynix’s HBM3. “The price of HBM3 increased by five times compared to the highest-performance DRAM,” an industry source said.
But it’s not just HBM3 prices that have gone up. Older HBM standards such as HBM2 and HBM2e, found in Nvidia GPU generations like Ampere and Volta that also offer advanced AI capabilities, are likewise in high demand in the AI industry and useful for tools like ChatGPT. SK Hynix alone holds the majority of the HBM market, with a 60–70% share.
Analysts and industry insiders have recently said that ChatGPT’s success could benefit Nvidia enormously. OpenAI, the creator of ChatGPT, currently uses around 25,000 Nvidia GPUs to meet its server needs, and with demand rising and few comparably capable alternatives, Nvidia GPUs, known for offering the best AI performance, are the logical choice. That is why industry analysts predict Nvidia can expect a sharp increase in demand in the coming quarters.