HBM demand soars, and DRAM prices rebound 8%

The DRAM market is heating up again.

The surge in demand for high-bandwidth memory (HBM) for generative AI, coupled with a supply shortage, has pushed DRAM prices up about 8%, their first increase in three months.

Japanese media reported that the sharp increase in HBM demand has driven up the price of DRAM, which is used for temporary data storage in smartphones, PCs, and data center servers. In May 2024, the wholesale (bulk transaction) price of the benchmark DDR4 8Gb product was around $2.10 per chip, and the smaller-capacity 4Gb product was around $1.62 per chip, both up about 8% from the previous month and marking the first increase in three months. Through April 2024, DRAM prices had been flat for two consecutive months. DRAM wholesale prices are negotiated between memory manufacturers and their customers on a monthly or quarterly basis.

The report noted that HBM stacks multiple DRAM chips to process large amounts of data at high speed, and that most of it is supplied to Nvidia, which makes the GPUs used for generative AI. Strong HBM demand is the main reason DRAM wholesale prices rebounded in May. One electronics trading company pointed out that HBM supply cannot keep up with demand and volumes are in shortage.

The head of another electronics trading company said, "Increasing HBM supply means cutting conventional DRAM production, which in turn pushes up the price of conventional DRAM." Producing HBM requires roughly three times the fab capacity of conventional DRAM, so as HBM output rises, output of other DRAM falls.


On June 4, World Semiconductor Trade Statistics (WSTS) released a forecast report stating that continued strong global investment in AI is rapidly expanding demand for memory and some logic chips. It accordingly revised its 2024 global semiconductor sales forecast from the $588.364 billion estimated on November 28, 2023 to $611.231 billion, up 16.0% year-on-year, surpassing 2022's $574.084 billion and setting a new record.
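As a quick sanity check, the quoted figures can be related to one another with simple arithmetic. This is a sketch; the implied 2023 base is derived here from the stated 16.0% growth rate, not a number quoted from the WSTS report:

```python
# Sanity check of the WSTS figures quoted above (all in US$ billions).
prev_forecast_2024 = 588.364   # estimate published November 28, 2023
new_forecast_2024 = 611.231    # revision published June 4, 2024
record_2022 = 574.084          # previous record, set in 2022
yoy_growth = 0.160             # forecast 2024 year-on-year growth

# Size of the upward revision itself
revision_pct = (new_forecast_2024 / prev_forecast_2024 - 1) * 100

# 2023 base implied by the stated 16.0% year-on-year growth (derived, not quoted)
implied_2023 = new_forecast_2024 / (1 + yoy_growth)

print(f"upward revision: +{revision_pct:.1f}%")
print(f"implied 2023 sales: ${implied_2023:.1f}B")
print(f"exceeds 2022 record by ${new_forecast_2024 - record_2022:.1f}B")
```

So the June revision is itself only about a 3.9% lift on the November estimate; the headline 16.0% figure is growth over 2023, not the size of the revision.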

WSTS also revised its 2024 global memory sales forecast from the previous estimate of $129.768 billion to $163.153 billion, an increase of 76.8%.

According to reports, Micron is expanding its HBM-related R&D facilities at its headquarters in Boise, Idaho, including production and verification lines. Micron is also considering building HBM production capacity in Malaysia, where it already has chip testing and assembly plants.

Micron's largest HBM production base, in Taichung City, Taiwan, is also being expanded. Micron has set a target of raising its HBM market share to 24-26% by the end of 2025, close to its share of the traditional DRAM market (about 23-25%).

According to BoBanTang, Samsung's SSD and memory products have just come off their 618 shopping-festival promotions, and prices may be raised in the third quarter, with the expected increases as follows:

SSD - 15% increase

DDR4 - 10% increase

DDR5 - 15% increase

eMMC - increase not yet determined

As one of the world's largest chip manufacturers, Samsung has long relied on its semiconductor business as a cash cow. In 2023, however, the global memory chip market fell into an unprecedented slump, and Samsung suffered heavy losses, struggling with inventory and weak demand for several consecutive quarters. Under the weight of high inventory, low demand, and falling prices, it had to protect profitability through production cuts and other measures. After a long struggle, Samsung's DRAM and NAND flash businesses finally saw light after entering 2024, with the January monthly accounts showing a return to profitability.

Although market demand is gradually weakening, both the DRAM and NAND flash markets face significant pricing challenges, and with memory manufacturers having ended their production cuts, the industry expects prices will not rise in the short term. Samsung, however, still appears determined to keep raising its memory and SSD prices.

The average capacity of NB DRAM is expected to increase by at least 7% in 2025.

TrendForce's observations indicate that in 2024, major cloud service providers (CSPs) such as Microsoft, Google, Meta, and AWS will continue to be the main customers for purchasing high-end AI servers primarily used for training, serving as the foundation for large language models (LLMs) and AI modeling. Once these CSPs have gradually established a certain number of AI training server infrastructures in 2024, they will more actively expand from cloud to edge AI in 2025. This includes the development of smaller LLM models and the construction of edge AI servers, promoting the application of their enterprise customers in various fields such as manufacturing, finance, healthcare, and business.

Furthermore, since AI PCs or notebooks have a basic architectural composition similar to AI servers and possess a certain level of computing power capable of running smaller LLMs and generative AI (GenAI) applications, they are expected to become the last mile for CSPs to connect cloud AI infrastructure with small-scale edge AI training or inference applications.

Looking at chip manufacturers' recent moves in AI PCs, Intel's Lunar Lake and AMD's Strix Point are both SoCs that debuted at the COMPUTEX exhibition and meet the AI PC standards. Because the PC models built on Lunar Lake and Strix Point are not expected to launch until the fourth and third quarters of this year, respectively, PC OEM booth staff could only describe their specifications and performance verbally. Even so, these two SoCs and the PCs built on them were the main focus of market attention at the event.

In terms of brands, ASUS and Acer, having launched models with Qualcomm's Snapdragon X Elite in May, also unveiled Lunar Lake and Strix Point models at COMPUTEX, while MSI launched only Lunar Lake and Strix Point models. Suggested retail prices for the new AI PC models range from US$1,399 to US$1,899.

In 2025, as AI applications improve at handling complex tasks, delivering a better user experience, and boosting productivity, consumer demand for smarter, more efficient terminal devices will grow rapidly. AI notebook penetration is expected to climb quickly to 20.4%, and the AI notebook wave is also expected to drive growth in DRAM content per machine.

The average capacity of NB DRAM will grow 12% year-on-year, from 10.5GB in 2023 to 11.8GB in 2024. Looking ahead to 2025, as AI notebook penetration rises from 1% in 2024 to 20.4%, and every AI notebook carries at least 16GB of DRAM, the overall average capacity should increase by at least 0.8GB, a gain of at least 7%.
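Those capacity figures can be reproduced with a simple weighted average. This assumes (my assumption, not stated in the article) that non-AI notebooks stay at the 2024 average of 11.8GB while AI notebooks carry exactly the 16GB minimum:

```python
# Back-of-the-envelope check of TrendForce's 2025 NB DRAM capacity figures.
base_2024 = 11.8          # GB, 2024 average NB DRAM capacity
ai_share_2025 = 0.204     # 20.4% AI notebook penetration forecast for 2025
ai_min_capacity = 16.0    # GB, minimum DRAM in an AI notebook

# Weighted average: AI notebooks at the 16 GB floor, the rest at the 2024 average
avg_2025 = ai_share_2025 * ai_min_capacity + (1 - ai_share_2025) * base_2024
growth_gb = avg_2025 - base_2024
growth_pct = growth_gb / base_2024 * 100

print(f"2025 average ≈ {avg_2025:.2f} GB (+{growth_gb:.2f} GB, +{growth_pct:.1f}%)")
```

Under these assumptions the 2025 average works out to roughly 12.66GB, a gain of about 0.86GB or 7.3%, consistent with the article's "at least 0.8GB / at least 7%" floor (since 16GB is a minimum, not an average).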

Beyond raising the average capacity of NB DRAM, AI notebooks will also drive demand for power-efficient, high-frequency memory. This is a scenario where LPDDR's advantages over DDR stand out, accelerating the trend of LPDDR replacing DDR. For designs that previously used DDR SO-DIMMs for expandability, switching to the modular LPDDR form factor LPCAMM is an option alongside soldering LPDDR on board.