Memory Makers on Track to Double HBM Output in 2023

TrendForce projects a remarkable 105% increase in annual bit shipments of high-bandwidth memory (HBM) this year. This boost comes in response to soaring demand from developers of AI and high-performance computing processors, most notably Nvidia, and from cloud service providers (CSPs). To meet demand, Micron, Samsung, and SK Hynix are reportedly increasing their HBM capacities, but new production lines will likely start operations only in Q2 2024.

More HBM Is Needed

Memory makers managed to more or less match HBM supply and demand in 2022, a rare occurrence in the DRAM market. However, an unprecedented demand spike for AI servers in 2023 forced developers of appropriate processors (most notably Nvidia) and CSPs to place additional orders for HBM2E and HBM3 memory. This made DRAM makers use all of their available capacity and start placing orders for additional tools to expand their HBM production lines to meet demand for HBM2E, HBM3, and HBM3E memory going forward.

However, meeting this HBM demand is not easy. In addition to making more DRAM devices in their cleanrooms, DRAM producers have to assemble those memory devices into intricate 8-Hi or 12-Hi stacks, and here they appear to have a bottleneck: they do not have enough TSV production tools, according to TrendForce. To produce enough HBM2, HBM2E, and HBM3 memory, leading DRAM producers have to procure new equipment, which takes 9 to 12 months to be made and installed in their fabs. As a result, a substantial hike in HBM output is expected around Q2 2024, the analysts say.

A noteworthy trend pinpointed by TrendForce analysts is the shifting preference from HBM2E (used by AMD's Instinct MI210/MI250/MI250X, Intel's Sapphire Rapids HBM and Ponte Vecchio, and Nvidia's H100/H800 cards) to HBM3 (incorporated in Nvidia's H100 SXM and GH200 supercomputer platform, as well as AMD's forthcoming Instinct MI300-series APUs and GPUs). TrendForce believes that HBM3 will account for 50% of all HBM memory shipped in 2023, while HBM2E will account for 39%. In 2024, HBM3 is poised to account for 60% of all HBM shipments. This rising demand, combined with HBM3's higher price point, promises to boost HBM revenue in the near future.

Just yesterday, Nvidia launched a new version of its GH200 Grace Hopper platform for AI and HPC that uses HBM3E memory instead of HBM3. The new platform, consisting of a 72-core Grace CPU and a GH100 compute GPU, boasts higher memory bandwidth for the GPU, and it carries 144 GB of HBM3E memory, up from 96 GB of HBM3 in the case of the original GH200. Considering the immense demand for Nvidia's AI offerings, Micron, which will be the only supplier of HBM3E in 1H 2024, stands a high chance of benefiting significantly from the freshly launched hardware that HBM3E powers.

HBM Is Getting Cheaper, Kind Of

TrendForce also noted a consistent year-over-year decline in HBM average selling prices (ASPs). To invigorate interest and offset decreasing demand for older HBM models, prices for HBM2E and HBM2 are set to drop in 2023, according to the market tracking firm. With 2024 pricing still undecided, further reductions for HBM2 and HBM2E are anticipated due to increased HBM production and manufacturers' growth ambitions.

In contrast, HBM3 prices are predicted to remain stable, perhaps because, at present, it is only available from SK Hynix, and it will take some time for Samsung to catch up. Given its higher price compared to HBM2E and HBM2, HBM3 could push HBM revenue to an impressive $8.9 billion by 2024, marking a 127% year-over-year increase, according to TrendForce.
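As a rough sanity check on TrendForce's projection, the 2023 revenue baseline implied by those two figures (an assumed back-of-envelope derivation, not a number the firm published) can be computed directly:

```python
# TrendForce projects ~$8.9B in HBM revenue for 2024, a 127% YoY increase.
# Working backwards gives the implied 2023 revenue baseline.
revenue_2024 = 8.9          # billions of USD, TrendForce projection
yoy_growth = 1.27           # 127% year-over-year increase

revenue_2023 = revenue_2024 / (1 + yoy_growth)
print(f"Implied 2023 HBM revenue: ${revenue_2023:.1f}B")  # -> ~$3.9B
```

In other words, the projection implies HBM revenue of roughly $3.9 billion in 2023, more than doubling in a single year.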

SK Hynix Leading the Pack

SK Hynix commanded 50% of the HBM memory market in 2022, followed by Samsung with 40% and Micron with a 10% share. Between 2023 and 2024, Samsung and SK Hynix will continue to dominate the market, holding nearly identical stakes that sum to about 95%, TrendForce projects. Meanwhile, Micron's market share is expected to hover between 3% and 6%.

For now, SK Hynix appears to have an edge over its rivals. It is the primary producer of HBM3 and the only company supplying that memory for Nvidia's H100 and GH200 products. By comparison, Samsung predominantly manufactures HBM2E, catering to other chipmakers and CSPs, and is gearing up to start making HBM3. Micron, which does not have HBM3 on its roadmap, produces HBM2E (which Intel reportedly uses for its Sapphire Rapids HBM CPU) and is preparing to ramp up production of HBM3E in 1H 2024, which could give it a significant competitive advantage over its rivals, which are expected to start making HBM3E only in 2H 2024.
