According to the latest survey from market research firm TrendForce, demand for AI servers continues to accelerate the development of HBM technology, and all three major suppliers are actively advancing their HBM4 product roadmaps.
TrendForce points out that HBM4 introduces more complex chip designs: the sharp increase in I/O count enlarges the die size, and some suppliers are also shifting toward logic-based base die architectures to improve performance, both of which raise production costs. For reference, the initial price premium of HBM3e was around 20%; the higher manufacturing difficulty of HBM4 is expected to result in a premium of over 30%.

TrendForce notes that, compared with the previous generation, HBM4 doubles the I/O count from 1,024 to 2,048 while maintaining a per-pin data transfer rate of over 8.0 Gbps. With twice as many channels running at the same speed, HBM4 can therefore deliver twice the data throughput.
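As a rough back-of-the-envelope check of that claim, per-stack bandwidth scales with interface width times per-pin data rate. A minimal sketch, assuming the 2,048-bit interface and 8 Gbps per-pin rate cited above and comparing both generations at that same pin rate (the function name and variables are illustrative only):

```python
# Back-of-the-envelope HBM bandwidth estimate:
# width (bits) x per-pin rate (Gbit/s) / 8 -> GB/s per stack.
# I/O counts and the 8 Gbps pin rate follow the article; everything else is illustrative.

def stack_bandwidth_gb_s(io_count: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s."""
    return io_count * pin_rate_gbps / 8  # divide by 8 to convert Gbit/s to GB/s

# Previous generation (1,024 I/O) vs. HBM4 (2,048 I/O) at the same assumed pin rate.
prev_gen = stack_bandwidth_gb_s(io_count=1024, pin_rate_gbps=8.0)  # ~1,024 GB/s
hbm4     = stack_bandwidth_gb_s(io_count=2048, pin_rate_gbps=8.0)  # ~2,048 GB/s (~2 TB/s)

print(f"1,024-bit stack: {prev_gen:,.0f} GB/s")
print(f"2,048-bit stack: {hbm4:,.0f} GB/s ({hbm4 / prev_gen:.1f}x)")
```

At the same pin speed, doubling the interface width doubles the throughput, which is the roughly 2 TB/s per stack implied for HBM4.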
Driven by strong demand, TrendForce forecasts that total HBM shipments will exceed 30 billion Gb by 2026. As suppliers ramp up production, HBM4's market share is expected to grow steadily and surpass HBM3e as the mainstream solution in the second half of 2026. SK Hynix is expected to maintain its leading position with more than a 50% market share, while Samsung and Micron will need to further improve yields and production capacity to narrow the gap in the HBM4 race.