SK Hynix, Sandisk Collaborate to Standardize Next-Gen Memory for AI Inference

  • SK Hynix and Sandisk have launched a joint workstream under the Open Compute Project (OCP) to standardize a new memory solution called High Bandwidth Flash (HBF).
  • HBF is positioned as a memory layer between High Bandwidth Memory (HBM) and Solid State Drives (SSDs), designed to improve scalability and power efficiency for AI inference workloads.
  • The standardization effort aims to establish HBF as an industry standard, fostering growth within the AI ecosystem.
  • Industry forecasts suggest demand for complex memory solutions like HBF will increase around 2030.
  • The initiative leverages SK Hynix and Sandisk's existing expertise in HBM and NAND flash memory design, packaging, and mass production.

The AI industry's shift from model training to inference creates a critical need for faster, more efficient memory solutions. SK Hynix and Sandisk's collaboration to standardize HBF addresses this need by filling the performance and capacity gap between existing HBM and SSD technologies. The move also signals a broader trend toward system-level optimization in AI infrastructure, where memory choices are becoming increasingly central to overall performance and cost-effectiveness.

Adoption Rate
How quickly HBF gains traction in the AI inference market will depend on the willingness of hardware vendors and cloud providers to adopt the new standard; slow uptake could push widespread deployment beyond 2030.
Competitive Response
Other memory manufacturers are likely to respond to the HBF standardization effort, potentially with competing memory architectures that fragment the AI inference memory landscape.
Ecosystem Integration
How effectively SK Hynix and Sandisk can integrate HBF with existing CPU, GPU, and memory architectures will be critical to realizing the promised total cost of ownership (TCO) reductions and scalability improvements.