Solidigm tackles AI memory bottleneck with dense storage
AI servers consume roughly eight times more memory than traditional servers, creating bottlenecks that threaten to slow AI inferencing and model deployment. Solidigm is addressing this with high-density quad-level cell (QLC) NAND and a 122-terabyte SSD, with plans to double that capacity, a density gain the company says also improves data center energy efficiency.
The company is positioning itself at the intersection of storage and AI compute, capitalizing on the surging inference demand highlighted by Nvidia's Jensen Huang at GTC. Solidigm's floating-gate NAND approach enables higher bits-per-cell density while preventing data disturbance between adjacent cells, which is critical for large-scale AI workloads.
