Nvidia’s stock price took a hammering recently, but it’s still the big beast in the artificial intelligence (AI) hardware market. The latest buzzwords to listen out for were “agentic AI” and “reasoning” as the company announced its AI Data Platform, which had the storage suppliers trailing around after it.
Core to Nvidia’s announcements at its recent GTC 2025 event in San Jose, California, was its next-generation Blackwell Ultra graphics processing unit (GPU) for AI datacentre processing, which it says is designed for reasoning models such as DeepSeek R1 and boosts memory and inference performance.
And with Blackwell Ultra at the core, Nvidia also looked ahead to a slew of rack-scale platform products in its GB/NVL line that incorporate it, plus new DGX family SuperPOD clusters, workstations, network interface cards (NICs), GPUs for laptops, and so on.
That is all a bit of a pushback against the revelation that DeepSeek is more efficient and less GPU-hungry than previously seen in, for example, ChatGPT. Nvidia has used such news to assert that we’re going to need even more fast AI processing to make the most of it.
Of course, the big storage suppliers need these kinds of input/output requirements like pharmaceutical companies need disease. The requirement to process huge amounts of data for AI training and inference brings the need for storage, lots of it, with the ability to deliver very high speeds and volumes of access to data.
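To put rough numbers on that, here is a minimal back-of-envelope sketch of the sustained read bandwidth a training cluster can demand from its storage. The GPU count, per-GPU throughput and sample size are illustrative assumptions, not figures from Nvidia or any storage supplier.

```python
# Back-of-envelope estimate of the aggregate read bandwidth needed to keep
# a GPU training cluster fed with data. All inputs are assumed, illustrative values.
num_gpus = 1024                  # assumed cluster size
samples_per_gpu_per_sec = 2000   # assumed training throughput per GPU
bytes_per_sample = 200 * 1024    # assumed average sample size (about 200 KiB)

required_gb_per_sec = num_gpus * samples_per_gpu_per_sec * bytes_per_sample / 1e9
print(f"Sustained read bandwidth needed: ~{required_gb_per_sec:.0f} GB/s")
# Prints roughly 419 GB/s for these assumptions: the sort of sustained,
# parallel throughput that drives the storage requirements described above.
```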
So, core to announcements at GTC 2025 for storage was the Nvidia AI Data Platform reference architecture, which allows third-party suppliers – with storage players key among them – to build their appliances to the GPU giant’s specifications for the workloads that will run on it, which include agentic and reasoning techniques.
Those namechecked as working with Nvidia include DDN, Dell, HPE, Hitachi Vantara, IBM, NetApp, Pure Storage, Vast Data and Weka.
In slightly more detail, the announcements by these storage players around GTC included the following.
DDN launched its Inferno fast object appliance, which adds Nvidia’s Spectrum-X switch to DDN Infinia storage. Infinia is based on a key:value store with access protocols on top, but currently only for S3 object storage.
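Because S3 is currently Infinia’s only access protocol, applications talk to it the same way they would to any S3-compatible object store. The sketch below shows that pattern using boto3; the endpoint URL, bucket name and credentials are placeholders for illustration, not real DDN or Infinia values.

```python
# Minimal sketch of S3-style object access, the protocol DDN Infinia currently
# exposes on top of its key:value store. Endpoint, bucket and credentials are
# placeholder assumptions for illustration only.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://infinia.example.internal:9000",  # hypothetical S3 endpoint
    aws_access_key_id="EXAMPLE_ACCESS_KEY",
    aws_secret_access_key="EXAMPLE_SECRET_KEY",
)

bucket = "training-data"  # assumed bucket name

# Write an object, then read it back over the S3 API
s3.put_object(Bucket=bucket, Key="datasets/sample-001.bin", Body=b"example payload")
response = s3.get_object(Bucket=bucket, Key="datasets/sample-001.bin")
print(response["Body"].read())
```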
Dell announced a whole range of things, including 20-petaflop-scale PCs aimed at AI use cases. In storage, it focused on its PowerScale scale-out file system now being validated for Nvidia’s Cloud Partner Program enterprise AI factory deployment.
HPE made a big deal of its new “unified data layer” that will include structured and unstructured data across the enterprise, while it announced some upgrades, namely unified block and file access in its MP B10000 array.
Hitachi Vantara took the opportunity to launch the Hitachi iQ M Series, which combines its Virtual Storage Platform One (VSP One) storage and Nvidia AI Enterprise software and which will integrate the Nvidia AI Data Platform reference design, aimed at agentic AI.
IBM announced new collaborations with Nvidia that included planned integrations based on the Nvidia AI Data Platform reference design. IBM plans to launch a content-aware storage capability for its hybrid cloud infrastructure offering, IBM Fusion, and will expand its Watsonx integrations. Also, it plans new IBM Consulting capabilities for AI customer projects.
NetApp announced Nvidia validation for SuperPOD. In particular, the AFF A90 product gets DGX SuperPOD validation. Meanwhile, NetApp’s AIPod has received the new Nvidia-Certified Storage designation to support Nvidia Enterprise Reference Architectures.
Pure Storage, hot on the heels of its FlashBlade//Exa announcement, took the opportunity to reveal compatibility with the Nvidia AI Data Platform.
Vast Data launched its enterprise-ready AI Stack, which combines Vast’s InsightEngine and Nvidia DGX products, BlueField-3 DPUs, and Spectrum-X networking.
Weka announced it had achieved data store certification for Nvidia GB200 deployments. WEKApod Nitro Data Platform Appliances have been certified for Nvidia Cloud Partner (NCP) deployments with HGX H200, B200 and GB200 NVL72 products.