WEKA, NVIDIA Launch Platform to Industrialize Enterprise AI
- 6.5x efficiency gain: WEKA's platform achieves 6.5 times more tokens per GPU for inference workloads, boosting the economic viability of large-scale AI.
- Exabyte-scale data handling: NeuralMesh™ AI Data Platform is designed to manage data volumes growing into the exabyte range.
- Turnkey deployment: The platform reduces AI deployment timelines from months to minutes.
Industry observers view the WEKA and NVIDIA collaboration as a critical step in industrializing enterprise AI, one that addresses scalability challenges and enables rapid, cost-effective deployment of AI Factories.
SAN JOSE, CA – March 16, 2026 – A significant barrier for enterprises looking to capitalize on artificial intelligence has been the leap from a successful experiment to a profitable, large-scale operation. Addressing this challenge head-on, AI data platform company WEKA, in a deep collaboration with NVIDIA, today announced the general availability of its NeuralMesh™ AI Data Platform (AIDP). Unveiled at NVIDIA's GTC 2026 conference, the turnkey solution aims to slash AI deployment timelines from months to minutes, providing the critical infrastructure for what the industry is calling "AI Factories."
The Dawn of the AI Factory
The concept of the "AI Factory" has rapidly moved from a futuristic buzzword to a strategic imperative for large enterprises. Coined and popularized by industry leaders like NVIDIA, it describes a systematic, industrial-grade pipeline for producing intelligence. Much like a physical factory transforms raw materials into finished goods, an AI Factory ingests vast quantities of raw data, processes it, and outputs trained, optimized, and deployable AI models.
This new paradigm is essential for operationalizing advanced AI, particularly "agentic AI"—systems that can autonomously plan, reason, and execute complex, multi-step tasks. However, building these factories presents immense infrastructural challenges. Traditional data storage and management systems often buckle under the pressure, creating bottlenecks that stall projects in the proof-of-concept (POC) stage.
"Enterprises are now deploying AI Factories internally, driving a major shift to inference throughout the ecosystem," said Liran Zvibel, cofounder & CEO at WEKA, in a statement. "These companies require rapid AI outcomes and need turnkey solutions that come with the enterprise table-stakes of reliability, security, and optimal price-performance and cost-effectiveness."
An Architecture Built for Scale
WEKA's NeuralMesh platform is engineered to solve this scalability problem with a fundamentally different approach. Built on a foundation of over 170 patents, its software-defined, adaptive mesh architecture is designed to get faster and more resilient as data volumes grow into the exabyte range—a direct inversion of legacy systems that tend to slow down under load.
A key innovation is the platform's Augmented Memory Grid™. This technology directly tackles the "memory wall" that often limits GPU performance during AI inference tasks. By creating a vast, shared memory pool, it allows AI models to maintain context without constantly re-loading data, a critical function for responsive agentic AI. WEKA claims customers using this feature can achieve 6.5 times more tokens per GPU for inference workloads, a massive efficiency gain that directly impacts the economic viability of large-scale AI.
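The efficiency argument behind a shared context tier can be illustrated with a toy sketch. The snippet below is not WEKA's implementation; it is a hypothetical Python model of the general idea: if previously computed inference context (such as a key-value cache for a long shared prompt prefix) persists in a shared tier, repeated requests can reuse it instead of recomputing it on the GPU. All names here (`ContextCache`, `get_or_compute`) are invented for illustration.

```python
# Illustrative sketch only, NOT WEKA's Augmented Memory Grid: a toy
# stand-in for a shared tier that holds previously computed inference
# context so repeated requests skip the expensive "prefill" step.
import hashlib


class ContextCache:
    """Maps a prompt prefix to its (simulated) precomputed context."""

    def __init__(self):
        self._store = {}
        self.hits = 0      # requests served from cached context
        self.misses = 0    # requests that had to recompute context

    def _key(self, prompt_prefix: str) -> str:
        # Hash the prefix so the cache key is compact and deterministic.
        return hashlib.sha256(prompt_prefix.encode()).hexdigest()

    def get_or_compute(self, prompt_prefix: str) -> str:
        k = self._key(prompt_prefix)
        if k in self._store:
            self.hits += 1          # context reused: no recompute needed
        else:
            self.misses += 1        # first request: simulate costly prefill
            self._store[k] = f"context-for-{len(prompt_prefix)}-chars"
        return self._store[k]


cache = ContextCache()
system_prompt = "You are a helpful agent. " * 50   # long shared prefix
for _ in range(10):                                # ten requests, same prefix
    cache.get_or_compute(system_prompt)

print(cache.misses, cache.hits)    # context computed once, reused nine times
```

In this toy model, ten requests that share a prefix trigger only one expensive computation; the claimed tokens-per-GPU gain in production systems comes from the same principle applied to real attention caches held in a fast shared memory pool rather than a Python dictionary.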
"The missing piece in production AI isn't reasoning models or compute power. It's having an efficient platform that unifies the AI Factory pipeline and makes it truly scalable," noted Shimon Ben-David, CTO at WEKA. "The NeuralMesh AIDP was designed to close AI's production and profitability gap, taking enterprise experiments to full-scale operations."
A Turnkey Ecosystem Powered by NVIDIA
The NeuralMesh AIDP is not a standalone product but the centerpiece of a tightly integrated ecosystem, with NVIDIA at its core. The platform is based on the NVIDIA AI Data Platform reference design, a blueprint that combines NVIDIA's accelerated computing with enterprise storage to create a high-performance data backbone for AI.
This integration means the WEKA solution is optimized to work seamlessly with NVIDIA's latest technologies, including its RTX GPUs, BlueField-3 DPUs for data processing, and Spectrum-X Ethernet networking for high-speed connectivity. This deep collaboration provides what NVIDIA describes as a critical component for next-generation AI.
"The deployment of agentic AI in production demands a new focus on managing the continuous, coherent flow of data and inference context," explained Jason Hardy, vice president of storage technologies at NVIDIA. "By leveraging the NVIDIA AI Data Platform, solutions like WEKA's NeuralMesh AIDP deliver the persistent context tier necessary for stable and high-scale agentic inference."
The turnkey nature of the platform is further strengthened by pre-integrations with other key enterprise technology providers. Hardware options are available from partners like Supermicro, while software and orchestration layers are supported by Red Hat and Spectro Cloud, simplifying deployment in complex corporate IT environments. This approach is intended to eliminate months of custom integration work, allowing teams to focus on generating business value rather than managing infrastructure.
"By using the NeuralMesh AI Data Platform with Red Hat AI Enterprise, based on Red Hat OpenShift, organizations can run data-intensive AI pipelines across on-premises and cloud environments at the scale that enterprise production demands," said Ryan King, vice president, AI and Infrastructure Partners at Red Hat.
From Financial Markets to Drug Discovery
The ultimate goal of this powerful infrastructure is to accelerate real-world outcomes across industries. The platform comes with ready-to-use pipelines for a range of business applications, moving beyond abstract technology to tangible solutions.
In Health & Life Sciences, it can accelerate complex workflows like cryo-electron microscopy and help researchers identify patient subgroups across massive datasets, potentially speeding up drug discovery. For Financial Services, the platform enables early market signal detection by processing data in real-time and can create secure, institutional knowledge bases. Public Sector agencies can use it to detect threats based on contextual meaning rather than simple keywords, while robotics applications can shorten the loop from data capture to model retraining, improving fleet performance.
The launch places WEKA in a competitive and rapidly evolving market. Other major storage and infrastructure players, including Pure Storage, NetApp, and Dell, are also aggressively pursuing the AI Factory model, many through their own partnerships with NVIDIA. This market-wide shift underscores the industry's consensus that the future of AI is not in isolated experiments, but in scalable, repeatable, and profitable production systems. The NeuralMesh AI Data Platform, delivered as a complete appliance-style system, is WEKA's bid to provide the definitive engine for that future.