GigaIO Pivots to Edge AI, Sells Datacenter Tech to d-Matrix
- $400 billion: The global edge AI market is projected to grow from under $50 billion in 2026 to nearly $400 billion by 2034.
- 7,300 TFLOPS: A five-unit Gryf SWARM configuration delivers over 7,300 TFLOPS of FP8 performance, rivaling stationary datacenter racks.
- $450 million: d-Matrix is backed by $450 million in total funding, enhancing its financial firepower in the AI inference market.
This strategic realignment reflects a broader industry trend toward specialization: GigaIO and d-Matrix are optimizing for edge and datacenter AI infrastructure, respectively, each aiming to innovate faster in its own domain.
GigaIO Pivots to Edge AI After Selling Core Datacenter Assets to d-Matrix
CARLSBAD, Calif. – April 02, 2026 – In a significant strategic realignment shaping the future of artificial intelligence infrastructure, GigaIO has sold its core datacenter technologies to AI inference pioneer d-Matrix. The deal sees GigaIO divest its award-winning SuperNODE™ platform and patented FabreX™ AI fabric, allowing the company to pivot entirely to the burgeoning edge computing market. For d-Matrix, the acquisition marks a major power play, arming it with advanced technology and key engineering talent to intensify its challenge against established leaders in the datacenter AI hardware space.
The transaction is the culmination of a year-long partnership that saw the two companies integrate d-Matrix’s Corsair™ inference platform with GigaIO’s architecture. Now, with the sale complete, both companies are pursuing highly specialized, yet complementary, paths in the rapidly expanding AI ecosystem.
A Decisive Bet on the Edge
GigaIO's move represents a full-throated commitment to what it sees as the next frontier: bringing high-performance computing out of the datacenter and into the field. By shedding its datacenter-focused assets, the company is channeling all its resources into its flagship product, Gryf, a system it bills as the world's first carry-on suitcase-sized AI supercomputer.
This strategic pivot is grounded in explosive market projections. The global edge AI market, which enables data processing closer to its source, is forecast to skyrocket from under $50 billion in 2026 to nearly $400 billion by 2034. This growth is driven by an insatiable demand for real-time analytics and low-latency decision-making in sectors ranging from autonomous systems to industrial IoT.
“The edge market has a huge upside, with increased need to deploy new, modern, meaningful compute closer to the users, and that’s what GigaIO is going to be focused on,” said Alan Benjamin, CEO of GigaIO. “We are excited to rethink the long-standing paradigm of stripped-down capabilities at the edge and instead deliver a new approach with mobile datacenter-class hardware.”
GigaIO aims to capture this market by solving the critical challenges of portability and power. Its Gryf system is designed to deliver computational performance previously confined to climate-controlled server rooms, making it ideal for industries where data needs to be processed on-site and immediately. The company has already reported strong interest from the defense and intelligence communities, sports and media organizations, and the energy sector.
d-Matrix Fortifies its Arsenal for the Datacenter Wars
While GigaIO turns its attention to the field, d-Matrix is digging deeper into the datacenter. The acquisition of GigaIO’s SuperNODE platform, rack-scale engineering talent, and, most critically, the FabreX technology, significantly enhances its capabilities. FabreX is a PCIe Gen 5-based fabric that enables ultra-low latency communication between processors and accelerators, a key ingredient for building massively scalable AI systems.
d-Matrix is positioning itself to challenge what GigaIO's CEO called the “existing hegemony” in the AI hardware market, a clear reference to dominant players like NVIDIA. By integrating FabreX with its own Corsair inference accelerators, d-Matrix can now offer more complete, system-level solutions. The company’s core technology, a proprietary Digital In-Memory Compute (DIMC) architecture, is already designed to overcome memory bottlenecks that can hinder the performance of large language models (LLMs).
With claims of up to 10x faster performance and significantly better energy efficiency than traditional GPU-based systems, d-Matrix is making a compelling case for specialized inference hardware. The acquisition accelerates its roadmap, providing the tools to solve what it calls a “systems problem” that requires efficient data movement across chips, nodes, and entire racks.
Backed by an impressive $450 million in total funding, d-Matrix now has the technological assets and financial firepower to make a serious run at the lucrative AI inference market, a segment projected to be worth over $300 billion by the next decade.
Gryf: A Datacenter in a Suitcase
At the heart of GigaIO's new strategy is the Gryf system, a marvel of engineering that packs staggering performance into a portable, ruggedized chassis. Co-designed with SourceCode, the sub-55-pound unit is TSA-friendly, complete with a handle and wheels, yet it houses the power to run complex AI workloads anywhere.
Its modular design features six internal sleds that can be configured with a mix of high-end components, including AMD EPYC CPUs and powerful NVIDIA H100 or L40S GPUs. This allows users to customize the hardware for specific missions, from real-time video analytics in a broadcast truck to cybersecurity operations in a tactical environment.
The secret sauce enabling this power and flexibility is the same FabreX technology now owned by d-Matrix for datacenter use. GigaIO retains rights for its edge products, using the fabric to disaggregate and compose resources dynamically. This technology allows multiple Gryf units to be interconnected into a “SWARM” configuration. A five-unit SWARM can deliver over 7,300 TFLOPS of FP8 performance, rivaling stationary datacenter racks while remaining fully mobile.
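The SWARM throughput figure above implies roughly 1,460 TFLOPS of FP8 per Gryf unit (7,300 ÷ 5). A minimal sketch of that back-of-the-envelope scaling, assuming aggregate performance grows roughly linearly with unit count (a simplification; real fabric overhead would reduce it), with the per-unit figure inferred from the cited five-unit total rather than from any published spec:

```python
# Rough estimate of aggregate FP8 throughput for a Gryf SWARM.
# Assumptions: ~1,460 TFLOPS FP8 per unit (inferred from the article's
# 7,300 TFLOPS five-unit figure) and near-linear scaling over FabreX.

def swarm_fp8_tflops(units: int, tflops_per_unit: float = 1460.0) -> float:
    """Estimate aggregate FP8 TFLOPS for a SWARM of `units` Gryf systems."""
    return units * tflops_per_unit

# Five units reproduce the figure cited above: 5 * 1460 = 7300 TFLOPS.
print(swarm_fp8_tflops(5))
```

In practice, interconnect latency and workload partitioning would determine how close a real SWARM gets to this linear ceiling.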
By processing data on-site, Gryf eliminates the latency and security risks of transmitting massive datasets to the cloud. This capability is transformative for applications like live sports analytics, on-location media production, and critical intelligence, surveillance, and reconnaissance (ISR) missions where every millisecond counts.
A Market Dividing to Conquer
The transaction between GigaIO and d-Matrix highlights a broader trend of specialization within the AI industry. As AI becomes more pervasive, a one-size-fits-all hardware approach is proving insufficient. This deal effectively creates two highly focused companies attacking different ends of the infrastructure spectrum: d-Matrix is optimizing for raw scale, efficiency, and performance within the core datacenter, while GigaIO is optimizing for portability, ruggedness, and real-time processing at the distributed edge. This strategic divergence allows both to innovate faster in their respective domains, potentially accelerating the deployment of advanced AI capabilities across all sectors of the economy.