Mirantis, Inc.

https://www.mirantis.com/

Mirantis, Inc. is a B2B open-source cloud computing software and services company headquartered in Campbell, California, USA. The company's mission is to empower developers and innovators by automating the discovery, integration, and operation of cloud and open-source technologies, enabling organizations to achieve digital self-determination through control over their strategic infrastructure. Mirantis focuses on AI infrastructure management platforms, along with container and cloud infrastructure platforms built on Kubernetes and OpenStack.

Mirantis offers a comprehensive suite of products and services designed to manage and operate virtual machines, containers, Kubernetes, and cloud environments. Key products include Mirantis Container Cloud, Mirantis Kubernetes Engine (formerly Docker Enterprise), Mirantis Container Runtime, Mirantis OpenStack for Kubernetes, and Mirantis k0rdent Enterprise. The company also provides Lens, a popular Kubernetes IDE, and has expanded its offerings through acquisitions such as amazee.io (an application delivery platform) and Shipa (an application management framework). Mirantis serves a diverse range of market segments, including large enterprises, the public sector, financial services, and other highly regulated industries, with a strong emphasis on AI/ML infrastructure solutions.

Alex Freedland serves as the CEO of Mirantis. Recent leadership appointments in June 2025 include Richard Borenstein as Senior Vice President of Business Development and Jerry Ibrahim as Chief Technology Officer, Go-to-Market. The company has been actively focusing on AI infrastructure, with recent news in March and April 2026 highlighting its k0rdent AI offerings, partnerships with NVIDIA, and the introduction of Lens Agents for AI agent governance. Mirantis is recognized for its commitment to open-source, multi-cloud, and vendor-neutral solutions, and was named a Leader in the Omdia Universe: Container Management Products, 2024-25.

Latest updates

Lens Expands into AI Agent Governance, Challenging Cloud Vendor Lock-in

  • Lens, a Mirantis company, launched Lens Agents, a platform for governing AI agents across enterprise systems.
  • Lens Agents connects AI agents (including tools like Claude, Cursor, and Copilot) to enterprise systems with policy controls.
  • The platform is currently in early access and builds on Lens’ existing Kubernetes IDE, used by over 1 million developers.
  • Lens Agents introduces features like sandboxed execution, server-side credential injection, and active cost controls.
  • The move represents a strategic expansion for Lens beyond its core Kubernetes IDE offering.
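Server-side credential injection, one of the controls listed above, keeps secrets out of the agent's context entirely: a gateway holds the credential and attaches it only when forwarding the request upstream. A minimal sketch of the pattern (hypothetical names and store; not Lens's actual implementation):

```python
# Hypothetical gateway-side sketch: the agent composes requests with no
# secrets; the gateway injects the credential just before the upstream call.
SECRET_STORE = {"github": "ghp_example_token"}  # visible only to the gateway

def inject_credentials(request: dict, service: str) -> dict:
    """Return a copy of the agent's request with the service credential added."""
    forwarded = dict(request)
    headers = dict(forwarded.get("headers", {}))
    headers["Authorization"] = f"Bearer {SECRET_STORE[service]}"
    forwarded["headers"] = headers
    return forwarded

# The agent's request never contains the token.
agent_request = {"method": "GET", "url": "https://api.github.com/user", "headers": {}}
upstream = inject_credentials(agent_request, "github")
```

Because the token only ever exists on the gateway side, a compromised or misbehaving agent cannot exfiltrate it from its own context.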

Lens' entry into AI agent governance directly challenges the prevailing model where cloud providers control AI execution within their environments. This move reflects a growing enterprise desire for greater autonomy and control over AI deployments, particularly as AI agent usage expands beyond centralized IT departments. The platform’s success will depend on its ability to offer a compelling alternative to vendor lock-in and address the increasing complexity of managing distributed AI workloads.

Governance Dynamics
The adoption rate of Lens Agents will hinge on whether enterprises prioritize centralized AI agent governance over the convenience of decentralized deployments.
Regulatory Headwinds
The platform’s alignment with emerging AI regulations, such as the EU AI Act, will be a key factor in its appeal to compliance-focused organizations.
Execution Risk
Lens’ ability to integrate with diverse AI agent frameworks and enterprise systems will determine the platform’s overall utility and scalability.

Mirantis Subsidiary Launches Managed AI Agent Platform Amid Data Sovereignty Concerns

  • amazee.ai, a Mirantis company, launched amazeeClaw, a managed hosting platform for OpenClaw AI agents.
  • amazeeClaw offers region-locked deployments across the U.S., Europe, and Australia, emphasizing data sovereignty.
  • The platform provides dedicated container isolation and enterprise-grade compliance (ISO 27001 and SOC 2 Type 2).
  • amazeeClaw is available now with a free trial, and a webinar is scheduled for April 29, 2026.
  • Mirantis acquired amazee.io in 2022, integrating it as a subsidiary, amazee.ai.

The launch of amazeeClaw reflects a growing market need for secure and compliant AI agent deployment solutions, as organizations grapple with the operational and regulatory challenges of moving AI initiatives from experimentation to production. This offering leverages Mirantis’s Kubernetes expertise and aligns with the broader trend of cloud-native architectures for AI, but also highlights the increasing importance of data residency and sovereignty in the AI landscape. The move also signals a strategic focus on specialized AI infrastructure services, a segment attracting significant investment.

Governance Dynamics
Increased demand for data sovereignty solutions will likely accelerate adoption of platforms like amazeeClaw, but also invite scrutiny from regulators regarding data handling practices.
Competitive Landscape
The success of amazeeClaw will depend on its ability to differentiate from existing managed Kubernetes services and other AI agent deployment platforms, particularly in terms of specialized compliance features.
Execution Risk
Mirantis’s ability to scale amazeeClaw’s infrastructure and support its customer base will be critical, given the complexities of managing isolated, region-locked AI agent deployments.

Mirantis Automates AI Factory Deployments with NVIDIA Integration

  • Mirantis is integrating NVIDIA Run:ai into its k0rdent AI platform to automate AI factory deployments.
  • The integration aims to reduce AI platform deployment time from weeks to minutes.
  • k0rdent AI automates the deployment of numerous components, including GPU operators, networking, and resource allocation.
  • The integration supports air-gapped deployments for regulated industries and sovereign AI environments.
  • Mirantis and NVIDIA have validated the integration through over 100 functional tests, achieving partner-certified status.

The increasing demand for private AI factories is creating a bottleneck in the operationalization of GPU infrastructure. This integration addresses a critical pain point for organizations struggling to move beyond GPU procurement to actual AI workload execution. By automating the complex deployment and lifecycle management of NVIDIA Run:ai, Mirantis is positioning itself to capitalize on the growing need for streamlined AI infrastructure solutions, particularly among neocloud providers and enterprises operating in regulated environments.

Adoption Rate
The speed at which enterprises adopt Mirantis's automated AI factory deployments will determine the platform's market penetration and Mirantis's ability to capture a significant share of the growing AI infrastructure market.
Competitive Landscape
Competitors will likely introduce similar automated AI factory solutions, potentially eroding Mirantis's first-mover advantage and requiring continuous innovation.
Scalability
How effectively Mirantis can scale its k0rdent AI platform to support increasingly complex and large-scale AI workloads will be critical for maintaining its value proposition and attracting enterprise clients.

Mirantis Bolsters OpenStack with AI Assistant, Energy Tracking

  • Mirantis released MOSK 26.1 on March 26, 2026, a major update to its OpenStack for Kubernetes platform.
  • The update includes an AI assistant for documentation and operational guidance, aiming to reduce troubleshooting time for cloud operators.
  • MOSK 26.1 introduces energy consumption tracking and workload-cost correlation capabilities, crucial for GPU-intensive environments.
  • Significant networking enhancements were added, including expanded OVN support, VPNaaS, QoS, and SR-IOV.
  • SBOMs in CycloneDX format are now included, enhancing software supply chain visibility and compliance.
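CycloneDX, the SBOM format MOSK 26.1 now ships, is a JSON (or XML) document with a small required envelope plus a component inventory. A minimal illustrative document, built and serialized in Python (the component entries are placeholders, not the actual MOSK manifest):

```python
import json

# Minimal CycloneDX-style SBOM envelope: bomFormat, specVersion, version,
# and a components inventory. Component names below are illustrative only.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {"type": "library", "name": "example-networking-lib", "version": "3.3.0"},
        {"type": "container", "name": "example-service-image", "version": "26.1"},
    ],
}

# Serialize to the JSON form consumed by supply-chain scanners.
document = json.dumps(sbom, indent=2)
print(document)
```

Compliance tooling can then diff these inventories across releases to spot newly introduced or vulnerable components.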

Mirantis's update reflects the growing demand for automation and efficiency in managing complex cloud infrastructure, particularly as AI and GPU workloads become more prevalent. The focus on energy consumption tracking signals a broader industry trend towards sustainability and cost optimization. The inclusion of SBOMs underscores the increasing importance of software supply chain security and compliance in a risk-sensitive environment.

Adoption Rate
The effectiveness of the AI assistant will hinge on operator adoption and integration into existing workflows; slow uptake could limit its impact on operational efficiency.
Energy Costs
The ability to track energy consumption and correlate it with workload costs will become increasingly important as GPU cluster deployments expand and energy prices fluctuate.
Security Scrutiny
The inclusion of SBOMs will likely increase scrutiny of Mirantis’ software supply chain, particularly given the platform’s use in regulated industries.

Mirantis Integrates NVIDIA Controller to Expedite AI Cloud Deployments

  • Mirantis announced support for NVIDIA’s NCX Infra Controller, an open-source technology for AI cloud platforms.
  • The integration is part of Mirantis’ broader k0rdent AI platform, which aims to transform NVIDIA’s AI infrastructure building blocks into production-ready cloud platforms.
  • Mirantis has partnered with Netris, Supermicro, and VAST Data to automate various aspects of AI infrastructure deployment and data services.
  • Mirantis is contributing to the NVIDIA NCX Infra Controller open-source project and participating in NVIDIA’s AI Cloud Ready Initiative.

Mirantis is positioning itself as a key player in the emerging AI infrastructure market, capitalizing on the trend of cloud providers seeking composable, open-source solutions. The company’s focus on Kubernetes-native automation and its collaboration with NVIDIA represent a strategic shift towards specialized AI cloud platforms, moving beyond traditional virtualization-first architectures. This move is particularly relevant as the demand for GPU resources and AI-powered applications continues to surge, creating a competitive landscape for infrastructure providers.

Ecosystem Adoption
The success of k0rdent AI hinges on broader adoption by infrastructure providers and the ability to attract and retain partners like Netris and Supermicro, demonstrating a viable market for Mirantis’ approach.
Open Source Contribution
Mirantis’ commitment to open-source contributions, particularly with NVIDIA’s components, will determine its influence on the direction of AI infrastructure development and its ability to attract developer talent.
Performance Validation
The effectiveness of the NVIDIA AI Cloud Ready Initiative and Mirantis’ validation process will be crucial in establishing k0rdent AI’s credibility and attracting enterprise customers seeking reliable, production-grade AI infrastructure.

Lens Integrates AI Coding Assistant Protocol, Expanding Kubernetes Workflow

  • Lens, a Kubernetes IDE by Mirantis, launched a built-in Model Context Protocol (MCP) server in Lens Desktop.
  • Lens Desktop has over 1 million users globally, making it the most widely adopted Kubernetes IDE.
  • The MCP server simplifies integration of AI coding assistants like Claude Code, ChatGPT, and GitHub Copilot with Kubernetes clusters.
  • This feature builds upon Lens Prism, the existing AI assistant for Kubernetes troubleshooting.

The integration of AI coding assistants into Kubernetes workflows is a growing trend, driven by the increasing complexity of cloud-native development. Lens’s move to standardize this connection via MCP addresses a significant pain point for developers, potentially solidifying its position as a central hub for Kubernetes management. However, the long-term value depends on the broader adoption of MCP and the ability of Lens to maintain a competitive edge in a rapidly evolving landscape.
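Under the Model Context Protocol, integrations like this ride on JSON-RPC 2.0: a client (the AI assistant) discovers a server's capabilities with a tools/list call. A sketch of that exchange, with the response shape a conforming server returns (the list_pods tool is a hypothetical example, not Lens's actual catalog):

```python
import json

# MCP request: JSON-RPC 2.0 call asking the server to enumerate its tools.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

# A conforming server replies with a result carrying a "tools" array;
# the tool below is a hypothetical Kubernetes helper for illustration.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "list_pods",
                "description": "List pods in a namespace",
                "inputSchema": {
                    "type": "object",
                    "properties": {"namespace": {"type": "string"}},
                },
            }
        ]
    },
}

wire = json.dumps(request)  # what actually crosses the transport
```

Standardizing on this discovery handshake is what lets any MCP-aware assistant use any MCP server without bespoke glue code.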

Ecosystem Adoption
The success of Lens’s MCP server hinges on broader adoption by AI coding assistant developers; limited uptake would diminish its strategic value.
Competitive Response
Other Kubernetes management tools will likely follow suit, potentially commoditizing the MCP integration and requiring Lens to differentiate on other features.
Security Implications
Increased AI integration within Kubernetes workflows introduces new security considerations, and Lens’s desktop-based credential management will be scrutinized for vulnerabilities.

Mirantis Joins NVIDIA Initiative to Streamline AI Cloud Deployments

  • Mirantis has become a founding ISV partner in NVIDIA’s AI Cloud Ready Initiative.
  • The initiative aims to accelerate the deployment of scalable AI cloud infrastructure for NVIDIA Cloud Partners (NCPs).
  • Mirantis’ k0rdent AI platform automates multi-tenant AI cloud deployments, improving GPU utilization and economics.
  • k0rdent AI integrates with NVIDIA NCX Infra Controller, leveraging NVIDIA’s internal GPU fleet management technology.
  • Gartner projects worldwide AI spending to total $2.5 trillion in 2026.

The partnership addresses a growing pain point for AI infrastructure providers: the need to maximize GPU utilization and operational efficiency amidst surging demand. Many providers are struggling with manual provisioning and fragmented tooling, hindering their ability to capitalize on the $2.5 trillion AI spending projected by Gartner. Mirantis’ k0rdent AI, by automating deployment and leveraging NVIDIA’s internal technology, aims to unlock the economic potential of GPU infrastructure for these providers.

Market Adoption
The success of the NVIDIA AI Cloud Ready Initiative hinges on the adoption rate among NCPs; slow uptake could limit Mirantis’ growth potential.
Competitive Landscape
While Mirantis positions itself as the only open, composable infrastructure platform, competition from other ISVs and NVIDIA’s own solutions will likely intensify.
Technical Integration
The long-term value of the partnership will depend on the seamlessness of integration between k0rdent AI and NVIDIA’s evolving hardware and software stack.

Mirantis Integrates Netris for Automated AI Infrastructure

  • Mirantis and Netris have integrated their platforms to automate Kubernetes cluster delivery and network automation for AI infrastructure.
  • The integration focuses on enabling hard tenant isolation and simplifying network provisioning, addressing key operational bottlenecks.
  • Netris is the first networking orchestration platform to integrate NVIDIA BlueField DPUs into the data center network fabric.
  • The combined solution aims to turn GPU clusters into a repeatable, multi-tenant AI cloud product.
  • The integration supports NVIDIA Spectrum-X Ethernet, Quantum-X InfiniBand, and NVLink fabrics.

The convergence of Kubernetes orchestration and network automation is critical for scaling AI infrastructure, as manual provisioning and fragmented networks are increasingly unsustainable. This integration addresses a growing need for automated, multi-tenant GPU cloud solutions, particularly among neoclouds and enterprises deploying AI workloads at scale. The incorporation of NVIDIA BlueField DPUs signals a move towards hardware-enforced isolation and improved efficiency in AI infrastructure.

Scale Challenges
The ability of the integrated platform to maintain performance and isolation as tenant density increases will be a key determinant of its long-term success.
Competitive Landscape
Other Kubernetes orchestration providers will likely respond with similar network automation integrations, intensifying competition in the AI infrastructure management space.
Adoption Rate
The pace at which neoclouds and enterprises adopt this integrated solution will depend on their existing infrastructure investments and the perceived value of the automation benefits.

AI Infrastructure Buildout Accelerates, Challenging Sustainability Concerns

  • Mirantis is partnering with emerging 'neocloud' providers to build and operate AI infrastructure.
  • Neoclouds secure energy access, build compute capacity, and sell AI services, aiming for premium margins.
  • Mirantis CEO Alex Freedland estimates AI infrastructure demand is growing 50x year-over-year, driven by consumer AI and enterprise productivity tools.
  • Neoclouds can generate up to $4 billion annually from a given 100 megawatts of power by packaging infrastructure as AI services rather than selling raw capacity.
  • Freedland predicts hybrid consumption models will become standard, with enterprises using hyperscalers, neoclouds, and potentially sovereign providers.
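Taking the quoted figures at face value, the implied revenue density is straightforward to work out:

```python
# Back-of-envelope on the figures quoted above: $4B/year from 100 MW.
revenue_per_year = 4e9   # dollars
capacity_mw = 100

per_mw_year = revenue_per_year / capacity_mw   # dollars per MW-year
mwh_per_year = capacity_mw * 24 * 365          # 876,000 MWh of energy per year
per_mwh = revenue_per_year / mwh_per_year      # dollars per MWh, if fully utilized

print(f"${per_mw_year:,.0f} per MW-year, about ${per_mwh:,.0f} per MWh")
```

That works out to roughly $40 million per megawatt-year, or about $4,566 per megawatt-hour at full utilization, which illustrates why moving up the stack from raw power to packaged AI services is so attractive.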

The emergence of 'neoclouds' signals a significant shift in the AI infrastructure landscape, moving beyond the traditional hyperscaler model towards a more vertically integrated and geographically distributed approach. This buildout, fueled by exponential demand from both consumer and enterprise applications, is challenging conventional wisdom about sustainability and highlighting the growing strategic importance of energy and data sovereignty. The economics of AI infrastructure are rapidly evolving, with providers capturing increasing value by moving up the service stack.

Governance Dynamics
The extent to which national and industrial sovereignty requirements will continue to drive the proliferation of neocloud providers and fragment the AI infrastructure landscape remains to be seen.
Energy Constraints
Whether neocloud providers can secure sufficient and cost-effective energy resources to sustain their rapid expansion, particularly given broader geopolitical and climate-related pressures, will be a critical factor.
Market Structure
The pace at which neoclouds can move up the value chain, from raw infrastructure to packaged AI services, will determine their ability to achieve the projected profitability and challenge the dominance of existing hyperscalers.

Mirantis Validates Supermicro Integration to Automate GPU Cloud Deployments

  • Mirantis has validated its k0rdent AI platform's compatibility with Supermicro’s modular server architecture.
  • The validation aims to automate operations for sovereign AI and hybrid GPU cloud environments.
  • Texas Tech University System is using the combined technology to build a high-performance computing cluster.
  • The solution eliminates manual provisioning workflows for Supermicro nodes, integrating them into Kubernetes infrastructure.

The announcement reflects the growing complexity of AI infrastructure management and the increasing demand for automated, full-stack solutions. Organizations are struggling to operationalize AI at scale, leading to a shift towards platforms that simplify GPU deployment and ensure efficient resource utilization. This partnership positions Mirantis to capitalize on the burgeoning market for sovereign AI and hybrid GPU cloud deployments, a segment driven by both enterprise needs and geopolitical considerations.

Sovereign Tech
The adoption rate of sovereign AI infrastructure solutions will be a key indicator of geopolitical influence and data localization trends, as governments increasingly mandate control over AI deployments.
Integration Risk
The success of this partnership hinges on the seamless integration of Mirantis’ software with Supermicro’s hardware, and any integration challenges could delay deployments and impact customer satisfaction.
Competitive Landscape
The emergence of specialized AI infrastructure platforms like Mirantis k0rdent will intensify competition among cloud providers and infrastructure vendors, potentially driving down margins and accelerating consolidation.

Mirantis, VAST Data Partner to Streamline AI Infrastructure Orchestration

  • Mirantis has joined VAST Data’s Cosmos Partner Program as a Technology Partner.
  • The collaboration aims to standardize integration of VAST technologies within Mirantis’ k0rdent AI ecosystem.
  • The partnership focuses on enabling neoclouds to reduce integration friction and accelerate AI services delivery.
  • The initiative aligns with broader ecosystem efforts, including NVIDIA-led reference architectures for GPU environments.

The partnership reflects a growing trend towards standardized, modular AI infrastructure stacks, driven by the need to optimize expensive GPU resources and accelerate AI service delivery. Neoclouds, increasingly reliant on GPU-intensive workloads, are seeking solutions to streamline operations and reduce integration complexity, creating a significant market opportunity for vendors like Mirantis and VAST Data. This collaboration signals a shift away from custom-built AI infrastructure towards a more orchestrated, vendor-agnostic approach.

Integration Speed
The success of this partnership hinges on the speed and efficiency with which Mirantis can integrate VAST’s technologies into its k0rdent platform; delays could hinder adoption and impact both companies’ revenue projections.
Neocloud Adoption
Widespread adoption by neoclouds will be critical for realizing the partnership's strategic goals, and the willingness of these customers to shift from bespoke stacks to standardized building blocks remains a key uncertainty.
Ecosystem Alignment
The degree to which Mirantis and VAST Data can coordinate with NVIDIA and other ecosystem players will determine the overall impact and market reach of their joint solutions.

Mirantis Backs Agentic AI Foundation Amid Model Context Protocol Shift

  • Mirantis has joined the Linux Foundation's Agentic AI Foundation (AAIF) as a Silver Member.
  • The AAIF was recently formed to foster open-source agentic AI development.
  • Mirantis launched its MCP AdaptiveOps framework on September 30, 2025, to manage Model Context Protocol (MCP) servers.
  • Mirantis' k0rdent AI offering provides access to private LLMs for enterprises.
  • MCP governance is transitioning to the open-source community.

The formation of the Agentic AI Foundation signals a growing recognition of agentic AI's potential to transform industries. Mirantis's involvement, particularly with its MCP AdaptiveOps framework, suggests a strategic focus on simplifying the deployment and management of Model Context Protocols, a critical element for interoperability in the evolving AI landscape. This move positions Mirantis to capitalize on the increasing demand for specialized AI infrastructure solutions as enterprises seek to leverage autonomous AI systems.

Governance Dynamics
The shift of MCP governance to the open-source community could accelerate development but also introduce challenges in standardization and commercial adoption.
Ecosystem Adoption
The success of the AAIF hinges on attracting broader participation and establishing a shared ecosystem of tools and standards for agentic AI.
Competitive Landscape
Mirantis's positioning within the AAIF and its MCP AdaptiveOps framework will be tested against other players vying for dominance in the emerging agentic AI infrastructure market.