78% of Enterprises Now Run AI Inference as Core Operations, F5 Report Finds
Event summary
- 78% of enterprises now run AI inference as a core operation, per F5's 2026 State of Application Strategy Report.
- 93% of organizations operate across multiple clouds, with 86% distributing applications across hybrid multicloud environments.
- 88% of organizations have faced AI-related security challenges, and 98% are preparing for agentic AI.
- 29% of organizations identify prompt layers as the top delivery mechanism for AI workloads.
- F5's report highlights the shift from AI experimentation to operational reality, with an average of seven AI models in production per organization.
The big picture
The rapid operationalization of AI marks a significant shift in enterprise technology: organizations now treat AI inference as a core operational workload rather than an experiment. The shift is reinforced by the permanence of hybrid multicloud environments, which demand advanced routing, fallback, and policy controls to balance cost, accuracy, and availability. As AI systems reach full-scale production, security and governance have become systemic requirements, reshaping the enterprise technology landscape.
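The routing, fallback, and policy controls mentioned above can be illustrated with a minimal sketch. The endpoint names, cost weights, and policy threshold below are hypothetical assumptions for illustration, not details from the F5 report:

```python
# Minimal sketch of policy-driven inference routing with fallback across
# multiple model endpoints. All names and numbers here are illustrative
# assumptions, not taken from the F5 report.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Endpoint:
    name: str                      # e.g. an on-prem or cloud-hosted model
    cost_per_call: float           # relative cost weight used by the policy
    invoke: Callable[[str], str]   # returns a completion; may raise on failure

def route(prompt: str, endpoints: list[Endpoint], max_cost: float) -> str:
    """Try endpoints cheapest-first; skip any over the cost budget and
    fall back to the next endpoint when one fails."""
    last_error: Exception | None = None
    for ep in sorted(endpoints, key=lambda e: e.cost_per_call):
        if ep.cost_per_call > max_cost:
            continue                      # policy control: enforce cost budget
        try:
            return ep.invoke(prompt)      # first healthy endpoint wins
        except Exception as err:          # fallback on any endpoint failure
            last_error = err
    raise RuntimeError("no endpoint satisfied the policy") from last_error

# Usage: a flaky primary endpoint falls back to a pricier but healthy backup.
def flaky(_: str) -> str:
    raise TimeoutError("primary unavailable")

endpoints = [
    Endpoint("gpu-cluster-a", cost_per_call=1.0, invoke=flaky),
    Endpoint("cloud-model-b", cost_per_call=2.0, invoke=lambda p: f"ok:{p}"),
]
print(route("hello", endpoints, max_cost=3.0))  # falls back to cloud-model-b
```

Real traffic managers layer in health checks, latency targets, and identity-aware policies, but the core pattern is the same: an ordered candidate list filtered by policy, with failure triggering the next candidate.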
What we're watching
- AI Governance
- How enterprises will govern AI systems in day-to-day operations as inference becomes a policy-driven workload.
- Multicloud Complexity
- Whether organizations can manage hybrid multicloud complexity while maintaining seamless integration and consistent policy enforcement.
- Security Evolution
- The pace at which enterprises will adapt to the evolving security perimeter, focusing on prompt, token, and identity layers.