The AI Investment Paradox: Why Billions Spent Fail to Yield Returns

📊 Key Data
  • Only 32% of organizations report a positive ROI from their AI initiatives.
  • 66% of AI budgets are consumed by underlying infrastructure (data, storage, processing).
  • 50% of a typical organization's cloud storage bill is allocated to fees, not raw storage capacity.
🎯 Expert Consensus

Experts conclude that the failure of many AI projects to deliver ROI stems from inadequate data infrastructure, high cloud costs, and security vulnerabilities, rather than flaws in AI software itself.

BOSTON, MA – March 03, 2026 – A major paradox is emerging at the heart of the artificial intelligence revolution: while companies are aggressively increasing their spending on AI, a staggering two-thirds of these projects are failing to deliver a positive return on investment. A new industry report suggests the problem lies not with the AI software itself, but with a far more fundamental and often overlooked layer: the data infrastructure struggling to support it.

The fourth annual Wasabi Global Cloud Storage Index, a comprehensive study based on a global survey of 1,700 IT leaders, reveals that only 32% of organizations report a positive ROI from their current AI initiatives. Despite this low success rate, the corporate appetite for AI remains insatiable, with 60% of companies planning to increase their AI infrastructure budgets and another 37% committed to maintaining their current spending levels.

This disconnect between heavy investment and meager returns is forcing a critical re-evaluation of where AI budgets are being allocated and exposing foundational cracks in data storage, cost management, and security that threaten to derail the promise of AI.

The Infrastructure Bottleneck

In a surprising reversal of typical cloud market dynamics, the report, developed by Wasabi Technologies in collaboration with Vanson Bourne, found that the bulk of AI spending is not going toward sophisticated AI software or SaaS platforms. Instead, approximately two-thirds (66%) of AI budgets are being consumed by the underlying infrastructure—the data, storage, and processing power required to train and operate AI models.

“When we look at revenue allocations at the highest level of the public cloud services market – the vast majority comes from software/SaaS, not infrastructure services (IaaS),” said Andrew Smith, director of strategy and market intelligence at Wasabi Technologies. “But emerging AI workloads and initiatives are actually changing this dynamic. What’s fascinating to see within our survey results this year is how most AI budget allocation is going toward infrastructure, not SaaS.”

This shift underscores the reality that successful AI is fundamentally a data-intensive endeavor. The top challenges cited by organizations attempting to implement AI were not related to algorithms or a lack of software, but to data storage (including cost, access, and management) and data quality (including cleansing and preparation). These two factors outranked even the high cost of specialized compute power, highlighting that before an AI model can learn anything, its data must be affordably stored, efficiently accessed, and meticulously prepared.

Cloud's Hidden Toll on AI Ambitions

While organizations are pouring money into infrastructure, many are finding their budgets drained by the complex and often unpredictable pricing models of major public cloud providers. For the fourth consecutive year, the Cloud Storage Index found that a shocking 50% of a typical organization's cloud storage bill is allocated to fees, not the raw storage capacity itself.

These fees, which include charges for data egress (moving data out of the cloud), API requests, and data retrieval, are particularly punishing for AI workloads that require constant data access and movement. The result is a severe and persistent problem with budget overruns. The study revealed that nearly half (49%) of all respondents exceeded their budgeted spending for cloud storage in the past year, with 91% citing fee-related charges as a contributing factor.

This "fee problem" creates a significant financial drag on AI projects, eroding potential ROI before it can even be realized. As organizations scale their AI initiatives, the volume of data they must ingest, process, and retrieve grows exponentially, compounding these hidden costs and making financial forecasting a significant challenge.
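To make the fee dynamic concrete, the arithmetic can be sketched as a back-of-the-envelope bill model. The per-GB and per-request rates below are hypothetical placeholders, not any provider's actual pricing, chosen only to show how an access-heavy AI workload pushes the fee share of a bill toward the 50% mark the report describes.

```python
# Back-of-the-envelope model of a monthly cloud storage bill, splitting
# raw capacity cost from fee-based charges (egress, API requests, retrieval).
# All rates are hypothetical placeholders, not real provider pricing.

def monthly_bill(stored_gb, egress_gb, api_requests, retrieved_gb,
                 storage_rate=0.021,    # $/GB-month stored (hypothetical)
                 egress_rate=0.09,      # $/GB moved out of the cloud (hypothetical)
                 api_rate=0.0000004,    # $/API request (hypothetical)
                 retrieval_rate=0.01):  # $/GB retrieved (hypothetical)
    capacity = stored_gb * storage_rate
    fees = (egress_gb * egress_rate
            + api_requests * api_rate
            + retrieved_gb * retrieved_rate_check(retrieval_rate))
    total = capacity + fees
    return {"capacity": capacity, "fees": fees, "fee_share": fees / total}

def retrieved_rate_check(rate):
    # Guard against a misconfigured negative rate in this toy model.
    return max(rate, 0.0)

# An AI workload that repeatedly reads and moves its training data:
bill = monthly_bill(stored_gb=100_000, egress_gb=20_000,
                    api_requests=50_000_000, retrieved_gb=40_000)
print(f"fees are {bill['fee_share']:.0%} of the bill")  # -> fees are 51% of the bill
```

With these illustrative numbers, egress and retrieval charges alone roughly match the cost of the stored capacity, which is the pattern the survey's respondents report.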

A Widening Security and Trust Gap

Beyond the financial strain, the report uncovers alarming vulnerabilities in data security within public cloud environments. A concerning 44% of organizations reported experiencing a cyberattack that resulted in the loss of access to their public cloud data. This high rate of disruption points to a critical security gap that is particularly dangerous in the context of AI.

Compounding the issue, 41% of IT leaders stated that their public cloud vendor does not provide the tools and features needed to adequately mitigate these cyberattacks. This perceived inadequacy creates a crisis of confidence at a time when organizations are entrusting their most valuable and sensitive data to the cloud to fuel their AI models.

For AI, the implications of a security breach are catastrophic. It's not just about data loss; it's about the integrity of the entire system. A successful attack could lead to data poisoning, where training data is maliciously altered to corrupt an AI model's output, or the theft of proprietary models that represent millions of dollars in research and development.

The Hybrid Compromise

In response to the intertwined challenges of cost, performance, and security, a clear trend has emerged: organizations are turning to hybrid storage solutions. According to the report, 64% of businesses are now deploying a mix of on-premises and public cloud storage to support their AI workflows.

This hybrid approach allows organizations to strategically place data where it makes the most sense. For instance, sensitive data or data requiring ultra-low latency for real-time model inference might be kept on-premises, while the vast, scalable, and globally accessible public cloud is used for other parts of the AI data pipeline. The survey specifically found that public cloud storage is a preferred choice for the bookends of the AI workflow: initial data retrieval and ingest, and long-term model retention and archiving.
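One way to picture this placement logic is as a simple routing rule over each dataset's stage and attributes. The stages, flags, and latency threshold below are illustrative assumptions for a sketch, not criteria taken from the report.

```python
# Toy placement policy for a hybrid AI data pipeline: sensitive or
# latency-critical data stays on-premises, while the pipeline's
# "bookends" (ingest and archive) land in public cloud storage.
# Thresholds and stage names are illustrative assumptions.

def place(stage, sensitive=False, max_latency_ms=None):
    if sensitive:
        return "on-prem"        # keep regulated or proprietary data in-house
    if max_latency_ms is not None and max_latency_ms < 10:
        return "on-prem"        # real-time inference needs ultra-low latency
    if stage in ("ingest", "archive"):
        return "public-cloud"   # the cloud-preferred bookends of the workflow
    return "public-cloud"       # default to scalable cloud capacity

print(place("inference", max_latency_ms=5))  # -> on-prem
print(place("archive"))                      # -> public-cloud
```

Even this toy version hints at the complexity cost the next paragraph notes: every new attribute (compliance regime, egress budget, access frequency) multiplies the cases a real placement policy must cover.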

While this strategy offers flexibility and control, it also introduces its own layer of complexity in managing a distributed data landscape. However, it signals a mature understanding that a one-size-fits-all, public-cloud-only approach may not be viable for the demanding and nuanced requirements of enterprise-grade AI.

“As organizations scale AI initiatives, they face mounting data storage and data quality challenges that can quickly erode ROI if not managed effectively,” said Dave Friend, founder and CEO at Wasabi Technologies. The report's findings make it clear that the path to profitable AI is paved not just with advanced algorithms, but with a smarter, more cost-effective, and secure approach to the data infrastructure that forms its foundation.
