LiveRamp Accelerates AI Clean Rooms with NVIDIA GPU Integration

  • LiveRamp has integrated NVIDIA’s AI infrastructure into its clean room architecture.
  • The upgrade makes AI model training and inference up to 15x faster.
  • AI partners can use their existing code without rearchitecting it for CPU-based environments.
  • LiveRamp’s Marketplace now offers data and models for AI training, alongside AI-powered applications.
  • General availability (GA) of the integration is expected later in 2026.

LiveRamp's move to NVIDIA GPUs reflects a broader trend of AI model training shifting to specialized hardware, driven by the rising computational demands of AI-driven marketing. The integration addresses a key bottleneck in the data collaboration workflow, enabling faster iteration and more sophisticated AI applications within a privacy-preserving environment. The partnership positions LiveRamp to capitalize on growing demand for AI-powered marketing solutions, but it also deepens the company's reliance on NVIDIA's hardware ecosystem.
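The "no rearchitecting" point in the summary refers to the device-agnostic pattern common in GPU frameworks, where the same model code targets CPU or GPU simply by swapping the compute backend. A minimal sketch of that idea in plain Python (every name here, `Backend`, `CpuBackend`, `train_step`, is illustrative, not a LiveRamp or NVIDIA API):

```python
# Illustrative sketch only: the same training loop runs unchanged on
# different "devices" because all hardware detail hides behind a backend
# interface. Names are hypothetical, not LiveRamp or NVIDIA APIs.

class Backend:
    """Minimal interface a compute backend must provide."""
    name = "abstract"

    def dot(self, w, x):
        raise NotImplementedError

class CpuBackend(Backend):
    name = "cpu"

    def dot(self, w, x):
        return sum(wi * xi for wi, xi in zip(w, x))

class GpuBackend(CpuBackend):
    # In a real framework this would dispatch to CUDA kernels; here it
    # differs only by name, which is the point: model code never changes.
    name = "gpu"

def train_step(backend, w, x, y, lr=0.1):
    """One gradient-descent step for a linear model y ≈ w·x.

    Runs unchanged on any backend that implements the interface.
    """
    err = backend.dot(w, x) - y
    return [wi - lr * err * xi for wi, xi in zip(w, x)]

# Identical training code, two "devices":
for backend in (CpuBackend(), GpuBackend()):
    w = [0.0, 0.0]
    for _ in range(50):
        w = train_step(backend, w, [1.0, 2.0], y=3.0)
    print(backend.name, [round(wi, 2) for wi in w])
    # prints roughly: cpu [0.6, 1.2] then gpu [0.6, 1.2]
```

Frameworks like PyTorch implement this idea with a device argument rather than subclassing, but the portability consequence is the same: code written for GPUs does not need a CPU rewrite, and vice versa.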

Adoption Rate
The speed of adoption among LiveRamp’s 900+ brands, publishers, and platforms will determine the immediate impact on revenue and market share.
Competitive Response
Competitors in the data collaboration and clean room space will likely accelerate their own GPU infrastructure investments to remain competitive.
Model Security
Long-term success hinges on LiveRamp's ability to maintain robust data security and IP protection as AI models grow in scale and complexity.