dig Launches AI Search to Decode Social Video's 'Authenticity Gap'

📊 Key Data
  • $22 million raised in total funding, including a $14 million Series A round in August 2025
  • $100/month entry-level paid plan, contrasting with competitors' prices ranging from $800 to over $15,000/month
  • Over half of all social media content was video in 2025, highlighting the 'authenticity gap'
🎯 Expert Consensus

Experts would likely conclude that dig's AI search platform addresses a critical gap in social intelligence by leveraging video content for more authentic, real-time insights, challenging traditional text-based analysis tools.

WILMINGTON, Del. – April 16, 2026 – As the internet's town square becomes increasingly dominated by video, the ability to understand what people are truly saying, feeling, and thinking has become a monumental challenge. Today, social intelligence firm dig launched a new platform, ask-dig, aiming to solve this very problem. Billed as the first video-centric AI social search platform, it promises to deliver evidence-backed answers sourced directly from the unfiltered world of social media video.

The new tool allows any user, from a journalist to a marketing professional, to ask a plain-language question and receive a summary of social sentiment and narratives grounded in actual video content. This launch directly challenges the current generation of Large Language Models (LLMs) and traditional social listening tools, which primarily rely on text and often fail to capture the nuance and authenticity of video-driven conversations.

The Authenticity Gap in a Video-First World

The digital landscape has fundamentally shifted. Short-form video is no longer just a feature on social platforms; it is the dominant format, with some industry reports suggesting it constituted over half of all social media content in 2025. This explosion of user-generated video has created what some analysts call an "authenticity gap." While brands, researchers, and the public are eager to understand emerging trends and opinions, the tools at their disposal are often ill-equipped for the task.

Traditional LLMs, despite their confident-sounding answers, are typically trained on vast but often outdated corpora of web text. They lack real-time access to the dynamic, ephemeral conversations happening inside social media videos. Similarly, legacy social listening platforms have historically been built around keyword tracking and text-based mention analysis. This leaves them effectively blind to the rich context embedded in video: the tone of voice, visual cues, emotional reactions, sarcasm, and cultural references that define modern online discourse.

"Most AI platforms today produce answers that sound authoritative, but lack grounding in real-world sentiment and behavior," said Ofer Familier, CEO and Co-founder of dig, in the company's announcement. "The problem is that opinions are forming in conversations taking place through dynamic interactions across social, increasingly in videos. When that richness is flattened into clean text for analysis, critical context is lost. ask-dig was built to bridge that gap."

How Video-Centric AI Aims to Find the Truth

ask-dig positions itself not just as another search engine, but as a real-time answer engine for the social web. The process begins when a user types a question, such as "What do people in Chicago think of the new transit policy?" or "What are the emerging fashion trends among Gen Z creators?" In under two minutes, the platform scours social media, pulling in thousands of relevant video posts.
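dig has not published the internals of this pipeline, so the sketch below is purely illustrative: `ask`, `retrieve_video_posts`, `analyze`, and `summarize` are hypothetical stand-ins for the described flow of question in, video evidence out, not a real API.

```python
# Illustrative sketch only: dig has not published an API, so every
# name here is a hypothetical stand-in for the flow described above.

def retrieve_video_posts(question: str, max_posts: int) -> list[dict]:
    """Placeholder for the proprietary retrieval step that pulls in
    thousands of relevant video posts from social platforms."""
    return []

def analyze(post: dict) -> dict:
    """Placeholder for per-post analysis of the spoken narrative,
    comments, and audience reactions."""
    return post

def summarize(question: str, analyzed: list[dict]) -> str:
    """Placeholder for the grounded, source-linked summary step."""
    return f"{len(analyzed)} video posts informed the answer to: {question!r}"

def ask(question: str, max_posts: int = 5000) -> str:
    """Plain-language question in, evidence-backed summary out."""
    posts = retrieve_video_posts(question, max_posts)  # gather raw video evidence
    analyzed = [analyze(p) for p in posts]             # multimodal analysis per post
    return summarize(question, analyzed)               # answer grounded in those posts

print(ask("What do people in Chicago think of the new transit policy?"))
```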

From there, its proprietary AI gets to work. Instead of just transcribing words, it performs what is essentially multimodal analysis, examining the spoken narrative, sentiment expressed in comments, and audience reactions. The system is designed to cut through the noise of sponsored posts and synthetic content to focus on genuine, user-generated reactions. The company claims its in-house LLMs can even account for complex human expressions like humor and sarcasm, reducing the false positives that plague many sentiment analysis tools.
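To make that aggregation step concrete, here is a minimal sketch under stated assumptions: the fields, the equal weighting of creator narrative and audience reaction, and a sarcasm flag that flips transcript polarity are illustrative choices, not dig's actual in-house models.

```python
# Minimal sketch of multimodal sentiment aggregation. The fields,
# weights, and sarcasm handling are assumptions for illustration,
# not dig's in-house models.
from dataclasses import dataclass

@dataclass
class AnalyzedPost:
    transcript_sentiment: float  # -1.0 (negative) .. 1.0 (positive)
    comment_sentiment: float     # aggregate polarity of audience comments
    is_sponsored: bool           # promotional content, to be excluded
    is_sarcastic: bool           # flips the literal transcript polarity

def aggregate_sentiment(posts: list[AnalyzedPost]) -> float:
    """Blend spoken-narrative and audience sentiment, skipping
    sponsored posts and inverting sarcastic transcripts."""
    organic = [p for p in posts if not p.is_sponsored]
    if not organic:
        return 0.0
    scores = []
    for p in organic:
        spoken = -p.transcript_sentiment if p.is_sarcastic else p.transcript_sentiment
        scores.append(0.5 * spoken + 0.5 * p.comment_sentiment)  # equal weighting
    return sum(scores) / len(scores)

posts = [
    AnalyzedPost(0.8, 0.6, is_sponsored=False, is_sarcastic=False),
    AnalyzedPost(0.9, -0.7, is_sponsored=False, is_sarcastic=True),  # praise said sarcastically
    AnalyzedPost(1.0, 1.0, is_sponsored=True, is_sarcastic=False),   # excluded: sponsored
]
print(aggregate_sentiment(posts))  # ≈ -0.05: mildly negative once sarcasm is decoded
```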

The key differentiator is its commitment to verifiable evidence. Unlike a standard LLM that generates a probabilistic response from its training data, every insight from ask-dig is sourced exclusively from the social content it collects for that specific query. Crucially, every answer is source-linked, allowing users to click through and view the original videos and posts that informed the conclusion. This traceability is designed to build trust and empower users to conduct their own verification, a feature particularly valuable for journalists and market researchers.
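One way to picture that source-linking guarantee is as a constraint on the answer's data shape: a claim cannot exist without the posts that back it. The `Insight` structure below is a hypothetical illustration of the idea (with placeholder example.com URLs), not dig's actual schema.

```python
# Hypothetical shape of a source-linked answer: every insight must
# carry the posts that support it, so users can click through and
# verify. Illustrative only; this is not dig's actual schema.
from dataclasses import dataclass, field

@dataclass
class Insight:
    claim: str
    sources: list[str] = field(default_factory=list)  # URLs of the original videos/posts

    def __post_init__(self) -> None:
        # Enforce the guarantee: no claim without evidence.
        if not self.sources:
            raise ValueError("an insight without sources is not allowed")

def render(insights: list[Insight]) -> str:
    """Emit each claim followed by its clickable evidence links."""
    lines = []
    for insight in insights:
        lines.append(insight.claim)
        lines.extend(f"  source: {url}" for url in insight.sources)
    return "\n".join(lines)

# Placeholder URLs, for illustration only.
print(render([Insight(
    claim="Local creators are largely skeptical of the policy.",
    sources=["https://example.com/video/123", "https://example.com/video/456"],
)]))
```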

Reshaping the Market for Social Intelligence

With its launch, dig is making a strategic play to democratize access to high-level social intelligence. The platform is launching with both a free tier and an entry-level paid plan at $100 per month. This pricing structure places it in stark contrast to the dominant enterprise-grade social listening tools, which often carry hefty price tags ranging from $800 to over $15,000 per month and require annual contracts.

Competitors like Brandwatch, Sprout Social, and Synthesio offer powerful, feature-rich suites for social media management and consumer intelligence, but their cost can be prohibitive for individual creators, freelance journalists, or small agencies. By offering an accessible entry point, ask-dig targets a broad segment of the market that needs fast, reliable social insights without the complexity and cost of a full enterprise solution.

The potential use cases are extensive. A brand manager can get real-time feedback on a new product launch. A content creator can quickly identify the next viral trend or challenge. A journalist can map local reactions to a breaking news story, gathering authentic voices directly from the scene. This self-serve chat experience is designed to be intuitive, turning the complex task of social analysis into a simple question-and-answer process.

This tool also serves as a gateway to dig's broader product suite. While ask-dig is built for ad-hoc questions, the company's enterprise platform provides continuous brand monitoring, deep narrative intelligence, and reputational risk management, demonstrating a clear path for users to scale up as their needs evolve.

Backed by Capital and a Strategic Pivot

Based in Wilmington, Delaware, with offices in Tel Aviv, New York, and London, dig is not a newcomer to the AI and video space. Founded in 2021 by Ofer Familier, Eyal Koren, and Adi Paz, the company initially focused on helping marketing teams repurpose video content. However, the team soon recognized a more critical need: analyzing the content of social videos to understand disinformation and reputational risk.

This strategic pivot has attracted significant investor confidence. The company has raised a total of $22 million in funding, culminating in a $14 million Series A round in August 2025. The round was co-led by New Era Capital Partners and Osage Venture Partners, with participation from several other venture firms. This financial backing provides the runway to challenge established players and scale its technology.

Even before the launch of ask-dig, the company had built a strong foundation in the enterprise sector, serving global luxury brands, CPG companies, and Fortune 500 tech firms. This experience in handling complex narrative intelligence for major corporations lends credibility to its new, more accessible offering. By tackling the challenge of video-first social intelligence, dig is betting that the future of understanding public opinion lies not in analyzing text, but in authentically interpreting what people are showing and saying.
