Why Traditional Analytics Fail for AI Products
Event analytics was built for clicks. AI products run on conversations. The measurement gap is back, and it costs more than you think.
The Event Analytics Breakthrough
Event analytics solved a real measurement gap. Understanding that history clarifies what is happening now.
Between 2008 and 2012, SaaS applications proliferated and mobile apps emerged. Google Analytics dominated, measuring pageviews, sessions, and time on site. SaaS apps introduced new interaction types: buttons, forms, workflows, in-app actions. The question changed from "how long did they stay?" to "what did they do?"
Web analytics could not answer it.
Product managers who wanted behavioral data had two options: write SQL queries or request engineering support. Both were slow. Engineering had other priorities. Product decisions bottlenecked on data access. Iteration cycles lengthened.
Mixpanel (founded 2009), Amplitude (founded 2012), and others made event tracking accessible. Engineering instrumented elements once with lightweight SDKs. PMs built their own reports, funnels, and cohort analyses. No tickets. No waiting. Self-service analytics.
The impact was measurable. Funnels became visible. Drop-off points became actionable. Product teams ran experiments and saw results directly. PM autonomy increased. Product velocity increased. Companies that adopted event analytics outpaced those that did not.
The growth of these tools tells the story. Mixpanel now serves 8,000 paying customers with an estimated $210M in revenue for 2025. Amplitude went public on NASDAQ with a $1.32B market cap and $329.92M in trailing twelve-month revenue. Event analytics became infrastructure.
The Modality Shift
AI products changed how customers interact with software. The measurement model has not caught up.
ChatGPT normalized natural language as a primary interface. As of December 2025, it reached 900 million weekly active users. Voice-based conversational AI became a widely adopted modality for customer support and other use cases. For a growing category of products, the primary interaction is language, not clicks.
This changes everything about measurement. Customers express intent in natural language. They describe what they want rather than selecting from options. The AI infers meaning and guides toward outcomes. The interaction is generative, not deterministic.
Traditional SaaS products work on designed flows. Product teams build the path: "Click here to start" leads to "Select options" leads to "Confirm." Users follow the designed path or drop off at known points. Funnels map to actual user journeys. Analytics track progression through defined stages.
AI products work differently. Customers arrive with intent and express it however they choose. Ten customers with identical goals will phrase them ten different ways. There is no Step 1 to Step 2 to Step 3. The path emerges from the conversation.
What Event Analytics Cannot Measure
The metrics that matter for AI products do not exist in event analytics. The gap is structural, not incremental.
Intent is the goal the customer is trying to accomplish, expressed in their own words with varying specificity. It is the foundation for measuring everything else. Event analytics captures "Session started," "Message sent," and "Feature accessed." It does not capture what customers are trying to do, how they describe it, or whether the product can serve that intent.
The journey is what happens between stating a goal and reaching an outcome. This is where experience quality reveals itself. Conversations contain signals that indicate how the experience is going: a customer asking "I don't understand, can you explain?" shows confusion. Rephrasing the same request three times shows friction. Sentiment shifting from neutral to negative shows frustration. Event analytics captures message count, session duration, and return visits. It cannot read the signals inside the conversation.
A customer sending 40 messages could be deeply engaged and exploring. Or they could be stuck on a basic task. The message count is identical. The experience is opposite. Event analytics cannot tell the difference. Neither can observability tools.
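The signals described above can be made concrete. What follows is a toy sketch of a friction detector over a conversation transcript; the heuristics (word-overlap for rephrasing, phrase matching for confusion) and all names are illustrative assumptions, not any product's API. Real systems would use semantic similarity and sentiment models.

```python
# Illustrative sketch only: detect friction signals inside a conversation.
# Word-overlap similarity and phrase lists are stand-ins for the semantic
# and sentiment models a production system would use.

def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two messages (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

CONFUSION_PHRASES = ("i don't understand", "can you explain", "what do you mean")

def friction_signals(user_messages, rephrase_threshold=0.5):
    """Return (signal, message_index) pairs for confusion and rephrasing."""
    signals = []
    for i, msg in enumerate(user_messages):
        text = msg.lower()
        if any(p in text for p in CONFUSION_PHRASES):
            signals.append(("confusion", i))
        # A message highly similar to the previous one suggests the
        # customer is rephrasing the same request.
        if i > 0 and jaccard(msg, user_messages[i - 1]) >= rephrase_threshold:
            signals.append(("rephrase", i))
    return signals

conversation = [
    "Export my Q3 invoices to CSV",
    "Export my Q3 invoices as a CSV file",
    "I don't understand, can you explain the export option?",
]
print(friction_signals(conversation))  # → [('rephrase', 1), ('confusion', 2)]
```

Event analytics would record this conversation as three "Message sent" events. The signals inside the messages are exactly what it cannot see.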
Outcomes answer two questions: did the customer accomplish their goal, and did they accomplish the goal you designed the product for? Event analytics tracks feature adoption, session completion, and return usage. The problem is that engagement does not equal success. A customer can engage extensively while failing to accomplish anything.
The Engagement Trap
Proxy metrics create danger. Consider two customers with identical intent:
Customer A sends 8 messages, reaches a successful outcome, and maintains positive sentiment throughout. Customer B sends 45 messages, reaches no outcome, and builds frustration across the conversation.
Event analytics interprets Customer B as "more engaged" with "higher usage." Customer B might be flagged as a power user.
The reality is different. Customer B is struggling. Customer B is a churn risk. Customer B may already be evaluating alternatives.
Teams optimize for engagement because that is what they can measure. Engagement metrics reward the wrong behavior in AI products. High message counts might indicate product failure, not product success.
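The trap can be shown in a few lines. This sketch scores Customers A and B through two lenses; the field names and scoring weights are assumptions made up for the example, not a real tool's model.

```python
# Illustrative sketch: the same two customers ranked by an event-analytics
# lens (raw engagement) and an outcome-aware experience lens.
# All fields and weights are hypothetical.

customers = [
    {"name": "A", "messages": 8,  "outcome_reached": True,  "sentiment": "positive"},
    {"name": "B", "messages": 45, "outcome_reached": False, "sentiment": "negative"},
]

def engagement_score(c):
    """What event analytics sees: more messages look like more engagement."""
    return c["messages"]

def experience_score(c):
    """Outcome-aware view: success dominates; effort without an outcome hurts."""
    base = 100 if c["outcome_reached"] else 0
    penalty = 0 if c["sentiment"] == "positive" else 25
    return base - penalty - c["messages"]

most_engaged = max(customers, key=engagement_score)["name"]     # → "B"
best_experience = max(customers, key=experience_score)["name"]  # → "A"
print(most_engaged, best_experience)
```

The two lenses rank the customers in opposite order. A team dashboarding only the first lens would celebrate Customer B.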
Observability tools like LangSmith will show that the agent performed correctly for both customers. The agent had no errors, no loops, and minimal latency. This leaves product managers without visibility into critical context inside the products they own.
It costs 5 to 25 times more to acquire a new customer than to retain an existing one. When your measurement tools cannot distinguish success from struggle, the cost compounds.
The Current State
Teams are either flying blind or burning cycles on unsustainable workarounds. Neither is acceptable.
Manual log analysis. PMs export conversation logs. They spend hours reading logs, building pivot tables in spreadsheets, and looking for patterns. The insights are one-off and do not scale. The time is stolen from product work.
SQL queries. PMs with technical skills query raw data directly. They build custom reports for each question. The work is fragile, non-reproducible, and time-intensive. This is the same problem that existed before event analytics.
Engineering requests. PMs file tickets for custom dashboards. Engineering builds one-off reports and resents the distraction from product development. The process creates organizational friction.
Build the infrastructure. Some teams pipe observability data to a data warehouse. They build a custom labeling and classification layer for topics, intent, and sentiment. They add a BI tool on top for visualization. This takes months of work. The cost starts at $60,000 for small companies and runs higher for enterprises. Engineering hours easily reach hundreds. Most teams abandon the attempt.
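The "custom labeling and classification layer" in that last workaround typically starts as something like the sketch below: keyword rules over exported logs. Every rule, label, and phrase here is hypothetical; real pipelines replace this with model-based classification, which is part of why builds stretch into months.

```python
# Illustrative sketch of a hand-rolled classification layer over exported
# conversation logs: keyword rules for topic and sentiment. Labels and
# rules are hypothetical examples, not a real taxonomy.

TOPIC_RULES = {
    "billing": ("invoice", "charge", "refund"),
    "export": ("export", "csv", "download"),
}
NEGATIVE_PHRASES = ("frustrated", "doesn't work", "broken", "still not")

def label_message(text: str) -> dict:
    """Assign coarse topic and sentiment labels to one log line."""
    t = text.lower()
    topics = [topic for topic, kws in TOPIC_RULES.items()
              if any(k in t for k in kws)]
    sentiment = "negative" if any(p in t for p in NEGATIVE_PHRASES) else "neutral"
    return {"topics": topics or ["other"], "sentiment": sentiment}

print(label_message("The CSV export is still not working"))
```

Keyword rules break on paraphrase, the exact property of natural-language interfaces described earlier, so teams that start here end up rebuilding the layer repeatedly.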
Companies are betting growth on AI product success. The stakes are organizational. Executives ask: "Are customers getting value from our AI investment?" There is no clear answer. Product asks: "Where is the experience breaking?" Manual investigation is required. Sales asks: "Which accounts are struggling with the product?" There is no visibility. Customer Success asks: "Who needs proactive intervention?" They discover problems after escalation.
The data exists. It is trapped in logs, traces, and transcripts. The gap between data and decisions is where value disappears.
The Experience Analytics Gap
This is a category gap, not a feature gap. It requires a new measurement approach.
The historical sequence is clear. Web analytics measured pageviews and sessions. Apps introduced in-app actions that web analytics could not track. Event analytics emerged to fill the gap. Event analytics became standard for SaaS products.
The current sequence follows the same structure. Event analytics measures clicks, funnels, and feature engagement. AI products introduced conversational interactions that event analytics cannot track. The gap exists. What fills it?
Experience Analytics. Not a better version of event analytics. Not an observability tool with technical logs. A different measurement approach built for products where interaction is conversational rather than click-based, paths are dynamic rather than predefined, and success is an outcome achieved rather than a button pressed.
Event analytics is to SaaS apps as Experience Analytics is to AI products. The category did not exist because the product type did not require it. Now it does.