Sharon_Barnes
ServiceNow Employee

Making Sense of AI Assistants: A Practical Guide to Assistant Analytics in ServiceNow

As AI-powered assistants become more embedded in ServiceNow workflows, one big question keeps coming up: How do we know if they’re actually working?

That’s exactly what this Platform Academy session tackles. Sharon Barnes and Santanu walk through Assistant Analytics—a centralized dashboard designed to help admins and developers understand usage, performance, and user experience across all conversational AI channels.

What Is Assistant Analytics?

Assistant Analytics provides a single pane of glass for tracking interactions across:

  • Now Assist Panel
  • Virtual Agent
  • Voice Assistants

Instead of guessing whether your AI is effective, you can measure:

  • Usage and adoption
  • Customer satisfaction (CSAT) and sentiment
  • Deflection rates
  • Assist consumption

The goal isn’t just visibility—it’s enabling better decisions.


Getting Started: Requirements

To use Assistant Analytics, you’ll need:

  • Zurich Patch 6 or later
  • Now Assist for Platform (v10.0.3+)
  • Virtual Agent Admin role (virtual_agent_admin)

Navigate to:
All > Conversational Interfaces > Assistant Designer > Analytics tab


Key Dashboards and What They Tell You


Overview Page

  1. Review high-level metrics showing overall assistant activity across your instance
  2. Monitor assist usage patterns to understand feature adoption
  3. Track overall Customer Satisfaction (CSAT) scores for all assistant interactions
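The CSAT number on this page boils down to a simple ratio. As an illustration only (this is not ServiceNow code, and the satisfaction threshold is an assumption), one common definition counts the share of 4-and-5 ratings on a 1–5 survey scale:

```javascript
// Illustrative CSAT math (not ServiceNow code). One common definition:
// the percentage of responses at or above a "satisfied" threshold
// (here assumed to be 4 on a 1-5 scale).
function csatPercent(ratings, satisfiedThreshold = 4) {
  if (ratings.length === 0) return 0;
  const satisfied = ratings.filter((r) => r >= satisfiedThreshold).length;
  return (satisfied / ratings.length) * 100;
}

const sampleScores = [5, 4, 3, 5, 2, 4, 5, 1]; // hypothetical survey responses
console.log(csatPercent(sampleScores)); // 62.5
```

Knowing how the figure is derived makes it easier to judge whether a dip reflects genuine dissatisfaction or just a low survey response count.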


Usage Page

  1. Analyze conversation volumes to identify peak usage periods and capacity requirements
  2. Review channel distribution data to understand where users prefer to engage with assistants
  3. Examine conversation outcomes to determine resolution rates and escalation patterns


Adoption & Engagement Page

  1. Track user growth metrics to measure assistant adoption over time
  2. Identify engagement patterns that indicate successful user experiences
  3. Monitor assist-to-execution trends to understand how often suggestions lead to actions


Sentiment Page

  1. Evaluate user satisfaction scores broken down by assistant, channel, or time period
  2. Measure empathy indicators to assess the emotional quality of assistant responses
  3. Identify frustration signals that may indicate areas requiring immediate attention
  4. Review resolution metrics to confirm users are getting their issues solved


Self-Solve Performance Page

  1. Calculate deflection rates to measure how many inquiries are resolved without agent escalation
  2. Analyze effort scores to understand how much work users must do to get answers
  3. Assess self-service effectiveness to identify optimization opportunities
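The deflection rate described above reduces to straightforward arithmetic: conversations that never reached a live agent, divided by all conversations. A minimal sketch with hypothetical numbers rather than real dashboard fields:

```javascript
// Illustrative deflection-rate math (not ServiceNow code): the share of
// conversations resolved without escalating to a live agent.
function deflectionRate(totalConversations, escalatedToAgent) {
  if (totalConversations === 0) return 0;
  return ((totalConversations - escalatedToAgent) / totalConversations) * 100;
}

// Hypothetical month: 200 conversations, 50 escalated to agents.
console.log(deflectionRate(200, 50)); // 75
```

Note the session's caveat under "What's Next?": today's deflection figure is being refined toward true end-to-end tracking, so treat it as directional rather than exact.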


Assists Page

  1. Track AI resource consumption to understand computational costs associated with assistant operations
  2. Monitor AI feature usage patterns to identify which capabilities deliver the most value
  3. Implement cost optimization strategies based on actual resource utilization data
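To make the cost-optimization point concrete, here is a hedged sketch of per-feature spend. The feature names and per-assist rate are invented for illustration and do not reflect actual ServiceNow pricing or table fields:

```javascript
// Illustrative cost breakdown (hypothetical names and rates, not
// ServiceNow pricing): multiply each feature's assist count by an
// assumed per-assist cost to see where spend concentrates.
function costPerFeature(usageByFeature, costPerAssist) {
  return Object.fromEntries(
    Object.entries(usageByFeature).map(
      ([feature, assists]) => [feature, assists * costPerAssist]
    )
  );
}

// Hypothetical monthly assist counts per feature.
const usage = { summarization: 1200, chat: 800, search: 150 };
console.log(costPerFeature(usage, 0.02));
```

A breakdown like this is how you would spot a high-cost, low-value feature worth tuning or retiring.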


Voice Page

  1. Evaluate voice assistant performance metrics specific to telephony channels
  2. Measure voice-specific deflection rates to quantify call center cost savings
  3. Review voice satisfaction metrics to ensure audio experiences meet quality standards


Important Nuances

  • Custom AI agents are included in analytics tracking.
  • A dedicated viewer role is coming soon, allowing read-only access without full admin permissions.

Turning Insights Into Action

Assistant Analytics isn’t just a reporting tool—it’s a feedback loop.

Use it to:

  • Identify underperforming assistants
  • Reduce unnecessary escalations to live agents
  • Optimize user journeys
  • Justify ROI to stakeholders

The real value comes from acting on the data—not just observing it.


What’s Next?

ServiceNow is actively improving:

  • Deflection accuracy (true end-to-end tracking)
  • Role-based access controls
  • Unified analytics across AI capabilities (including autonomous agents)

Final Thoughts

If you're investing in conversational AI, this dashboard is essential.

It gives you the clarity to answer:

  • Are users adopting it?
  • Is it helping them?
  • Is it reducing workload?
  • Where should we improve next?

And most importantly—it helps you prove that your AI strategy is working.



Chapters

0:00 - Introduction & Session Overview
1:17 - Meet the Speakers
3:00 - Agenda & What is Assistant Analytics?
4:39 - Why Assistant Analytics Matters
6:37 - Requirements & Access (Zurich Patch 6+)
7:30 - Demo: Assistant Designer & Overview Dashboard
11:24 - Usage Dashboard Deep Dive
14:29 - Adoption & Engagement Metrics
16:44 - Sentiment, CSAT & User Experience Insights
21:27 - Escalations & Resolution Trends
30:56 - Self-Solve Performance & Deflection Explained
35:53 - Effort Score & Optimization Opportunities
37:32 - Assist Usage & AI Feature Tracking
41:25 - Q&A: Roles, Data Collection, Custom Agents
45:10 - Future Roadmap & AI Analytics Expansion
46:43 - Resources & Documentation
48:01 - Key Takeaways
49:29 - Wrap-Up & Upcoming Sessions