
Ritesh Shah AI
ServiceNow Employee

Introduction

 

Hello!  We are a group of dedicated AI Strategists and Architects committed to advancing the implementation and adoption of AI solutions for our customers. Through countless advisory and hands-on engagements, we’ve gathered valuable insights and practical guidance that we’re excited to share with the broader ServiceNow community.

 

In our customer conversations, the first question we are asked is how to identify the best use cases for AI Agents. A global survey of nearly 4,500 executives by ServiceNow and Oxford Economics found that although agentic AI (autonomous AI that pursues set goals) shows promise for improving productivity, only a third of organizations have begun pilot projects. Among those piloting agentic AI, early benefits are evident. 40% of the executives we surveyed are considering adoption in the next year but aren’t sure which use cases to prioritize. Identifying optimal use cases requires a thoughtful balance of data-driven analysis and anecdotal feedback to determine what best suits each organization. This article describes approaches to identifying suitable use cases based on our experiences with customers.

 

 

Spoiler Alert: You Know Your Business Best

 

Here's what we've learned from working with several customers:  while ServiceNow and implementation partners can share industry insights, you’re in the best position to identify use cases that will genuinely move the needle for your business. While external expertise provides valuable context, your deep understanding of internal pain points, workflows, and strategic priorities is irreplaceable.

 

The most successful AI Agent implementations come from organizations that combine external industry knowledge with a deep understanding of their unique opportunities. You know where your teams spend too much time on repetitive tasks, where bottlenecks consistently emerge, and which processes frustrate both employees and customers, holding back bottom-line improvements. Most importantly, you understand the opportunities to enhance top-line impact.

 

The most successful AI agent deployments we've observed combine rigorous data analysis with pragmatic business judgment—and they start with humble, measurable wins rather than moonshot transformations.

 

 

The Data-Driven Discovery Framework

 

This section outlines two ways to use data to identify your use cases.

 

Process Mining: Your Strategic Intelligence Layer

The most sophisticated approach to use case identification for AI Agents begins with understanding what happens in your organization—confirming or disproving the anecdotes you have received.

Nearly 70% of pacesetters identify process mining as key to enabling their effort, and for good reason. Process Mining reveals the gap between documented processes and operational reality through techniques such as idle-time analysis and multi-hop analysis, and it uncovers repetitive issues that AI agents can resolve. Starting with the Xanadu release of ServiceNow, Process Mining Evaluation Projects are pre-installed for all customers. Xanadu introduced incident-focused projects, while Yokohama added customer service and HR case projects. Up to 3,600 records can be analyzed using the Process Mining Analyst Workbench. Refer to this article for more details.
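To make the idea of idle-time analysis concrete, here is a minimal sketch (Python with pandas) that computes the wait time between consecutive activities in an exported event log and ranks the slowest hand-offs. The file name and columns (case_id, activity, timestamp) are assumptions for illustration; this is not the Process Mining product itself, only the underlying concept.

import pandas as pd

# Illustrative idle-time analysis on a generic event log export.
# File name and columns (case_id, activity, timestamp) are hypothetical.
log = pd.read_csv("incident_event_log.csv", parse_dates=["timestamp"])
log = log.sort_values(["case_id", "timestamp"])

# Idle time = gap between consecutive activities within the same case.
log["idle_hours"] = (
    log.groupby("case_id")["timestamp"].diff().dt.total_seconds() / 3600
)

# Label each hand-off as "previous activity -> current activity".
prev_activity = log.groupby("case_id")["activity"].shift()
log["transition"] = prev_activity.str.cat(log["activity"], sep=" -> ")

# The slowest hand-offs are candidate AI agent use cases
# (triage, routing, status chasing, and so on).
slowest = (
    log.dropna(subset=["transition"])
       .groupby("transition")["idle_hours"]
       .agg(["mean", "count"])
       .sort_values("mean", ascending=False)
)
print(slowest.head(10))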

 

 

Group Action Framework (GAF): Pattern Recognition at Scale

Group Action Framework complements process mining by analyzing incident or case clustering patterns. This approach has proven particularly valuable for customer or employee service optimization, where pacesetters report the highest ROI at 52%. The grouping framework identifies clusters of similar records (incidents, cases, etc.) and selects a set of representative records for each cluster. Customers often start by building an AI agent to automate resolution for the top incident or case clusters, reducing the burden on service management resources. For more information on how to configure GAF, refer to this link.
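For intuition on what record clustering looks like, the toy sketch below groups hypothetical incident descriptions with scikit-learn and picks one representative record per cluster. It only illustrates the concept; GAF itself is configured on the platform, and the sample data, field contents, and cluster count here are made up.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

# Hypothetical incident short descriptions (stand-ins for real records).
descriptions = [
    "Cannot connect to VPN from home office",
    "VPN drops every few minutes",
    "Password reset link not working",
    "Need password reset for SAP account",
    "Laptop running extremely slow after update",
    "Outlook keeps asking for password",
]

# Vectorize the text and cluster similar records together.
vectors = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
n_clusters = 3  # hypothetical; tune for your data volume
km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(vectors)

# Pick the record closest to each cluster centroid as its representative.
for cluster_id in range(n_clusters):
    members = np.where(km.labels_ == cluster_id)[0]
    distances = km.transform(vectors[members])[:, cluster_id]
    representative = members[distances.argmin()]
    print(f"Cluster {cluster_id}: {len(members)} records, "
          f"representative: {descriptions[representative]}")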

 

 

The Strategic Workshop Approach: Combining Data with Organizational Intelligence

 

While data provides the foundation for AI agent use case identification, the most effective approach combines quantitative insights with structured organizational intelligence. This hybrid methodology ensures you capture both the measurable inefficiencies and the nuanced business context that only your team understands.

 

Why anecdotal organizational intelligence matters

Data alone can tell you what's happening, but it can't tell you why it matters to your business or how it fits into your strategic priorities. Our most successful customer engagements have shown that combining process mining and GAF insights with cross-functional workshops produces use cases that are both technically feasible and strategically aligned.

 

The workshop approach addresses three critical gaps that purely data-driven approaches miss:

  1. Strategic Context: Understanding which inefficiencies align with business priorities and which are merely operational noise
  2. Organizational Readiness: Assessing the human and cultural factors that will determine implementation success
  3. Hidden Opportunities: Identifying process improvements that may not be visible in current data but represent significant value creation potential

 

Phase 1: Comprehensive Use Case Generation

The first phase focuses on generating a comprehensive inventory of potential AI agent opportunities without immediate filtering or judgment.

 

Pre-Workshop Preparation

  • Compile process mining results and GAF analysis
  • Gather baseline metrics for key business processes
  • Identify cross-functional stakeholders from IT, HR, customer service, business operations, etc.
  • Prepare current state workflow documentation for priority areas (if possible)

Note that it may not always be possible to gather all of these prerequisites, and that’s okay.

 

Workshop Structure

Start with an unconstrained brainstorming session using an A-Z framework (Appendix A). This systematic approach ensures comprehensive coverage by forcing participants to think of use cases for each letter of the alphabet. Include stakeholders from IT, HR, customer service, and business operations (depending on your area of focus).

 

The session should be structured as follows:

  • Opening (30 minutes): Review data insights from process mining and GAF analysis
  • Brainstorming (90 minutes): A-Z framework application with cross-functional input
  • Categorization (45 minutes): Group similar use cases and identify themes
  • Initial Scoping (45 minutes): Add preliminary effort and impact estimates

The goal is comprehensive capture, not immediate filtering. Encourage participants to think beyond their functional silos and consider use cases that span multiple departments or processes.

 

Phase 2: Rigorous Evaluation Matrix

The second phase applies systematic evaluation criteria to transform the brainstormed list into a prioritized portfolio of opportunities.

 

Evaluation Dimensions

 

Plot each potential use case across two critical dimensions:

 

Business Value Assessment

  • Cost Reduction Impact: Quantified savings in labor hours, operational expenses, or resource utilization
  • Revenue Generation Potential: Direct revenue impact through improved customer experience, faster service delivery, or new capability enablement
  • Risk Mitigation Value: Reduction in compliance violations, security incidents, or service disruptions
  • Strategic Alignment: Connection to organizational priorities and transformation initiatives

 

Implementation Feasibility Analysis

  • Technical Complexity: Assessment of required integrations, customizations, and technical debt considerations
  • Data Availability: Quality, completeness, and accessibility of required data sources
  • Organizational Readiness: Change management requirements, user adoption challenges, and training needs
  • Resource Requirements: Development time, ongoing maintenance, and specialized skill needs

 

The Strategic Portfolio Matrix

This evaluation creates four strategic categories that guide implementation planning:

[Image: strategic portfolio matrix plotting business value against implementation feasibility]

 

  • Do Now: High value, high feasibility (immediate priorities for quick wins and momentum building)
  • Build Muscle: Medium value, high feasibility (learning opportunities that develop organizational AI capabilities)
  • Reframe: High value, low feasibility (longer-term targets that may require foundational improvements)
  • Abandon: Low value regardless of feasibility (deprioritize or eliminate from consideration)

Dynamic Portfolio Management: One of the goals should be to move initiatives from the "Reframe" and "Build Muscle" quadrants toward "Do Now" over time. As you iterate and address the constraints that kept their value or feasibility low, these use cases can become immediate priorities.
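As a lightweight way to capture the workshop output, a few lines of code can bucket each scored use case into the four quadrants. The 1-5 scoring scale, thresholds, and example use cases below are assumptions; adapt them to whatever rubric your workshop actually uses.

# Minimal sketch of value-feasibility portfolio bucketing.
# Scores use a hypothetical 1-5 scale; thresholds are assumptions to adjust.
use_cases = {
    "Email triage AI agent":        {"value": 5, "feasibility": 4},
    "Password reset deflection":    {"value": 3, "feasibility": 5},
    "Contract renewal forecasting": {"value": 5, "feasibility": 2},
    "Meeting room booking bot":     {"value": 1, "feasibility": 4},
}

def quadrant(value: int, feasibility: int) -> str:
    if value <= 2:
        return "Abandon"        # low value regardless of feasibility
    if feasibility >= 3:
        return "Do Now" if value >= 4 else "Build Muscle"
    return "Reframe"            # valuable but blocked on prerequisites

for name, scores in use_cases.items():
    print(f"{quadrant(scores['value'], scores['feasibility']):12} {name}")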

 

Phase 3: Detailed Use Case Design

For prioritized use cases in the "Do Now" category, conduct comprehensive workflow analysis and design planning.

 

Current State Analysis

  • Document existing process flows with decision points and handoffs
  • Identify pain points, bottlenecks, and inefficiencies
  • Map stakeholder interactions and information dependencies
  • Establish baseline performance metrics

 

Future State Design

  • Define AI agent role and responsibilities within the process
  • Specify human-AI collaboration touchpoints
  • Design exception handling and escalation procedures
  • Plan integration points with existing systems and workflows

 

Implementation Planning

  • Data Dependency Analysis: Catalog required data sources, quality requirements, and integration complexity
  • Tool Requirement Identification: Specify ServiceNow platform capabilities, third-party integrations, and custom development needs
  • Integration Complexity Assessment: Evaluate technical dependencies, security requirements, and performance considerations
  • Success Metric Definition: Establish measurable outcomes, monitoring approaches, and success criteria
  • Change Management Planning: Design training programs, communication strategies, and adoption support

 

Risk Assessment and Mitigation

  • Identify potential implementation challenges and failure modes
  • Develop contingency plans and rollback procedures
  • Plan pilot testing and phased rollout approaches
  • Establish governance and oversight mechanisms
  • Ensure the use cases align with your values on the Responsible Use of AI

 

Conclusion

 

Successfully identifying and implementing AI agent use cases requires a disciplined approach that combines data-driven insights with strategic business judgment. The methodology we've outlined provides a proven framework for moving from initial exploration to successful deployment.

 

The Four-Step Journey to Identifying AI Agent Use Cases

  1. Define Business Objectives: Start with clear understanding of your strategic priorities and performance gaps
  2. Leverage Data Intelligence: Use process mining and GAF analysis to identify patterns and opportunities in your actual operations
  3. Apply Strategic Evaluation: Plot potential use cases on the value-feasibility matrix to create a balanced portfolio
  4. Execute with Precision: Focus on the top 3 use cases from your "Do Now" quadrant for initial implementation

 

While this is one approach that has worked with our customers, we’d like to learn what else has worked for you.  If you have questions or thoughts, feel free to drop them in the comments—we’ll respond or update the article as needed. And if you found this helpful, please share your feedback or link to it on your preferred platform.

 

This is just the beginning of our series on AI —stay tuned for more!

 

PS: Views are my own and do not represent my team, employer, partners, or customers.

 

 

“Early” Implementation FAQs: Beyond Use Case Selection

 

This section outlines common “early” implementation considerations beyond use case identification.

 

“What are some of the scoping considerations to document AI use cases?”

[Image: scoping considerations for documenting AI use cases]

 

This is how our internal teams at ServiceNow prioritize AI use cases.

[Image: how ServiceNow internal teams prioritize AI use cases]

 

“Should we follow the ServiceNow AI feature roadmap sequence, or can we jump straight to AI Agents?”

A lot of customers assume you need to crawl → walk → run:

Crawl: Fulfiller Persona - Now Assist OOTB skills (case summarization, knowledge generation, etc.)
Walk: Requestor Persona - Virtual Agent (AI chat) on the Portal with AI Search and Conversational Catalog
Run: AI Agents for exponential value


A real example that challenges this conventional approach: one customer receives 70% of cases via email. Following the "roadmap" would mean building chat/portal experiences that their users would find hard to transition to (OCM!).

Instead, start with an AI Agent that intercepts emails, processes them, and routes only complex cases to humans. Solve the actual problem first, then transition over time to portal and virtual agent experiences.

The right approach:
→ Identify your North Star business goal or problem
→ Work backwards to the solution
→ Build the right foundation that serves that goal.

You still need to crawl - walk - run, but towards your business goals.

Your business goal or problem dictates the path, not ServiceNow's feature/roadmap sequence.

Stop following product roadmaps. Start following results.

 

 

“Should we just turn on OOB AI Agents?”

Yes, if they meet your business objectives. Whenever you have a business objective, your first step is to check whether there’s an OOB AI agent that meets that goal. If yes, turn it on. However, be intentional and do not turn on OOB AI agents just because they are available. Even the smallest AI Agent consumes a considerable number of assists ($), and that consumption builds up over time with executions.
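To make that point concrete, here is a back-of-the-envelope sketch of cumulative assist consumption. Every figure in it is a hypothetical placeholder; substitute the execution volumes and assist rates from your own licensing agreement.

# Back-of-the-envelope assist consumption for one always-on OOB AI agent.
# All figures are hypothetical placeholders; use your actual contract rates.
executions_per_month = 4_000   # hypothetical monthly agent executions
assists_per_execution = 2      # hypothetical assists consumed per execution
months = 12

total_assists = executions_per_month * assists_per_execution * months
print(f"~{total_assists:,} assists consumed over {months} months")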

 

“What are the critical success factors to deliver ROI, beyond identifying the right use case?”

Executive Alignment: The Foundation of Success

The most successful AI agent programs we've observed share common characteristics in their executive approach:

  • Business Strategy Integration: AI agents are positioned as enablers of broader digital transformation initiatives, not standalone technology projects.
  • Clear ROI Frameworks: 63% of organizations plan to increase AI investment by 2026, but successful ones establish clear measurement criteria upfront.
  • Resource Commitment: Beyond technology costs, successful implementations require dedicated change management, training, and ongoing optimization resources.

Avoiding the Science Experiment Trap

The difference between strategic AI agent deployment and expensive experimentation lies in disciplined execution:

  • Start with Proven Patterns: Build initial capabilities around well-established use cases before attempting innovation.
  • Time-boxed Pilots: Establish clear timelines and success criteria for proof-of-concept phases.
  • Business Value Focus: Prioritize operational impact over technical sophistication.
  • Iterative Learning: Plan for multiple deployment cycles with progressive complexity increases.

Organizational Capability Building

AI will create 500,000 net new jobs by 2025, but capturing this value requires deliberate skill development:

  • Platform and AI Expertise: Deep ServiceNow platform capabilities are foundational; the newer AI skills are the future.
  • Process Analysis: The ability to map, measure, and optimize workflows is critical for identifying automation opportunities.
  • Change Management: Organizational transformation skills are essential for adoption success.
  • Data Analysis: Interpreting process mining and performance data drives continuous improvement.

Adoption and Change Management: The Human Element

Technical implementation represents perhaps 30% of AI agent deployment success. The remainder depends on organizational adoption:

  • User-Centered Design: Include end-users in workflow design and testing phases to ensure practical utility.
  • Comprehensive Training: Develop education programs that address both new workflows and underlying technology concepts.
  • Communication Strategy: Maintain transparent, consistent messaging about changes and expected benefits.
  • Feedback Integration: Establish mechanisms for continuous user input and rapid iteration.
  • Success Amplification: Recognize and publicize early wins to build organizational confidence.

 

 

Appendices

 

Appendix A: A-Z framework

The A-Z brainstorming framework is a systematic creative thinking technique that uses each letter of the alphabet as a prompt to generate ideas around a specific topic or problem. Here's how it works:

Basic Process: You write the letters A through Z vertically on a page, then challenge yourself to come up with at least one idea, solution, or relevant concept for each letter related to your brainstorming topic.
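If you facilitate the session with a shared document or spreadsheet, a few lines of code can generate the blank A-Z template for participants to fill in; the header topic below is just a placeholder.

import string

# Print a blank A-Z brainstorming template to paste into a shared doc
# or spreadsheet; the header topic is a placeholder to replace.
print("A-Z brainstorm: AI Agent use cases\n")
for letter in string.ascii_uppercase:
    print(f"{letter}: ")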

Benefits:

  • Forces comprehensive thinking by requiring 26 different approaches
  • Prevents getting stuck on obvious solutions
  • Encourages exploration of unusual angles (especially challenging letters like X, Y, Z)
  • Provides structure to otherwise chaotic brainstorming
  • Helps overcome mental blocks by giving concrete prompts

Tips for Success:

  • Don't worry about quality initially - focus on quantity
  • Allow creative interpretation of letters (X can be "eXtra" or "eXcellent")
  • Use it as a starting point, then develop the most promising ideas further
  • Consider doing multiple rounds with different perspectives on the same topic

This framework is particularly useful when you need to generate a large number of ideas quickly or when traditional brainstorming feels stuck or repetitive.

 

 

 

 
