Welcome! This article recaps the key insights from our Knowledge 25 session, "Data-Informed Design: How to make decisions while understanding your users," presented by Giu Vicente and Thais Lorenzini Bittencourt. We'll explore how to harness the power of usage data to make effective, actionable decisions for your work on the ServiceNow platform and beyond.
Why measure? The runner's parallel
Why is designing with data so crucial? Imagine training for a marathon. You wouldn't just start running aimlessly. You'd prepare, track your progress, and adjust your training based on what you learn. You monitor your pace, how you feel, your heart rate – all to ensure you're moving in the right direction to achieve your 42-kilometer goal.
Similarly, when building applications or portals, you have business goals. You're not just building for the sake of it. Measuring usage helps you understand if your work is hitting the mark. Without data, you're flying blind. Are users adopting that new feature? Are they achieving their goals efficiently? Data provides the answers, helping you make better decisions, prioritize effectively, and ultimately, build better experiences. We're not talking about vanity metrics with big, meaningless numbers, but meaningful, actionable insights.
The 5-step framework for data-informed design
Here’s a five-step framework you can apply to your projects:
1. Identify the focus area
Is there something in your product you are curious about? A recent change you've made, or an improvement you believe is needed? The example in our session focused on the Service Portal, a common self-service experience. Let's say you've just implemented AI Search (Smart Search) within it.
2. Generate assumptions (and convert them into hypotheses)
This is a critical step. Every decision rests on assumptions; turning them into hypotheses makes those assumptions explicit and testable. A hypothesis is a clear, testable statement about what you believe will happen as a result of a product change. Without one, numbers are just numbers: they lack context and meaning.
Why are hypotheses so important?
- Smarter Decisions: Learn what truly works through data, not just opinions.
- Faster Iteration: Test and adjust quickly without overbuilding.
- Better Results: Focus on changes that genuinely impact users and key metrics.
How to write a good hypothesis:
Avoid vague statements like: "We think search is a key feature and will help users find what they are looking for faster."
Instead, use a clearer structure. Let’s start building a template. Here is our first version:
We believe that [the decision]
will [benefit]
for [users]
who want to [users’ job to be done]
For our Search example, a hypothesis could be:
We believe that the search bar in a highly visible position
will minimize the time to value
for Service Portal users
who want to quickly find knowledge articles and service catalog items.
3. Define the right metrics to validate your hypothesis
Once you have a hypothesis, how will you measure its validity? What does success look like? These are your success metrics.
Continuing with our AI Search example, if the hypothesis is that AI Search helps users find what they need quickly, we could add an extra layer to our hypothesis template and end it with:
as measured by the success rate of finding the correct item in the first or second search results
Now our template is in its final shape:
We believe that [the decision]
will [benefit]
for [users]
who want to [users’ job to be done]
as measured by [success metric]
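To make that success metric concrete, here is a minimal sketch of how it could be computed once the search interactions are exported for analysis. It is illustrative only: the record shape, field names, and sample values are assumptions made for this example, not UXA's actual data format.

```python
# Illustrative sketch: success rate of finding the correct item in the first
# or second search result. The records and field names are assumptions, not
# the actual UXA export format.
searches = [
    {"query": "reset vpn password", "clicked_rank": 1},     # success
    {"query": "request new laptop", "clicked_rank": 2},     # success
    {"query": "expense report",     "clicked_rank": 5},     # scrolled too far
    {"query": "badge replacement",  "clicked_rank": None},  # nothing clicked
]

# A search counts as successful if the user clicked the first or second result.
successes = sum(1 for s in searches if s["clicked_rank"] in (1, 2))
success_rate = successes / len(searches)
print(f"Search success rate: {success_rate:.0%}")  # 50% in this toy sample
```

Setting a target for this rate up front (for example, the share of searches resolved on the first or second result) gives the hypothesis a clear pass or fail criterion.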
4. Implementation and collection
This is the technical part of gathering the data. ServiceNow's User Experience Analytics (UXA) provides a powerful toolset for this, especially in Yokohama and later versions. Much of the data is available out-of-the-box, with no complex migration needed.
UXA is built on key components:
- Events: User actions like clicks, swipes, and menu selections. You can collect properties like screen type, link clicked, resolution, and language (see the sketch after this list).
- Users: Anonymous tracking of individual user interactions.
- Sessions: A period of continuous user activity, from login to logout.
- Pages: Tracking interactions and performance across different screens or views.
- Funnels: Visualizing the sequence of steps users take towards a goal, highlighting drop-offs.
- Cohorts: Grouping users by shared characteristics or behaviors to compare performance, engagement, and retention over time.
While many metrics are available out-of-the-box, custom tracking might require developer assistance to add specific code.
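To make these components more concrete, here is a hypothetical sketch of what a few collected events might look like and how they group into sessions. The field names and values are illustrative assumptions only, not the actual UXA schema.

```python
# Hypothetical, simplified event records; field names are illustrative only,
# not the actual UXA schema.
from collections import defaultdict

events = [
    {"user": "u-101", "session": "s-1", "type": "click",
     "target": "ai_search_bar",   "page": "sp_index",  "language": "en"},
    {"user": "u-101", "session": "s-1", "type": "click",
     "target": "search_result_1", "page": "sp_search", "language": "en"},
    {"user": "u-202", "session": "s-7", "type": "click",
     "target": "ai_search_bar",   "page": "sp_index",  "language": "pt"},
]

# Group events by session to reconstruct each user's journey in order.
journeys = defaultdict(list)
for event in events:
    journeys[event["session"]].append(event["target"])

for session_id, steps in journeys.items():
    print(session_id, "->", " > ".join(steps))
```

Thinking of the data this way (events that roll up into sessions, users, pages, funnels, and cohorts) makes it easier to decide which component answers which question.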
5. Analyze and act on the metrics
With data collected, it's time for analysis. Let's revisit our AI Search example and some potential metrics:
Metric Example: Search Success Funnel

A powerful way to measure search effectiveness is to define a funnel that represents the ideal user journey. For example:
- Step 1: The user logs in (indicates an active session start).
- Step 2: The user performs a search (interacts with the AI Search bar).
- Step 3: The user clicks the first result (indicates the search was precise and delivered relevant results quickly).

Analysis Example: A high completion rate for this funnel (e.g., a significant percentage of users who search also click a first-page result) would strongly support the hypothesis that AI Search is effective and delivering immediate value. Conversely, a low completion rate, or a high drop-off between steps 2 and 3, would indicate that users are not finding what they need on the first try, prompting further investigation into search relevance or result presentation. You can also slice the data with filters and conditions, or look at the search terms (queries) users entered, to learn even more.
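As a rough illustration of the funnel logic described above (UXA's funnel reports visualize this for you), here is a minimal sketch that counts how many sessions reach each step and computes the completion rate. The step names and session journeys are assumptions made for this example.

```python
# Minimal sketch of the funnel analysis described above. The step names and
# session journeys are illustrative assumptions, not UXA's data model.
FUNNEL = ["login", "search", "click_first_result"]

# Each entry is the ordered list of steps observed in one session.
sessions = [
    ["login", "search", "click_first_result"],
    ["login", "search"],                      # dropped off after searching
    ["login", "search", "click_first_result"],
    ["login"],                                # never searched
]

def reached(journey, step_index):
    """True if the journey contains the funnel steps up to step_index, in order."""
    position = 0
    for step in FUNNEL[: step_index + 1]:
        try:
            position = journey.index(step, position) + 1
        except ValueError:
            return False
    return True

for i, step in enumerate(FUNNEL):
    count = sum(reached(j, i) for j in sessions)
    print(f"Step {i + 1} ({step}): {count}/{len(sessions)} sessions")

completed = sum(reached(j, len(FUNNEL) - 1) for j in sessions)
print(f"Funnel completion rate: {completed / len(sessions):.0%}")  # 50% here
```

The drop-off between two consecutive steps is simply the difference between their counts; a large gap between the search step and the first-result click is exactly the signal described in the analysis example above.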
Based on your analysis, there are three primary types of actions you can take:
- Fix It: You've identified a clear issue (e.g., a high drop-off rate at a specific step in a funnel, or a common search term yielding no relevant results). Now you can ideate and implement a solution.
- Explore Deeper: Something looks off, but the cause isn't immediately obvious (e.g., users are clicking the third search result much more often than the first). You might need to define new metrics, segment your data further (e.g., by user role), or look at existing ones from a different angle.
- Talk to Users: Quantitative data tells you what is happening. Qualitative data (user interviews, usability testing) helps you understand why. If you notice strange user behavior, talk to them to understand their interpretation and intent. This is where user research is invaluable.
Conclusion: make data your co-pilot
Measuring isn't just about collecting numbers; it's about understanding the story behind them and using those insights to drive meaningful impact. By:
- Defining clear, measurable hypotheses,
- Using tools like User Experience Analytics to investigate these hypotheses from multiple perspectives, and
- Understanding the "why" behind the "what" by talking to your users,

you can ensure you're not just gathering data, but harnessing it to improve, innovate, and make data-driven decision-making a cornerstone of your success.
Resources for Your Journey
While we didn't include the hands-on exercise in this article, we want to share the resources from the session. You can download the UXA Cheat Sheet and Exercise Materials (PDF), which include the board and cards used. These can help you practice mapping hypotheses to metrics and understand when to use the different UXA components.
- For more details, refer to the official ServiceNow User Experience Analytics Documentation and the UXA Landing Page for Yokohama.
- To see UXA in action, check out this video from the Platform Analytics Academy: User Experience Analytics - April 2nd, 2025.
Thank you for joining us on this journey.