02-07-2022 10:09 AM
Hello - I have recently been tasked with building the reporting process for our team and our leaders. I'm at a loss as to where to start, since Performance Analytics is very overwhelming. Looking for suggestions on where to begin.

02-07-2022 01:47 PM
Hi Denise,
You've received some good replies already, but I wanted to provide some additional resources. I'm part of the PA and Reporting product team.
If you are looking for a starting point for training, we have a great (free!) Performance Analytics Essentials self-paced class. It takes around 90 minutes to complete, and provides an excellent overview of what PA is and how it works.
In addition to what has already been shared, I highly recommend starting from the Performance measurement and analytics section of the Customer Success Center. If the entire workbook seems daunting, the FAQ link alone (a 2-page document) is a great starting point.
Since you want to look at this as a program, I also recommend our white paper on Governance in Performance Analytics and Dashboards. It provides good practice guidance based on real-world customer experience.
I also want to clarify that it is possible to get historical scores on past data, with some important caveats. PA scores are intended to be driven by dates within records, as opposed to field state changes. For example, to identify which incidents were open on a given day, the OOTB condition checks each record's dates against "Today."
When PA collects scores for a day, the day being collected replaces "Today" in that condition. This means you can look back historically and get an accurate count of the number of open incidents on any previous day. The caveat applies to some breakdowns of this score. Say you want to know the count of open incidents broken out by assignment group. That breakdown is not reliable if you collect scores today for incidents open last month, because the assignment group has very likely changed on several incidents over time. The main count of open incidents, however, should remain accurate.
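The logic above can be sketched in a few lines of Python. This is a simplified illustration, not the actual PA engine: the field names (`opened`, `resolved`, `assignment_group`) mirror the ITSM incident table, and the data is made up. The point is that the open/closed dates are fixed history, so a retroactive count is reliable, while a breakdown field like assignment group only ever holds its *current* value.

```python
from datetime import date
from collections import Counter

# Hypothetical incident records. Opened/resolved dates are immutable history,
# but assignment_group reflects only the current assignment (it can change).
incidents = [
    {"opened": date(2022, 1, 3),  "resolved": date(2022, 1, 20), "assignment_group": "Network"},
    {"opened": date(2022, 1, 10), "resolved": None,              "assignment_group": "Service Desk"},
    {"opened": date(2022, 1, 15), "resolved": date(2022, 2, 1),  "assignment_group": "Service Desk"},
]

def open_on(day):
    """Incidents open on `day`: opened on or before it, and not yet resolved then.

    This is the collected day standing in for "Today" in the condition.
    """
    return [i for i in incidents
            if i["opened"] <= day and (i["resolved"] is None or i["resolved"] > day)]

# Accurate even retroactively: the dates alone tell us what was open on Jan 16.
print(len(open_on(date(2022, 1, 16))))  # 3

# The breakdown is only as trustworthy as the current field values: if an
# incident was reassigned since Jan 16, its count lands on the wrong group.
print(Counter(i["assignment_group"] for i in open_on(date(2022, 1, 16))))
```

Running the count for a later day (say, February 2) correctly drops the two resolved incidents, because only the dates matter; no snapshot of past field states is needed for the headline number.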
I hope this helps get you started! Please reach out to your ServiceNow account team if you'd like additional guidance!
Dan Kane
02-07-2022 10:47 AM
This community forum is a good starting place. https://community.servicenow.com/community?id=community_forum&sys_id=b5291a2ddbd897c068c1fb651f9619e2

02-07-2022 11:11 AM
Are the reports for "what does this thing look like right now?" (standard reporting) or "how has this thing been trending over time?" (performance analytics), or both?
02-07-2022 11:26 AM
It is both: standard reporting and trend reporting. We only have approximately 8 months of data, so some trending can be captured, but we can't start comparison reporting until next year.

02-07-2022 11:52 AM
I'm no PA expert, so take this with a grain of salt, but I don't think you can reach back into the past with a new PA config. The config takes regular "snapshots" (my term, not SN's) of the state of the thing you want to monitor, and is therefore able to show the trend. But it starts on the day you configure it, not eight months ago. If anyone else has a trick up their sleeve for this, by all means pitch in.

The other thing about PA is that in most cases it is a separate subscription, so you'll want to check your licensing before you start going down that road.
I also can't be prescriptive about what you should build because I don't know what you want to measure, but here are some general suggestions:
- First, find out what the team and leaders are trying to achieve with the reports. What purpose will each report serve?
- Once you know the purpose, review some of the following with the stakeholders:
  - What conditions apply? For example, would you like to see only open incidents (if we're talking ITSM here; you didn't specify), or do you need to see all incidents? Do you need to see just your group's incidents, or the whole shebang?
  - What's the best fit from a visualization perspective? Would a bar chart work, such as showing open incidents by status (e.g., one bar for New, another for Work in Progress, another for Awaiting Info, etc.)?
  - How will the report be consumed? Will the consumers see it as a widget on a dashboard, or will it be sent to them as a PDF via email on a regular basis? Both?
Once you get those answers, you essentially have 90% of the configuration of the report. Be iterative about it; take a first swing at it with your 90% information, and present it to the stakeholders. They'll give you the other 10% at that point.