Optimizing a Recursive Script
06-11-2019 06:26 AM
Recently I submitted an HI case to ServiceNow titled "Optimizing a Recursive Script"; the response was less than satisfying.
I'm still not exactly sure what "max_rel_level" does, but I certainly don't want to return an "Impacted Accounts Number" to the user that is lower than the actual number of accounts affected by the outage.
A recursive query like this in any other database would return a result in milliseconds, and I'm not sure why ServiceNow feels this is acceptable. I'm hoping someone in the community might see a flaw in my script that will speed it up, or suggest a faster way to drill down to the edge customers.
The case read as follows:
CASE: I created a recursive script on the CI relationship table to find the number of impacted Accounts (Customers) on a network node. This is a small device (OLT) that only has 2,554 accounts. To get to them, I have to find devices that are children in the table, then find the children of each of those devices, and so on. Ultimately, I reach the accounts, which are then counted. I would expect a small count like this to return in milliseconds; instead it takes almost 5 seconds.
When I change the device to a "core" piece of equipment, it returns roughly the same number of accounts, but the number of recursive loops is greater and it takes more like 44 seconds. How can I optimize this to return the count faster?
SN RESPONSE: "I reviewed the script internally and you can add a limit to the number of CIs that are being tracked; below is an example that can be used. I also reviewed the SQL time for the script and there are no slow SQLs present."
SN SAMPLE CODE:
var maxLevel = parseInt(gs.getProperty('NotifyAffectedCI.max_rel_level'));
var maxCis = parseInt(gs.getProperty('NotifyAffectedCI.max_cis', '100000'));
// max_events determines how many cis may be tracked in the noNotify array
// increasing to a very large number ( > ~100K) might use too much memory
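The two properties in the sample hint at two standard guards for walking a relationship graph: a depth cap (max_rel_level) and a cap on how many CIs are tracked (max_cis). A plain-JavaScript sketch, not ServiceNow API, can show how both guards behave; here the graph is an in-memory parent-to-children map standing in for cmdb_rel_ci, every childless node stands in for an account, and all names are illustrative. It also shows why a depth cap can undercount, which is the concern raised above:

```javascript
// Sketch of a depth-capped, visited-guarded recursive count.
// graph: { parentId: [childId, ...] }; a node with no children is a "leaf"
// (standing in for an account record). Illustrative only, not ServiceNow API.
function countLeaves(graph, node, maxLevel, maxCis, visited, level) {
    visited = visited || new Set();
    level = level || 0;
    if (level > maxLevel || visited.size > maxCis) return 0; // stop descending
    if (visited.has(node)) return 0;                         // cycle guard
    visited.add(node);
    var children = graph[node] || [];
    if (children.length === 0) return 1;                     // leaf: count it
    var count = 0;
    for (var i = 0; i < children.length; i++) {
        count += countLeaves(graph, children[i], maxLevel, maxCis, visited, level + 1);
    }
    return count;
}
```

With a generous depth cap the full count comes back; with a cap shallower than the tree, whole subtrees are silently skipped and the count is too low, which is exactly the risk of tuning max_rel_level down for speed.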
MY RECURSIVE SCRIPT
var impactedAccounts = new Utility();
var returnCount = impactedAccounts.findImpactedAccounts('sys_id');
gs.print(returnCount);

findImpactedAccounts: function (parentCI) {
    var countAccount = 0;
    // get relationships where this CI is the parent
    var getChildren = new GlideRecord('cmdb_rel_ci');
    getChildren.addQuery('parent', parentCI);
    getChildren.query();
    while (getChildren.next()) {
        if (getChildren.child.sys_class_name == 'u_account') {
            countAccount++;
        } else {
            // recurse on the child's sys_id; getValue() passes a plain string
            // instead of a live GlideElement
            countAccount += this.findImpactedAccounts(getChildren.getValue('child'));
        }
    }
    return countAccount;
},
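A common alternative to one-query-per-node recursion is a breadth-first walk that fetches each whole level of the tree in one batched query (in GlideRecord terms, one addQuery('child', 'IN', ids) per depth level instead of one query per CI), which cuts the query count from thousands to the depth of the tree. A plain-JavaScript sketch over an in-memory edge list, with illustrative names and no ServiceNow API:

```javascript
// Breadth-first count: one batched "lookup" per level instead of one per node.
// edges: [{ parent: id, child: id }, ...]; isAccount: predicate on a child id.
function countAccountsBreadthFirst(edges, rootId, isAccount) {
    var count = 0;
    var frontier = [rootId];
    var seen = new Set(frontier); // guard against relationship cycles
    while (frontier.length > 0) {
        // one batched lookup for the whole level
        var children = edges
            .filter(function (e) { return frontier.indexOf(e.parent) !== -1; })
            .map(function (e) { return e.child; });
        frontier = [];
        children.forEach(function (child) {
            if (seen.has(child)) return;
            seen.add(child);
            if (isAccount(child)) count++; // account leaf: count it
            else frontier.push(child);     // non-account CI: descend next level
        });
    }
    return count;
}
```

The visited set also protects against cycles in cmdb_rel_ci, which would send a naive recursive version into an infinite loop.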

06-11-2019 11:18 AM
I don't have a good way to reproduce the slow query, but have you explored using GlideAggregate for this? It is much, much faster.
Your code might look like:
findImpactedAccounts: function (parentCI) {
    var returnCount = 0;
    // count the account children in a single aggregate query
    // instead of stepping through them one record at a time
    var accountCount = new GlideAggregate('cmdb_rel_ci');
    accountCount.addQuery('parent', parentCI);
    accountCount.addQuery('child.sys_class_name', 'u_account');
    accountCount.addAggregate('COUNT');
    accountCount.query();
    if (accountCount.next()) {
        returnCount = parseInt(accountCount.getAggregate('COUNT'), 10);
    }
    // recurse only into the non-account children
    var getChildren = new GlideRecord('cmdb_rel_ci');
    getChildren.addQuery('parent', parentCI);
    getChildren.addQuery('child.sys_class_name', '!=', 'u_account');
    getChildren.query();
    while (getChildren.next()) {
        returnCount += this.findImpactedAccounts(getChildren.getValue('child'));
    }
    return returnCount;
},
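For readers new to GlideAggregate: a COUNT aggregate pushes the counting into the database, collapsing many relationship rows into one counted row (or one row per distinct value when grouped by a field), so the script loop runs far fewer times. What a grouped count does can be sketched in plain JavaScript, with illustrative names and no ServiceNow API:

```javascript
// Sketch of a grouped COUNT: many rows in, one entry per distinct class out.
// rows: [{ childClass: 'u_account' }, ...]
function countByClass(rows) {
    return rows.reduce(function (acc, row) {
        acc[row.childClass] = (acc[row.childClass] || 0) + 1;
        return acc;
    }, {});
}
```

The database does this grouping in one pass, which is why replacing a per-record while loop with an aggregate query tends to be the single biggest win.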
06-12-2019 05:35 AM
Thanks, j, for the suggestion and the posted code.
I'll have some time later this afternoon to try this out and let you know if it speeds up.
06-12-2019 06:40 AM
I had some down time this morning and was able to apply your suggestion.
My 44-second response time came down to 25 seconds; definitely an improvement.