01-17-2025 04:48 AM
Hi,
I have added the Gauge Data Visualization component in UI Builder, and it takes a long time to load (25~30 seconds) when I preview it.
Below is the customization I have done in UI Builder.
/**
 * @param {params} params
 * @param {api} params.api
 * @param {TransformApiHelpers} params.helpers
 */
function evaluateProperty({api, helpers}) {
    // Map the data resource output (the Coverage percentage) into the
    // single-value payload format expected by the Gauge visualization.
    var data = [{
        "data": [{
            "value": api.data.cmdb_ci_governance_server_1.output.Coverage[0].Percentage,
            "change": "0",
            "changePercent": "0"
        }],
        "metadata": {
            "eventData": {
                "indicatorSysid": ""
            },
            "dataSourceLabel": "",
            "filterQuery": "",
            "aggregate": {
                "fieldType": "decimal"
            },
            "format": {
                "unitFormat": "{0}",
                "frequency": "daily",
                "precision": 0
            }
        }
    }];
    return data;
}
Data Resource - Transform:
[
{
"name": "ciType",
"label": "CI Type",
"description": "Type of the CI",
"readOnly": "false",
"fieldType": "string",
"mandatory": true,
"defaultValue": ""
},
{
"name": "usedByOrg",
"label": "Used By Org",
"description": "Used By Org",
"readOnly": "false",
"fieldType": "string",
"mandatory": true,
"defaultValue": ""
}
]
Script:
function transform(input) {
    var start = new Date().getTime();
    var usedByOrgVal = input.usedByOrg;
    var arrFinCount = [];

    // Count cmdb_ci_server records matching the given encoded query.
    function filterList(arrVal) {
        var cal = new GlideRecord("cmdb_ci_server");
        cal.addEncodedQuery(arrVal);
        cal.query();
        return cal.getRowCount();
    }

    // Indices: 0 = active baseline, 1 = owned, 2 = in coverage, 3 = complete, 4 = correct,
    // 5 = out of coverage, 6 = not complete, 7 = not correct, 8 = active server count.
    if (usedByOrgVal == "All") {
        arrFinCount = [
            filterList("operational_status!=6"),
            filterList("operational_status!=6^owned_by!=null"),
            filterList("operational_status!=6^u_in_coverage=Yes"),
            filterList("operational_status!=6^u_complete=Yes"),
            filterList("operational_status!=6^u_correct=Yes"),
            filterList("operational_status!=6^u_in_coverage=No"),
            filterList("operational_status!=6^u_complete=No"),
            filterList("operational_status!=6^u_correct=No"),
            filterList("nameISNOTEMPTY^operational_status!=6^u_resp_orgISNOTEMPTY^NQnameISNOTEMPTY^operational_status!=6^u_resp_orgISEMPTY^assetISEMPTY")
        ];
    } else {
        arrFinCount = [
            filterList("operational_status!=6^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^owned_by!=null^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_in_coverage=Yes^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_complete=Yes^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_correct=Yes^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_in_coverage=No^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_complete=No^owned_by.u_program=" + usedByOrgVal),
            filterList("operational_status!=6^u_correct=No^owned_by.u_program=" + usedByOrgVal),
            filterList("nameISNOTEMPTY^operational_status!=6^u_resp_orgISNOTEMPTY^NQnameISNOTEMPTY^operational_status!=6^u_resp_orgISEMPTY^assetISEMPTY^owned_by.u_program=" + usedByOrgVal)
        ];
    }

    // Percentage of a count over the active baseline; "0.00" when the baseline is 0.
    function pct(count) {
        var p = ((count / arrFinCount[0]) * 100).toFixed(2);
        return isNaN(p) ? "0.00" : p;
    }

    var obj = {
        Ownership: [{
            Value: arrFinCount[1],
            Percentage: pct(arrFinCount[1])
        }],
        Coverage: [{
            InCoverage: arrFinCount[2],
            OutOfCoverage: arrFinCount[5],
            Percentage: pct(arrFinCount[2])
        }],
        Completeness: [{
            Complete: arrFinCount[3],
            NotComplete: arrFinCount[6],
            Percentage: pct(arrFinCount[3])
        }],
        Correctness: [{
            Correct: arrFinCount[4],
            NotCorrect: arrFinCount[7],
            Percentage: pct(arrFinCount[4])
        }],
        ActiveServerCount: [{
            Count: arrFinCount[8]
        }]
    };

    var end = new Date().getTime();
    var time = end - start;
    gs.info('CMDB: Execution time Server: ' + time);
    return obj;
}
01-19-2025 11:59 PM - edited 01-20-2025 12:01 AM
1. Create a new field of type JSON on the table.
2. Read how to create a scheduled job (https://www.servicenow.com/docs/bundle/xanadu-platform-administration/page/administer/reference-page...).
3. In the scheduled job, store the response object from the script into that JSON field (a sketch follows this list).
4. On page load, query that field, parse the response, and use it in your components.
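To make step 3 concrete, here is a rough sketch of what the Scheduled Script Execution could look like. The table u_cmdb_kpi_cache, its fields u_used_by_org and u_kpi_data, the list of orgs, and the helper computeServerKpis() are illustrative placeholders only (computeServerKpis() stands for the counting logic from the transform script above); adjust all names to your instance.
// Scheduled Script Execution (sketch) -- pre-compute the KPI object per org and cache it.
// Assumed names: table u_cmdb_kpi_cache, fields u_used_by_org (string) and u_kpi_data (JSON).
var orgs = ["All", "Org A", "Org B"]; // hypothetical dropdown values to pre-compute
orgs.forEach(function (org) {
    var result = computeServerKpis(org); // placeholder for the counting logic in transform()
    var cache = new GlideRecord("u_cmdb_kpi_cache");
    cache.addQuery("u_used_by_org", org);
    cache.query();
    if (cache.next()) {
        cache.setValue("u_kpi_data", JSON.stringify(result));
        cache.update();
    } else {
        cache.initialize();
        cache.setValue("u_used_by_org", org);
        cache.setValue("u_kpi_data", JSON.stringify(result));
        cache.insert();
    }
});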
01-17-2025 11:49 PM
Does the customer need precise numbers at the moment a dropdown option is selected, or can each selection show a rough estimate? What I'm trying to achieve is to move that logic into a scheduled job so the results of both query arrays are stored in a JSON field somewhere on your table. You can then simply query that one field and display the data without any computation. The question is how fresh that data needs to be, because a heavy job could slow down the instance for everyone.
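For example, the data resource transform could then shrink to a single record lookup plus a JSON.parse, assuming the same hypothetical u_cmdb_kpi_cache table and u_kpi_data field sketched above:
// Data resource transform (sketch) -- read the pre-computed object instead of counting live.
// u_cmdb_kpi_cache / u_used_by_org / u_kpi_data are the same assumed names as above.
function transform(input) {
    var cache = new GlideRecord("u_cmdb_kpi_cache");
    cache.addQuery("u_used_by_org", input.usedByOrg || "All");
    cache.query();
    if (cache.next()) {
        return JSON.parse(cache.getValue("u_kpi_data"));
    }
    return {}; // no cached row yet (e.g. before the first scheduled run)
}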
01-18-2025 12:45 AM
So each time a dropdown option is selected, the data resource runs and the query is updated with the selected option. For example, this line from the code:
"operational_status!=6^owned_by.u_program=" + usedByOrgVal
Here usedByOrgVal holds the dropdown option.
Also, when the dropdown option is "All" (the initial case when the page loads), the query runs without the usedByOrgVal value, for example this line from the code:
"operational_status!=6"
How can we move the logic into a scheduled job? Can we try doing that approach?
01-18-2025 01:47 AM
Before moving to that approach, please read the documentation on how scheduled jobs work if you are not familiar with them; maybe that functionality is not what you want.
What I would try first is to query all of the data immediately on page load and store it in state. Then I would use a client script to determine which data from state is loaded into the components (see the sketch below).
You will have one big query at the beginning, but switching between values afterwards should be smooth.
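A rough sketch of that idea, assuming the data resource is changed to return results for all orgs in one object keyed by org name, the data resource id cmdb_ci_governance_server_1 from the question, and a client state parameter named kpiByOrg defined on the page; the event payload shape and all names are illustrative only:
// Client script 1 (sketch): after the data resource has loaded, cache its full output in state.
function handler({api, event}) {
    api.setState('kpiByOrg', api.data.cmdb_ci_governance_server_1.output);
}

// Client script 2 (sketch): on dropdown change, pick the pre-loaded slice from state.
function handler({api, event}) {
    var selected = event.payload.value; // exact payload shape depends on the dropdown component
    var all = api.state.kpiByOrg || {};
    api.setState('selectedKpi', all[selected] || all['All']);
}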
01-18-2025 03:31 AM
Please let me know if my understanding is correct.
Step 1: Trigger the data resource on page ready and store the object output into a client state parameter using a client script.
Step 2: Use the client state parameter value in the data visualization.
Below are snapshots of Step 1 (the client script) and of the client state parameter being called in the data visualization.
It is still taking 18~20 seconds to load using this approach. Please let me know if this is the approach you were explaining.
I believe the code in the data resource is taking a long time because of the data pulled by the queries.
01-18-2025 04:25 AM - edited 01-18-2025 04:30 AM
You have 3 choices:
1. Leave it as it is, and each dropdown option will take ~20 seconds to load.
2. Perform one big query that pulls ALL of the IF and ELSE queries together, cache the result into client state, and then pull the data from state according to the selected dropdown option. I would suggest adding a loading indicator so the user cannot interact with the page until it has loaded.
3. Create a scheduled job that runs once in a while to perform both queries and store the result into a JSON field. PLEASE NOTE: I would be very careful with scheduled jobs that are heavy on computation, as they can slow down the entire instance for EVERYONE, so make sure you trigger the scheduled job as rarely as possible and that you test the performance. Once the data is available in your JSON field, all you need to do is query that field to get the data, which should be very fast. The biggest downside of this approach is that you don't have fresh data until the next run of the scheduled job.