04-23-2024 10:31 PM - edited 04-23-2024 10:45 PM
I have added setLimit as well, but I am still getting "Transaction cancelled - Available memory is almost depleted". Please suggest a fix. After the query, my total record count is 28k and the childGroup length is approximately 26k, but when I run it in a background script it throws this error.
Note: it works fine if I add setLimit(1000).
var gr = new GlideRecord('cmdb_rel_ci');
gr.addEncodedQuery('child.category=Hardware^parent.install_statusNOT IN111,14^child.install_statusNOT IN111,14');
gr.setLimit(10000);
gr.query();

var childGroup = {}; // Object to store child records and their parent details
while (gr.next()) {
    var childSysId = gr.child.sys_id.toString();
    var parentDisplayValue = gr.parent.getDisplayValue();
    // Convert GlideElement values to strings so each entry keeps its own value
    var parentOwnedBy = gr.parent.owned_by.toString();
    var childOwnedBy = gr.child.owned_by.getDisplayValue();
    var childCorrelationId = gr.child.correlation_id.toString();

    if (!childGroup[childSysId]) {
        childGroup[childSysId] = {
            parentOwnedBy: parentOwnedBy,
            childCorrelationId: childCorrelationId,
            childOwnedBy: childOwnedBy
        };
    }
}
gs.info("childGroup Length123: " + Object.keys(childGroup).length);
06-11-2024 04:22 AM
Hi,
Instead of running it in a background script, try running it in a scheduled job.
If my response helped you, please click on "Accept as solution" and mark it as helpful.
Thanks
Suraj
06-11-2024 04:24 AM
Hi @surajchacherkar,
I am new to ServiceNow. Will a scheduled job handle a large amount of data?
Thanks!
06-11-2024 04:30 AM
Hi @Community Alums,
Yes, you can use either a scheduled job or a script action; both will work fine. Make sure to use setLimit() to process your records in batches, as in the sketch below.
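Here is a minimal sketch of that batching pattern, assuming it runs as the script of a Scheduled Script Execution; the 1,000 batch size, the sys_id windowing, and the variable names are illustrative rather than anything from this thread. If the childGroup object itself is what exhausts memory, process each batch inside the loop instead of collecting everything first.

var BATCH_SIZE = 1000; // illustrative batch size, not from the original post
var lastSysId = '';
var totalRows = 0;
var childGroup = {};

while (true) {
    var gr = new GlideRecord('cmdb_rel_ci');
    gr.addEncodedQuery('child.category=Hardware^parent.install_statusNOT IN111,14^child.install_statusNOT IN111,14');
    if (lastSysId) {
        gr.addQuery('sys_id', '>', lastSysId); // continue after the last record of the previous batch
    }
    gr.orderBy('sys_id');    // stable ordering so the sys_id window works
    gr.setLimit(BATCH_SIZE); // only load a small batch per query
    gr.query();

    var rowsInBatch = 0;
    while (gr.next()) {
        rowsInBatch++;
        totalRows++;
        lastSysId = gr.getUniqueValue();

        var childSysId = gr.child.sys_id.toString();
        if (!childGroup[childSysId]) {
            childGroup[childSysId] = {
                parentOwnedBy: gr.parent.owned_by.toString(),
                childCorrelationId: gr.child.correlation_id.toString(),
                childOwnedBy: gr.child.owned_by.getDisplayValue()
            };
        }
    }

    if (rowsInBatch < BATCH_SIZE) {
        break; // last batch processed
    }
}

gs.info('Relationships processed: ' + totalRows + ', unique children: ' + Object.keys(childGroup).length);

The sys_id window plus orderBy keeps each query small, so no single query has to load all 28k relationship rows at once.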
If my response helped you, please click on "Accept as solution" and mark it as helpful.
Thanks
Suraj