I need a script for latest communication date
02-26-2024 02:30 AM
I need a background script for Latest Communication Date that gives me the host name of any computer whose Latest Communication Date has been updated more than 5 times.
02-26-2024 03:04 AM
Not sure if I completely understand your requirement, but maybe this can help:
var computerIdsWithFrequentUpdates = [];
var grHistory = new GlideRecord('sys_history_line');
grHistory.addQuery('fieldname', 'latest_communication_date'); // Field to check for updates (verify the column name on your instance)
grHistory.addQuery('tablename', 'cmdb_ci_computer'); // Table where the field resides
grHistory.orderBy('set.id'); // set.id is the sys_id of the record the history lines belong to, so this groups the lines per computer
grHistory.query();
var currentRecordId = '';
var updateCount = 0;
while (grHistory.next()) {
    var recordId = grHistory.set.id.toString();
    if (recordId == currentRecordId) {
        // Another change to the same computer
        updateCount++;
    } else {
        // Moved on to the next computer; evaluate the previous one first
        if (updateCount > 5) {
            computerIdsWithFrequentUpdates.push(currentRecordId);
        }
        currentRecordId = recordId;
        updateCount = 1;
    }
}
// Don't forget the last computer in the result set
if (updateCount > 5 && computerIdsWithFrequentUpdates.indexOf(currentRecordId) == -1) {
    computerIdsWithFrequentUpdates.push(currentRecordId);
}
gs.info('Computer IDs with more than 5 updates to Latest Communication Date: ' + computerIdsWithFrequentUpdates.join(', '));
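If walking the history lines turns out to be too slow, a GlideAggregate on the audit table could be a lighter alternative. This is just an untested sketch; it assumes the field is audited and that latest_communication_date is the actual column name on cmdb_ci_computer:
// Count audit entries per computer for the latest_communication_date field
var ga = new GlideAggregate('sys_audit');
ga.addQuery('tablename', 'cmdb_ci_computer');
ga.addQuery('fieldname', 'latest_communication_date');
ga.addAggregate('COUNT');
ga.groupBy('documentkey'); // documentkey holds the sys_id of the audited record
ga.query();
while (ga.next()) {
    var changeCount = parseInt(ga.getAggregate('COUNT'), 10);
    if (changeCount > 5) {
        gs.info('Computer ' + ga.getValue('documentkey') + ' had ' + changeCount + ' changes to Latest Communication Date');
    }
}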
Please mark any helpful or correct solutions as such. That helps others find their solutions.
Mark
02-26-2024 03:22 AM
Hey @Mark Manders
This script is not working for me. The error shown is:
[0:00:00.011] Compacting large row block (file.write: sys_history_line 10000 rows 160000 saveSize)
[0:00:00.004] Expanding large row block (file.read: sys_history_line, 10000 rows, 160000 dataSize)
[0:00:00.006] Compacting large row block (file.write: sys_history_line 4744 rows 75904 saveSize)
[0:00:00.162] Compacting large row block (file.write: sys_history_line 10000 rows 11815160 saveSize)
[0:00:00.001] Expanding large row block (file.read: sys_history_line, 4744 rows, 75904 dataSize)
[0:00:00.075] Expanding large row block (file.read: sys_history_line, 10000 rows, 11815160 dataSize)
[0:00:00.005] Compacting large row block (file.write: sys_history_line 4744 rows 188400 saveSize)
[0:00:00.003] Expanding large row block (file.read: sys_history_line, 4744 rows, 188400 dataSize)
[0:00:00.107] Compacting large row block (file.write: sys_history_line 10000 rows 11815160 saveSize)
02-27-2024 07:20 AM
Try doing it with extra filtering on the data; you are fetching too much. Try it on one computer first to see what it returns. And as I said, it could be that I am misunderstanding your question.
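For example, something like this; the sys_id below is just a placeholder you would swap for one of your own computers:
// Same filters as above, but limited to a single computer while testing
var testComputerSysId = 'replace_with_a_cmdb_ci_computer_sys_id'; // placeholder value
var grHistory = new GlideRecord('sys_history_line');
grHistory.addQuery('fieldname', 'latest_communication_date');
grHistory.addQuery('tablename', 'cmdb_ci_computer');
grHistory.addQuery('set.id', testComputerSysId); // only the history lines for this one record
grHistory.query();
gs.info('History lines found for this computer: ' + grHistory.getRowCount());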
Please mark any helpful or correct solutions as such. That helps others find their solutions.
Mark