Script help to delete a large data set using setLimit() & deleteMultiple()

Met_Ish
Kilo Contributor

Hello Community,

I have a large data set to delete from the audit (history) table to free up DB space. We have identified the data that needs to be deleted; the count is north of 100 million records.

To be clear, HI will not delete this data from the backend, so it's up to us.

The challenge is that when using deleteMultiple(), setLimit() is ignored, so the delete runs unchecked. I can use setLimit() with deleteRecord(), but that has more performance impact and takes longer.

 

I need a way to use setLimit() in conjunction with deleteMultiple() and iterate through until all the records from my query are deleted. For example:

My query has 1 million matched records; I want to delete only 10,000 at a time until all 1 million are deleted.

// Code below deletes 10,000 at a time, but is not efficient

var gr = new GlideRecord(table);
gr.addEncodedQuery(query); // has a million matches
gr.setLimit(10000); // note: JavaScript numeric literals can't contain commas — setLimit(10,000) would pass two arguments
gr.query();
while (gr.next()) {
    gr.deleteRecord();
}

// Code below ignores the 10k limit and deletes all 1 million in one shot (potential performance impact). How do I structure my code so that deleteMultiple() honors the setLimit() of 10k and iterates through my million matches until they're all deleted?

var gr = new GlideRecord(table);
gr.addEncodedQuery(query); // has a million matches
gr.setLimit(10000); // ignored by deleteMultiple()
gr.query();
gr.deleteMultiple();
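In other words, the control flow I'm after looks like this. Sketched here with a plain-JS stand-in for the matched record set (an array, not a real GlideRecord) so the loop logic itself can be sanity-checked outside an instance:

```javascript
// Plain-JS sketch of the batch-until-empty control flow.
// "table" stands in for the set of matched records; the splice() stands in
// for a setLimit(batchSize) query followed by deleteMultiple().
function deleteInBatches(table, batchSize) {
  var batches = 0;
  while (table.length > 0) {
    var batch = table.splice(0, batchSize); // take up to batchSize records
    // in ServiceNow, this is where the limited query + deleteMultiple() would go
    batches++;
  }
  return batches;
}

// 25 mock records in batches of 10 -> 3 passes (10, 10, 5)
var mockTable = new Array(25).fill(0);
deleteInBatches(mockTable, 10); // 3
```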

3 REPLIES

TrevorK
Kilo Sage

Have you looked into Table Cleanup Policies?

https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0694151

Table cleanup runs without the use of GlideRecord lookups (no business rule execution, no workflow, etc.) and is supposed to be pretty quick.

This one has been on my radar as a good practice if we have any business units that want to have limited retention of records, but so far everyone says "keep it forever".

 

Otherwise, when you use deleteRecord(), could you not just do this:

gr.setWorkflow(false);
gr.autoSysFields(false);
gr.deleteRecord();

To speed it up? By bypassing the workflow / business rules / audit, I would assume the deletes would run considerably faster.

Met_Ish
Kilo Contributor

Unfortunately, table cleanup doesn't work for tables on table rotation (sys_audit is one of them), otherwise that would have been the preferred method.

 

I was looking at it from the lens of deleteRecord() running in a while loop, hence it wouldn't make much difference (especially as the audit table has no BRs to run on delete).

Below is what I've come up with using deleteMultiple(), and it seems to be working fine with much improved performance: write the sys_ids I want into an array, then query them and delete in one shot.

for (var i = 0; i < tables.length; i++) { // iterate through the table names

    var recordsToBeDel = [];

    // get up to 5,000 matching records and collect their sys_ids
    var gr = new GlideRecord(tables[i]);
    gr.addEncodedQuery(query);
    gr.setLimit(5000);
    gr.query();
    while (gr.next()) {
        recordsToBeDel.push(gr.getValue("sys_id"));
    }

    deleteBatch(tables[i], recordsToBeDel); // delete the collected records in one shot
}

// note: 'delete' is a reserved word in JavaScript, so the function needs another name
function deleteBatch(table, sysIds) {
    var gr = new GlideRecord(table);
    gr.addEncodedQuery("sys_idIN" + sysIds.join(","));
    gr.query();
    gr.deleteMultiple();
}
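The sys_idIN step is just string building, so it can be checked stand-alone. Here's a quick sketch of the chunk-and-join logic (the ids are made-up placeholders, not real sys_ids), in case anyone wants to split one big id array into several encoded queries instead of one:

```javascript
// Split an array of sys_ids into fixed-size slices and build the
// "sys_idIN..." encoded query string for each slice.
function toBatchQueries(sysIds, batchSize) {
  var queries = [];
  for (var i = 0; i < sysIds.length; i += batchSize) {
    queries.push("sys_idIN" + sysIds.slice(i, i + batchSize).join(","));
  }
  return queries;
}

var ids = ["id1", "id2", "id3", "id4", "id5"]; // placeholder ids
toBatchQueries(ids, 2);
// ["sys_idINid1,id2", "sys_idINid3,id4", "sys_idINid5"]
```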

stevejarman
Giga Guru

Stumbled on this thread. Here's a script that I wrote to address deleting a large number of records in batches. The idea being that you execute this as a background Fix Script. Honestly though, from what I researched, there doesn't seem to be a "good" way to do this in ServiceNow - especially when you're talking about a LOT of records.

 

var targetTable = "alm_hardware"; // WHATEVER YOUR TABLE IS
var batchId = 1;
var batchSize = 5000;
var recordsRemain = true;

while (recordsRemain) {
    var rs = new GlideRecord(targetTable);
    rs.addEncodedQuery("asset_tag=NULL^display_name=NULL^assigned_to=NULL^company=NULL"); // WHATEVER YOUR QUERY IS
    rs.setLimit(batchSize);
    rs.query();

    if (rs.getRowCount() < batchSize) {
        recordsRemain = false;
    }

    if (rs.getRowCount() != 0) {
        gs.log("Began processing batch " + batchId, targetTable + "_Batch_Deletion");

        rs.setWorkflow(false);
        rs.deleteMultiple();

        batchId++;
    }
}
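If it helps anyone reading later, the termination logic above can be checked with a quick mock: the Math.min() stands in for what a setLimit() query's getRowCount() would return, and subtracting stands in for deleteMultiple(). It also covers the edge case where the total is an exact multiple of the batch size (one extra empty pass, no extra delete):

```javascript
// Mock of the batch-deletion loop's termination logic.
function countBatches(totalRecords, batchSize) {
  var remaining = totalRecords;
  var batches = 0;
  var recordsRemain = true;
  while (recordsRemain) {
    var rowCount = Math.min(remaining, batchSize); // what the limited query would return
    if (rowCount < batchSize) {
      recordsRemain = false; // last (partial or empty) batch
    }
    if (rowCount !== 0) {
      remaining -= rowCount; // stands in for deleteMultiple()
      batches++;
    }
  }
  return batches;
}

countBatches(12000, 5000); // 3 deleting passes: 5000, 5000, 2000
countBatches(10000, 5000); // 2 deleting passes, plus one empty final pass
```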