Delete multiple records on large tables

Luiz Lucena
Mega Sage

Hello everyone, hope all of you and your families are safe!

Here is the issue we are facing. We use the SCCM import to populate workstation data into ServiceNow.
One of the data sources is network information. It works fine, except it has created millions of Network Adapter records named NdisWan with no IP address; each computer has more than 100 of these adapters, each with a different MAC address.

We are trying to implement what is described here:
https://community.servicenow.com/community?id=community_question&sys_id=a5b10f69db98dbc01dcaf3231f96...

But first, we need to clean house. We tried deleting multiple records using a background script (like the one mentioned at ServiceNow Elite).

However, the script took too long on our DEV instance: after more than 3 hours I cancelled it, and it had only deleted around 25,000 records. The instance also became impossible to use, even for other activities open in other tabs.

We cannot delete the whole content of that table, as it also holds adapter information for servers.

Does anyone have a better approach?

5 REPLIES

Aidan Lovegrov1
Tera Contributor

Hi Luiz, I hope you are well during these times.

 

I believe it would be best to create a Fix Script and execute it in the background, using the deleteMultiple() function of GlideRecord. For instance:

var gr = new GlideRecord('table_name'); // the table you are cleaning up
gr.addEncodedQuery('your_encoded_query'); // verify in the list view that this returns only the records you need
gr.deleteMultiple();

Ensure your query is correct, otherwise other records could be deleted!

Hopefully you do not have any workflows being triggered on those records; otherwise, cancelling those contexts becomes another issue.
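If workflows or business rules do fire on delete, you can suppress them for the cleanup run with setWorkflow(false). A minimal sketch, assuming the standard cmdb_ci_network_adapter table and the name/ip_address field names from the original question (verify both against your instance before running):

var gr = new GlideRecord('cmdb_ci_network_adapter'); // assumed table, replace with yours
gr.addEncodedQuery('name=NdisWan^ip_addressISEMPTY'); // assumed encoded query, verify in the list view first
gr.setWorkflow(false); // skip business rules and workflows during the delete
gr.deleteMultiple();

Note that setWorkflow(false) also skips any cascade logic you might actually want, so test against a small subset first.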

Philippe Casidy
Tera Guru

Hi Luiz,

Looks like you have already tried the best approaches, haven't you?

 

Do you want to delete all the imported data or only a subset?

Maybe you could request a restore of a backup taken before the import?

Just trying to list options here ^^

 

Philippe

 

Hi Philippe, 


Thanks for answering.

Yes, I'm afraid those are the best approaches so far.
Not sure if we would have a backup, since that data has been there for more than 2 years.

I'm leaning toward opening a HI case.

For now, the script is running in another browser while I continue to work in the main browser. 🙂

Thanks,

Matt102
Giga Guru

Hi,

Just a thought, which kind of assumes you have some time and are not still filling the table with records you want to delete.

In a background script, first use the GlideRecord setLimit() method to restrict each run to a reasonable chunk of rows.

Experiment with that subset of rows and see whether deleteRecord() or deleteMultiple() runs quicker.

Set the row-limited script up as a scheduled job and run it at whatever frequency best fits the performance impact and working hours.
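The chunked approach above could be sketched as a scheduled script execution like this; the table name, encoded query, and chunk size are assumptions to adapt to your instance:

var CHUNK = 10000; // tune against the performance impact you observe
var gr = new GlideRecord('cmdb_ci_network_adapter'); // assumed table, replace with yours
gr.addEncodedQuery('name=NdisWan^ip_addressISEMPTY'); // the query you verified in the list view
gr.setLimit(CHUNK); // only this many rows per run
gr.query();
var count = 0;
while (gr.next()) {
    gr.deleteRecord();
    count++;
}
gs.info('Deleted ' + count + ' records in this run');

Scheduled outside working hours, each execution removes at most CHUNK records, so the instance stays responsive; once the cleanup is done, subsequent runs simply find nothing to delete.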

HTH,

Matt