
Deleting all records from large (1TB+) table

Brian Arndt1
Mega Expert

We have a table in our production instance with over 1 TB of data (1,118,168,054 rows). I'm looking for guidance on the best way to purge this table without affecting system performance. I don't believe there is any reason to retain these records, and no new records are being added to the table. Would archiving (and then destroying the archive) be the most effective approach, or would it take too long or cause performance issues?

1 ACCEPTED SOLUTION

Brian Arndt1
Mega Expert

Thanks, all, for the replies, but I think the correct answer is to submit a case with ServiceNow Support. They can truncate the table in seconds.


10 REPLIES

Manas Kandekar
Kilo Guru

Hi,

I think a scheduled script will be the best way to delete the records. Schedule it on non-working days, and limit it to a few thousand records at a time.
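As a rough sketch of that approach (the table name u_large_table and the batch sizes are placeholders, not from the original post), a Scheduled Script Execution could look something like this:

```javascript
// Sketch of a batched-delete scheduled job (ServiceNow server-side script).
// 'u_large_table' is a placeholder -- replace it with your actual table name.
var BATCH_SIZE = 2000; // records deleted per batch; tune for your instance

var gr = new GlideRecord('u_large_table');
gr.setLimit(BATCH_SIZE);      // only fetch one batch per run
gr.setWorkflow(false);        // skip business rules/engines on delete (optional)
gr.query();

while (gr.next()) {
    gr.deleteRecord();        // delete one record at a time within the batch
}
```

Each execution of the job removes at most one batch, so the transaction stays short; scheduling it to repeat lets it chip away at the table over time. Note that gr.deleteMultiple() would be faster per call, but its interaction with setLimit() has varied across releases, so the per-record loop above is the conservative choice.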


If my answer helped you in any way, please mark it as correct and helpful.

Kind regards,
Manas

Hi Manas,

Thanks for your reply. My only concern with this approach is that it might take a year to get through all the records 🙂

Archana Reddy2
Tera Guru

Hi Brian,

How about navigating to the "Tables" module, opening your table's record, and using "Delete All Records" there?

This won't require the whole list to be loaded.

Hope this helps.


Thanks,

Archana

Hi Archana,

Thanks for your reply. My concern with this approach is the performance hit, since the number of records is so high.
