08-10-2025 05:35 PM
I'm working on cleaning up 2.5M records from a CMDB table and have tried multiple approaches, such as a background script, a Data Manager policy, the table cleanup option, and direct deletion from the table list view, but each has performance impacts or takes too long. I also tested an archive rule, but its run interval is fixed at one hour, and changing the interval impacts other archive rules. So my question is: what is the most efficient and safe way to delete such a large volume (3.0M) of CMDB records in a short time, without impacting instance performance or other scheduled jobs?
08-11-2025 07:38 AM
Hi @VenkataSaiM,
Here are some best practices that might help you.
1. Key Principles Before Deletion
* Never bulk-delete CMDB data directly in one go on a production instance.
* Always delete in batches to avoid locking DB threads and causing node slowdowns.
* Maintain referential integrity (relationships, child tables, history).
* Consider archiving instead of hard deleting if regulatory/audit requirements exist.
2. Best Practice Approach
Step 1: Identify and Isolate the Dataset
* Use a saved filter or scripted query to target only the CIs you truly want to delete.
* Ensure they do not have active relationships (check cmdb_rel_ci table).
* Check for dependent records like:
  * Tasks (task.cmdb_ci)
  * Discovery status records
  * Child CI classes
* Use cmdb_ci_delete API or CMDB Util methods for safer cleanup.
Step 2: Use Chunked Deletes (Scripted)
Instead of deleting millions of records in one transaction, batch them in chunks of, say, 5,000 to 10,000 per execution. Example background script approach:
var gr = new GlideRecord('cmdb_ci_computer'); // example class
gr.addQuery('install_status', '7'); // retired, or your own condition
gr.setLimit(5000); // chunk size per execution
gr.query();
while (gr.next()) {
    gr.deleteRecord();
}
Tips:
* Run in off-peak hours.
* Repeat until all records are gone.
* Monitor node performance via Node Log Stats.
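The chunked pattern above can be modelled outside the platform to reason about its control flow. Below is a hypothetical plain-JavaScript sketch, not Glide API code: the array stands in for the table, the filter callback for the GlideRecord query, and `splice` for `deleteRecord()`.

```javascript
// Runtime-agnostic sketch of chunked deletion: repeatedly fetch at most
// batchSize matching records and delete them, until nothing matches.
// In ServiceNow, fetching would be a GlideRecord query with setLimit(),
// and the per-record delete would be gr.deleteRecord().
function deleteInChunks(store, matches, batchSize) {
  var runs = 0;
  while (true) {
    // fetch at most batchSize matching records (simulates setLimit)
    var batch = store.filter(matches).slice(0, batchSize);
    if (batch.length === 0) break; // nothing left to delete
    for (var i = 0; i < batch.length; i++) {
      store.splice(store.indexOf(batch[i]), 1); // simulates deleteRecord()
    }
    runs++;
  }
  return runs; // number of executions the cleanup needed
}
```

For example, 7 matching records with a batch size of 3 take 3 runs (3 + 3 + 1), and non-matching records are untouched. The same convergence argument applies to the real script: each off-peak execution shrinks the remaining set until the query returns empty.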
Step 3: Use the deleteMultiple() API
If you can safely build a GlideRecord query:
var gr = new GlideRecord('cmdb_ci_computer');
gr.addQuery('install_status', '7');
gr.deleteMultiple();
This is faster than looping through deleteRecord() because it's a bulk SQL operation, but be careful: it bypasses some business rules and triggers, so test first.
Step 4: CMDB Data Manager Cleanup Policies
* If possible, use CMDB Data Manager to set retirement and cleanup rules.
* Set batch size and execution window.
* Schedule over multiple days to avoid instance slowdowns.
Step 5: Offload to Sub-Production First
* Clone Prod, run the delete in sub-prod, and measure runtime and performance impact.
* Adjust batch sizes based on results before doing in Prod.
Step 6: If Archiving Is an Option
* Create a custom archive rule just for this table.
* Run in parallel with other archive rules by isolating it into a separate schedule.
* This preserves historical data but removes load from the CMDB.
3. Additional Tips for Safe High-Volume Deletes
* Disable index updates temporarily (if deleting a massive subset) and re-enable them afterwards.
* Disable auditing on the table during cleanup (saves DB writes).
* Use Maintenance Window to avoid conflicting with Discovery or IRE jobs.
* Delete relationships first, then the CI records; this prevents orphans.
4. Recommended Sequence
1. Identify CIs and confirm no dependencies
2. Delete related records from cmdb_rel_ci
3. Batch-delete CI records via chunked script or deleteMultiple()
4. Rebuild indexes after completion
5. Run CMDB Health to confirm no orphaned references
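As a sanity check on the ordering in steps 2 and 3, here is a hypothetical plain-JavaScript model of deleting relationship rows before the CIs they reference. Plain arrays stand in for cmdb_ci and cmdb_rel_ci; this is not Glide API code, just an illustration of why the order matters.

```javascript
// Sketch: remove cmdb_rel_ci rows that point at the target CIs first,
// then remove the CIs themselves, so no surviving relationship row
// references a deleted sys_id (i.e., no orphans).
function cleanupCis(cis, rels, targetIds) {
  var targets = {};
  targetIds.forEach(function (id) { targets[id] = true; });
  // step 2: drop relationships where either end is a target CI
  var keptRels = rels.filter(function (r) {
    return !targets[r.parent] && !targets[r.child];
  });
  // step 3: drop the CI records themselves
  var keptCis = cis.filter(function (c) { return !targets[c.sys_id]; });
  return { cis: keptCis, rels: keptRels };
}
```

Doing step 3 before step 2 would leave relationship rows pointing at sys_ids that no longer exist, which is exactly what the CMDB Health check in step 5 would then flag.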
Please appreciate the efforts of community contributors by marking the appropriate response as Helpful or Accept as Solution; this helps other community users find the correct solution in future.
Thank You
AJ - TechTrek with AJ - ITOM Trainer
LinkedIn:- https://www.linkedin.com/in/ajay-kumar-66a91385/
YouTube:- https://www.youtube.com/@learnitomwithaj
Topmate:- https://topmate.io/aj_techtrekwithaj (Connect for 1-1 Session)
ServiceNow Community MVP 2025
08-10-2025 07:34 PM
Read this for mass deletion best practices,
https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0717791
Thanks,
Bhuvan
08-10-2025 07:39 PM
Hi @VenkataSaiM,
A scheduled job is a good option, but don't try to delete all records in one go; instead, create batches and delete them. Run the job periodically until all batches are deleted. Since the data is deleted in chunks, it will not impact system performance. Here is dummy code: you can put this script in a scheduled job and tweak it for your environment.
var batchSize = 1000; // define batch size
var gr = new GlideRecord('your_table_name');
gr.addEncodedQuery('your_encoded_query'); // define your deletion criteria
gr.setLimit(batchSize); // fetch at most one batch per run
gr.query();
while (gr.next()) {
    gr.deleteRecord(); // deletes the current record and triggers business rules
}
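Before scheduling such a job, it is worth estimating how long the cleanup will take for a given batch size and interval. A small back-of-the-envelope helper (plain JavaScript; the numbers below are assumptions, not measurements):

```javascript
// Estimate how many scheduled-job executions a cleanup needs and the
// wall-clock time if the job fires at a fixed interval.
function cleanupPlan(totalRecords, batchSize, intervalMinutes) {
  var executions = Math.ceil(totalRecords / batchSize);
  return {
    executions: executions,
    hours: (executions * intervalMinutes) / 60
  };
}
```

For example, 2.5M records at 1,000 per run every 5 minutes is 2,500 executions, roughly 208 hours (about 8.7 days), whereas 10,000 per run needs only 250 executions (about 21 hours). This is why batch size and interval should be tuned in sub-production before committing to a schedule.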
If my answer helped you in any way, please mark it as Accepted Solution.
Regards,
Nikhil Bajaj
08-10-2025 07:46 PM
Also, have a look at the CMDB Data Manager policy option for deletion: when more records are marked for deletion, it automatically creates batches of at most 10,000 records, and it is an optimized way of cleaning up bulk data.
Thanks,
Bhuvan
08-10-2025 09:57 PM
The ServiceNow-recommended approach is to use a low-code/no-code method.
You should be able to achieve this via a Data Management job.
Mass data deletion in ServiceNow using Data Management involves creating and executing a delete job. This process allows for the removal of a large number of records without scripting, offering features like previewing changes and scheduling.
Please follow the steps here:
Please try in lower environments first and do a proper impact analysis.
Please consider marking my reply as Helpful and/or Accept Solution, if applicable. Thank you!
08-10-2025 10:13 PM
Hi @VenkataSaiM
Deleting 3.5M records is not easy, and there's a 100% chance it will impact performance. It may also affect any records where these entries are referenced. You can use ServiceNow's Data Management feature, a background script, or a flow, but run the deletion in batches. Don't delete all 3.5M records in one go; instead, break the work down by table, record type, or condition, and then proceed accordingly.
If my response proves useful, please mark it as Accept as Solution and Helpful. This benefits both the community and me.
Regards
Dr. Atul G. - Learn N Grow Together
ServiceNow Techno - Functional Trainer
LinkedIn: https://www.linkedin.com/in/dratulgrover
YouTube: https://www.youtube.com/@LearnNGrowTogetherwithAtulG
Topmate: https://topmate.io/atul_grover_lng [ Connect for 1-1 Session]
****************************************************************************************************************