10-11-2024 05:48 AM
Hello all,
I recently came across the incident activity log table, which has nearly 60M records. Since this data is not required and keeps growing, we plan to retain only three years of data: everything older than three years has to be deleted, and the table should then be maintained with only the last three years going forward.
For the ongoing maintenance we can use the Table Cleaner, but I am not sure whether the Table Cleaner can also handle the first-time delete without impacting performance.
What is the best practice to implement here?
10-11-2024 06:27 AM
You can do it via scheduled jobs, scheduled on weekends and/or after business hours, and in smaller batches, e.g. setLimit(100000);
Also, consider this: Data archiving (ServiceNow documentation).
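As a rough sketch of such a scheduled job (a Scheduled Script Execution; the table name and cutoff are placeholders, adjust them to your actual activity log table and retention rule):

// Run on weekends / after business hours; deletes at most one batch per execution.
var BATCH_SIZE = 100000;
var cutoff = new GlideDateTime();
cutoff.addYearsUTC(-3); // keep only the last 3 years

var gr = new GlideRecord('your_activity_log_table'); // placeholder table name
gr.addQuery('sys_created_on', '<', cutoff);
gr.setLimit(BATCH_SIZE);
gr.setWorkflow(false); // optionally skip business rules/engines during cleanup
gr.query();
while (gr.next()) {
    gr.deleteRecord(); // row-by-row deletes keep each transaction small
}

Deleting row by row is slower than deleteMultiple(), but the GlideRecord documentation advises against combining deleteMultiple() with setLimit() on large tables, so a bounded loop is the safer first-time approach.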
/* If my response wasn’t a total disaster ↙️ ⭐ drop a Kudos or Accept as Solution ✅ ↘️ Cheers! */
10-11-2024 11:30 AM - edited 10-11-2024 11:40 AM
If it is a regular table, then just add a Table Cleaner job to handle the deletion. The Table Cleaner runs once an hour and deletes data for 20 minutes; I have never had any issues, even with larger tables. You can of course set the filter condition to start from a period further back in time to keep the amount of deletable data smaller, e.g. sys_created_on relative before xx days, and then update the condition as the deletes proceed. When I need to delete a large amount for the first time, I usually run a GlideAggregate trend analysis by year-month, so I know how to build the filter conditions if I want to play it safe and delete only about 100K per round.
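A rough sketch of that year-month analysis, run as a background script (the table name is a placeholder; point it at your actual activity log table):

// Count records per month so you can size the cleanup filters and batches.
var ga = new GlideAggregate('your_activity_log_table'); // placeholder table name
ga.addTrend('sys_created_on', 'Month');
ga.addAggregate('COUNT');
ga.query();
while (ga.next()) {
    // With addTrend(), 'timeref' holds the trend bucket (the month)
    gs.info(ga.getValue('timeref') + ': ' + ga.getAggregate('COUNT'));
}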
BR, Jan