07-08-2022 05:01 PM
We have a couple of tables which are growing at a rate of 400k records per quarter in a scoped financial application. Multiple users with various roles run reports and dashboards against these tables. We are at the point where the reports and dashboards are taking too long to load, so we are looking for best practices for large table maintenance and management.
Thanks.
- Labels: Best Practices
07-13-2022 03:34 PM
Hello Maik,
Thank you for your reply.
We do need to load all the data from the table, since different users with different levels of access need the data at different points in time. As Stephane commented, we only add new records (400K) every month as part of our month-end received services process. We are using indexes to optimize data rendering and reports, but as I mentioned, we would like to know if there are best practices specifically for managing large tables.
As Yousaf posted, we are also looking into table rotation, but since we haven't had any experience with it, we would like to hear from the community whether there are any pitfalls to watch for.
What other information would you need to provide a better assessment and response to this posting?
Thanks,
07-18-2022 02:02 AM
Hi Jose,
I think there are a few key questions to understand first:
- How long is the data needed?
- Does it need to be retrieved from ServiceNow?
An approach that may be useful (depending on the exact requirements) is setting up data archiving rules.
Data archiving involves managing table size growth by archiving old data: each day, data that is no longer needed is moved from the primary tables to a set of archive tables.
The longer an instance runs, the more likely it is to accumulate data that is no longer relevant. For example, task records from two years ago are typically less relevant than currently active tasks. Old data may eventually cause performance issues by consuming system resources and slowing down queries and reports.
- Set up an archive rule that archives the data and removes it from immediate access to free up system resources.
- Set up a destroy rule to delete the data after a specified period.
- Configure the archive and destroy rule properties that control batch processing of the archive and destroy rules.
You can read more about this in the ServiceNow docs:
https://docs.servicenow.com/en-US/bundle/sandiego-platform-administration/page/administer/database-rotation/concept/c_ArchiveData.html
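As a quick sanity check before activating an archive rule, a background script along these lines can estimate how many records the rule's condition would move. This is only a sketch: the table name and the two-year cutoff are placeholders rather than anything from this thread, so substitute your own table and the exact condition you plan to put on the rule.

```javascript
// Background script sketch: estimate how many records an archive rule's
// condition would pick up before the rule is activated.
// 'u_received_services' and the 730-day cutoff are placeholders.
var tableName = 'u_received_services';

var ga = new GlideAggregate(tableName);
// Same encoded query you would use as the archive rule condition:
// records created more than roughly two years (730 days) ago.
ga.addEncodedQuery('sys_created_on<javascript:gs.daysAgoStart(730)');
ga.addAggregate('COUNT');
ga.query();

if (ga.next()) {
    gs.info('Records that would be archived from ' + tableName + ': ' + ga.getAggregate('COUNT'));
}
```

You can run something like this from Scripts - Background on a sub-production instance first, to get a feel for the volumes the archive and destroy rules will have to work through.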
Additionally, what we did on one of our projects was build an integration that pulled the data into a data warehouse; the records could then be deleted from ServiceNow, and going forward you can request the data from the data team.
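To make the warehouse idea a bit more concrete: the extract can be done in pages through the standard Table API, with deletion in ServiceNow happening only after the warehouse load is confirmed. A rough sketch of the pull side is below (Node 18+ for the global fetch; the instance URL, credentials, and table name are placeholders, and error handling is omitted):

```javascript
// Sketch of a paged extract via the ServiceNow Table API (Node 18+).
// INSTANCE, the credentials, and TABLE are placeholders -- supply your own.
const INSTANCE = 'https://yourinstance.service-now.com';
const AUTH = 'Basic ' + Buffer.from('integration.user:********').toString('base64');
const TABLE = 'u_received_services'; // placeholder table name
const PAGE_SIZE = 1000;

async function extractAll() {
  let offset = 0;
  let page;
  do {
    const url = INSTANCE + '/api/now/table/' + TABLE +
      '?sysparm_limit=' + PAGE_SIZE +
      '&sysparm_offset=' + offset +
      '&sysparm_query=ORDERBYsys_created_on';
    const res = await fetch(url, { headers: { Authorization: AUTH, Accept: 'application/json' } });
    page = (await res.json()).result;
    // Hand each page to the data warehouse loader here.
    console.log('Fetched ' + page.length + ' records at offset ' + offset);
    offset += PAGE_SIZE;
  } while (page.length === PAGE_SIZE);
}

extractAll();
```

Deleting the exported records from ServiceNow would then be a separate, controlled step once the warehouse load has been verified.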
Hope that helps.

07-18-2022 09:21 PM
Very good answers given by the experts. Just adding a useful KB article on the same topic.
https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0780216
Regards
RP