What would be the best way to manage large tables (8+ Million records)? What is a recommended limit to keep? Is table rotation a best practice to follow?

Jose Alva BCGov
Tera Contributor

We have a couple of tables in a scoped financial application that are growing at a rate of 400k records per quarter. Multiple users with various roles run reports and dashboards against these tables, and we are at the point where the reports and dashboards are taking too long to load. So we are looking for best practices for maintaining and managing large tables.

Thanks.

1 ACCEPTED SOLUTION

Sean Walters
Tera Expert

Hi Jose, 

I think there are a few key questions to understand:

How long is the data needed?

Does it need to be retrieved from ServiceNow?

An approach that may be useful (depending on the exact requirements) is setting up data archiving rules. 

Data archiving helps manage table size growth by getting old data out of the way. Each day, it moves data that is no longer needed from the primary tables to a set of archive tables.

The longer an instance runs, the more likely it is to accumulate data that is no longer relevant. For example, task records from two years ago are typically less relevant than currently active tasks. Old data may eventually cause performance issues by consuming system resources and slowing down queries and reports.

If you cannot delete this data because you need it for auditing or for historical purposes:
  1. Set up an archive rule that archives the data and removes it from immediate access to free up system resources.
  2. Set up a destroy rule to delete the data after a specified period.
  3. Configure the archive and destroy rule properties that control batch processing of the archive and destroy rules.
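Before turning on a rule like this, it can help to estimate how many records it would actually pick up. Below is a minimal sketch using a GlideAggregate count in a background script; the table name (x_yourco_finance_txn) and the two-year cutoff are placeholders, not values from your instance.

  // Background script: estimate how many records an archive rule would affect.
  // 'x_yourco_finance_txn' is a hypothetical table name; 730 days is an assumed retention period.
  var cutoff = gs.daysAgoStart(730); // start of the day, roughly two years ago

  var ga = new GlideAggregate('x_yourco_finance_txn');
  ga.addAggregate('COUNT');
  ga.addQuery('sys_created_on', '<', cutoff);
  ga.query();

  if (ga.next()) {
      gs.info('Records eligible for archiving: ' + ga.getAggregate('COUNT'));
  }

Running a count like this periodically and logging the result also gives you a feel for how quickly the archive set will grow.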


You can read more about this on this ServiceNow doc: 
https://docs.servicenow.com/en-US/bundle/sandiego-platform-administration/page/administer/database-rotation/concept/c_ArchiveData.html 

Additionally, on one of our projects we set up an integration that pulled these records into a data warehouse, so they could then be deleted from ServiceNow; going forward, anyone who needs the historical data can request it from the data team.
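If you go down the warehouse route, the extract is usually just a call to the standard ServiceNow Table API before the records are deleted. The following is only a rough warehouse-side sketch assuming Node 18+ (built-in fetch); the instance URL, table name, credentials, and date cutoff are all placeholders, and a real job would page through the data with sysparm_offset and use something stronger than basic auth.

  // Hypothetical warehouse-side extract via the ServiceNow Table API (Node 18+).
  async function pullBatch() {
      const instance = 'https://yourinstance.service-now.com';                   // placeholder instance
      const table = 'x_yourco_finance_txn';                                      // hypothetical table name
      const query = encodeURIComponent('sys_created_on<2022-01-01 00:00:00');    // example cutoff

      const url = instance + '/api/now/table/' + table
          + '?sysparm_query=' + query + '&sysparm_limit=1000';

      const res = await fetch(url, {
          headers: {
              'Accept': 'application/json',
              'Authorization': 'Basic ' + Buffer.from('extract.user:password').toString('base64')
          }
      });
      const body = await res.json();
      console.log('Pulled ' + body.result.length + ' records');
      // ...load body.result into the warehouse, then repeat with sysparm_offset for the next page.
  }

  pullBatch();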

Hope that helps.


7 REPLIES

Hello Maik, 

Thank you for your reply. 

We do need to load all of the data from the table, since different users with different levels of access need the data at different points in time. As Stephane commented, we only add new records (400K) every month as part of our month-end received-services process. We are using indexes to optimize data rendering and reports, but as I mentioned, we would like to know whether there are best practices specifically for managing large tables.

As Yousaf also posted, we are looking into table rotation, but since we haven't had any real experience with it, we would like to hear from the community whether there are any pitfalls to watch for.

What other information would you need in order to provide a better assessment and a better response to this posting?

 

Thanks, 


Rahul Priyadars
Giga Sage

Very good answers given by the experts. Just adding a useful KB article on the same topic.

https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0780216

Regards

RP