
Best Practices for Fetching Archived Records

gangaram_ka
Mega Contributor

Hi Community,

We have a requirement to fetch archived records that are more than one year old. Before implementing this, I would like to understand:

  1. What are the best practices for retrieving archived records in ServiceNow?
  2. Will querying or accessing these archive tables impact system performance?
  3. Is there a recommended approach (reports, scheduled jobs, direct table query, etc.) to safely retrieve older archived data?

Any guidance or recommendations would be greatly appreciated.

 

Thanks,
Gangaram Kasture

2 REPLIES

Tanushree Maiti
Tera Sage

Hi @gangaram_ka 

 

Please find answers below:

1. Archiving best practices: https://www.perspectium.com/blog/servicenow-archiving-best-practices/

  Also refer to: https://www.servicenow.com/community/developer-blog/optimizing-servicenow-data-lifecycle-archiving-r...

 

2. As per this post: https://www.servicenow.com/community/developer-forum/data-archiving-performance-impact/m-p/1996860

By default an archive rule follows these processing rules:

 

  • Archives 100 records for each batch job
  • Sleeps 1 second between batch jobs
  • Runs 10 batch jobs in an archive run (every hour)
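Under those defaults, the maximum archive throughput is a useful sanity check when estimating how long a backlog will take to drain. This is a rough sketch only; the batch size, batch count, and interval are all configurable on a real instance:

```javascript
// Back-of-the-envelope throughput under the default archive rule settings
// quoted above (100 records/batch, 10 batches/run, hourly runs). These are
// defaults only; real instances can be configured differently.
var recordsPerBatch = 100; // records archived per batch job
var batchesPerRun = 10;    // batch jobs per archive run
var runsPerDay = 24;       // one archive run per hour

var recordsPerRun = recordsPerBatch * batchesPerRun; // 1,000 per run
var recordsPerDay = recordsPerRun * runsPerDay;      // 24,000 per day

console.log(recordsPerRun + ' records/run, ' + recordsPerDay + ' records/day');
```

So a backlog of, say, 500,000 records would take on the order of three weeks to archive at the default pace.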

 

If you change the job interval from 1 hour to a lower value, the job may sometimes not finish within that window and will therefore skip a run; and if the archive threads are then picked up on another node, it can cause problems. Check this KB article:

 

https://hi.service-now.com/kb_view.do?sys_kb_id=116b858e0ffd424098f7982be1050efe



You still have the option of running 10 batch jobs every hour at 100 records per batch job (1,000 records per run). The job's performance depends on several factors, such as table size, indexes, query efficiency, and instance configuration. So before moving forward, I would recommend cloning a sub-prod instance from prod and checking the performance of the archive job there.

 

3.  Check this: https://www.servicenow.com/community/developer-forum/reporting-on-archived-tables/td-p/1961315/page/...

 

 

You can find overall data-archiving details here:

https://www.servicenow.com/community/servicenow-ai-platform-articles/data-archiving-walkthrough/ta-p...

 

Please mark this response as Helpful & Accept it as solution if it assisted you with your question.
Regards
Tanushree Maiti
ServiceNow Technical Architect

vaishali231
Tera Guru

Hey @gangaram_ka,

Handling archived data properly is important to avoid unexpected performance issues.
Best practices for retrieving archived records
Archived records are stored in separate tables (e.g., ar_incident, ar_case), so you’ll need to query those tables directly.

Always use well-defined filters (especially on date fields like sys_created_on or sys_archived) to avoid full table scans.

Limit the number of fields returned and avoid unnecessary data processing.

 

For repeated use cases, consider creating a dedicated module or report instead of ad-hoc queries.
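The filter-and-limit advice above can be sketched as a server-side script. This is an illustration only: the table name `ar_incident` and the one-year cutoff follow this reply, and the small stub at the top merely stands in for the platform's `GlideRecord`/`gs` objects so the snippet is self-contained outside an instance (on ServiceNow you would delete the stub and use the real API):

```javascript
// --- Minimal local stand-ins for ServiceNow's server API, included only so
// --- this sketch runs outside an instance. On ServiceNow, delete this
// --- section: GlideRecord and gs are provided by the platform.
var gs = {
  daysAgo: function (n) { // date/time string n days in the past
    return new Date(Date.now() - n * 86400000)
      .toISOString().slice(0, 19).replace('T', ' ');
  },
  info: function (msg) { console.log(msg); }
};
function GlideRecord(table) {
  this._rows = (GlideRecord.demo[table] || []).slice();
  this._conds = [];
  this._limit = Infinity;
  this._idx = -1;
}
GlideRecord.demo = {}; // demo data keyed by table name
GlideRecord.prototype.addQuery = function (field, op, value) {
  this._conds.push(function (row) {
    return op === '<' ? row[field] < value : row[field] === value;
  });
};
GlideRecord.prototype.setLimit = function (n) { this._limit = n; };
GlideRecord.prototype.query = function () {
  var conds = this._conds;
  this._rows = this._rows.filter(function (r) {
    return conds.every(function (c) { return c(r); });
  }).slice(0, this._limit);
};
GlideRecord.prototype.next = function () { return ++this._idx < this._rows.length; };
GlideRecord.prototype.getValue = function (f) { return this._rows[this._idx][f]; };

// --- The actual pattern: always filter on a date field and cap the result ---
GlideRecord.demo.ar_incident = [
  { number: 'INC0010001', sys_created_on: '2020-01-15 09:00:00' },
  { number: 'INC0010002', sys_created_on: '2099-01-15 09:00:00' } // not yet 1 year old
];
var ar = new GlideRecord('ar_incident');
ar.addQuery('sys_created_on', '<', gs.daysAgo(365)); // only records older than 1 year
ar.setLimit(1000);                                   // never run an unbounded query
ar.query();
var found = [];
while (ar.next()) {
  found.push(ar.getValue('number')); // read only the fields you need
}
gs.info('Fetched ' + found.length + ' archived incident(s)');
```

The key points are the date filter (`addQuery('sys_created_on', '<', gs.daysAgo(365))`) and the hard cap (`setLimit`), which together prevent full scans of a potentially huge archive table.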

 

Performance considerations
Accessing archive tables does not directly impact active table performance, since they are stored separately.

However, archive tables can be very large, so:

Unfiltered queries can lead to slow execution

Large reports may consume node resources

Avoid using archive data in real-time transactions (forms, client scripts, portals, VA).

 

Recommended approaches
1. Scheduled Jobs (Preferred for large data)
Run during off-peak hours

Fetch only required records

Store results in a staging table or export if needed


2. Reporting on archive tables
Suitable for audit or occasional access

Ensure filters (e.g., older than 1 year) are applied


3. Background / Fix scripts
Good for one-time data retrieval or analysis

Test in sub-prod due to potential volume impact

 

Things to avoid
Querying archive tables in synchronous flows or user-facing transactions

Running unfiltered GlideRecord queries on ar_* tables

Mixing active and archived data in a single real-time query

 

Additional recommendations
Verify indexing on commonly queried fields (especially date columns)

Use GlideAggregate for counts instead of looping through records

Align with your data retention and archiving strategy before querying
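To make the GlideAggregate point concrete, here is a minimal sketch under the same caveat as above: the stub at the top only stands in for the platform class so the snippet is self-contained, and `ar_incident` is just an illustrative table name:

```javascript
// --- Minimal stand-in for ServiceNow's GlideAggregate, included only so the
// --- sketch runs outside an instance; on ServiceNow the platform provides it.
function GlideAggregate(table) {
  this._rows = GlideAggregate.demo[table] || [];
  this._done = false;
  this._count = 0;
}
GlideAggregate.demo = {};
GlideAggregate.prototype.addAggregate = function (type) { this._type = type; };
GlideAggregate.prototype.query = function () { this._count = this._rows.length; };
GlideAggregate.prototype.next = function () {
  if (this._done) return false;
  this._done = true;
  return true;
};
GlideAggregate.prototype.getAggregate = function (type) { return String(this._count); };

// --- The pattern: let the database count archived rows instead of looping
// --- a GlideRecord over them. In real use, add a date filter before query().
GlideAggregate.demo.ar_incident = [{}, {}, {}]; // three demo rows
var agg = new GlideAggregate('ar_incident');
agg.addAggregate('COUNT');
agg.query();
var total = 0;
if (agg.next()) {
  total = parseInt(agg.getAggregate('COUNT'), 10);
}
console.log(total + ' archived record(s)');
```

Counting server-side avoids pulling every archived row across the wire just to measure volume.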


*************************************************************************************************************************************

If this response helps, please mark it as an Accepted Solution and Helpful.
Doing so helps others in the community and encourages me to keep contributing.
Regards
Vaishali Singh