best practices for loading 100K+ CMDB records?

dataWrangler
Giga Expert

Greetings -


We're currently interrogating the source systems that will provide the data to populate the ServiceNow CMDB, and we estimate there will be 800K or so assets to track.

Any thoughts/caveats/recommended approaches/lessons learned/best practices for loading this quantity of data?

2 REPLIES

AnveshKumar M
Tera Sage

Hi @dataWrangler 

What is the source system?

1. If the source system provides a JDBC connector, consider using Data sources and transform maps.
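At its core, a transform map is a mapping from source columns to target CMDB fields. As a rough concept sketch (in Python rather than ServiceNow's transform scripting, with illustrative field names), the per-row transformation looks like this:

```python
# Map source-system column names to hypothetical CMDB field names.
FIELD_MAP = {
    "serial_no": "serial_number",
    "hostname": "name",
    "ip": "ip_address",
}

def transform(source_row: dict) -> dict:
    """Apply the field mapping to one staged row, the way a transform map
    maps an import set row onto a target table's fields."""
    return {target: source_row.get(src) for src, target in FIELD_MAP.items()}

row = {"serial_no": "SN-001", "hostname": "web01", "ip": "10.0.0.5"}
print(transform(row))
# {'serial_number': 'SN-001', 'name': 'web01', 'ip_address': '10.0.0.5'}
```

In the platform itself you would configure this mapping declaratively on the transform map record, adding transform scripts only where a field needs logic beyond a straight copy.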

 

2. If it supports a REST API, you can create a scripted Data Source and, again, use transform maps.
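The key pattern a scripted data source implements against a REST API is pagination: pull the source a page at a time instead of requesting all 800K records in one response. A minimal sketch of that pattern, with a stub standing in for the HTTP call (the endpoint shape and parameter names are assumptions, not a real source system's API):

```python
from typing import Dict, Iterator, List

# Stub standing in for the source system's REST API; a real scripted
# data source would make an HTTP request here with offset/limit (or
# cursor) query parameters.
FAKE_ASSETS = [{"asset_id": i, "name": f"asset-{i}"} for i in range(2500)]

def fetch_page(offset: int, limit: int) -> List[Dict]:
    """Return one page of records, as a paginated REST endpoint would."""
    return FAKE_ASSETS[offset:offset + limit]

def paged_records(page_size: int = 1000) -> Iterator[Dict]:
    """Pull records page by page until the source is exhausted, so the
    importer never holds the full payload in memory at once."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            break
        yield from page
        offset += page_size

total = sum(1 for _ in paged_records())
print(total)  # 2500
```

Each fetched page would then be staged into an import set for the transform maps to process.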

 

3. Consider importing in batches.
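Batching just means splitting the record stream into fixed-size chunks so each import set stays small enough to transform and commit quickly. A minimal sketch of the chunking itself (batch size chosen arbitrarily for illustration):

```python
from itertools import islice
from typing import Iterable, Iterator, List

def batches(records: Iterable[dict], batch_size: int) -> Iterator[List[dict]]:
    """Split a record stream into fixed-size batches; the last batch may
    be smaller than batch_size."""
    it = iter(records)
    while chunk := list(islice(it, batch_size)):
        yield chunk

rows = [{"id": i} for i in range(25)]
sizes = [len(b) for b in batches(rows, 10)]
print(sizes)  # [10, 10, 5]
```

Smaller batches also make failures cheaper: a bad batch can be fixed and re-run without repeating the whole load.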

 

4. Use multiple user accounts, because ServiceNow applies resource limitations at the user level too. With multiple user accounts you can run import jobs in parallel.
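The idea of spreading batches across several integration accounts can be sketched generically as a worker pool with one credential per worker (account names and the job function here are hypothetical stand-ins, not ServiceNow APIs):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical integration accounts; per-user resource limits mean each
# account can only drive so much work, so batches are spread across them.
ACCOUNTS = ["import.user1", "import.user2", "import.user3"]

def run_import_job(account: str, batch: list) -> int:
    """Stub for submitting one batch under one account; a real job would
    authenticate as `account` and post the batch to an import endpoint."""
    return len(batch)

job_batches = [[{"id": i} for i in range(j, j + 100)] for j in range(0, 600, 100)]

with ThreadPoolExecutor(max_workers=len(ACCOUNTS)) as pool:
    # Round-robin batches across accounts so no single user bears the
    # whole load.
    results = list(pool.map(
        run_import_job,
        (ACCOUNTS[i % len(ACCOUNTS)] for i in range(len(job_batches))),
        job_batches,
    ))

print(sum(results))  # 600
```

In practice you would also throttle per account and watch instance performance, since parallelism multiplies load on the target as well as the source.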

 

5. Search the ServiceNow Store to see if any connectors are available for your system.

These are a few from my experience.

 

Thanks,
Anvesh

Sumanth16
Kilo Patron

Hi @dataWrangler ,

 

The better approach would be an import via IntegrationHub ETL, but I have no experience with data volumes this large or how they could affect your instance.

 

For importing millions of records, I use another architectural approach, described in "How to import 4 million records in 3 hours".

 

If my answer has helped with your question, please mark it as the accepted solution and give it a thumbs up.

 

Thanks & Regards,

Sumanth Meda