best practices for loading 100K+ CMDB records?
08-07-2024 09:06 AM
Greetings -
We're currently interrogating the source systems providing the data that will populate the ServiceNow CMDB and estimate that there will be 800K or so assets to track.
Any thoughts/caveats/recommended approaches/lessons learned/best practices for loading this quantity of data?
08-07-2024 09:30 AM
What is the source system?
1. If the source system provides a JDBC connector, consider using Data Sources and transform maps.
2. If it exposes a REST API, you can still create a scripted Data Source and feed the same transform maps (see the sketch after this list).
3. Import in batches rather than in one large load.
4. Use multiple user accounts. ServiceNow applies resource limits at the user level too, so spreading the work across accounts lets you run import jobs in parallel.
5. Search the ServiceNow Store for connectors that already exist for your source system.
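To illustrate points 2 and 3 together, here is a minimal sketch of a scripted Data Source (type "Custom (Load by Script)") that pages through a source REST API and stages rows in batches. The endpoint URL, paging scheme, and staging field names are all hypothetical; adapt them to whatever your source system actually exposes:

```javascript
// Minimal sketch of a "Custom (Load by Script)" Data Source script.
// The endpoint, paging parameters, and u_ field names below are placeholders.
(function loadData(import_set_table) {
    var PAGE_SIZE = 1000; // pull in batches instead of one huge request
    var offset = 0;
    var more = true;

    while (more) {
        var request = new sn_ws.RESTMessageV2();
        request.setHttpMethod('GET');
        // Hypothetical source endpoint with offset-based paging
        request.setEndpoint('https://source.example.com/api/assets?limit=' + PAGE_SIZE + '&offset=' + offset);
        request.setRequestHeader('Accept', 'application/json');

        var response = request.execute();
        if (response.getStatusCode() != 200) {
            gs.error('Asset pull failed at offset ' + offset + ': HTTP ' + response.getStatusCode());
            break;
        }

        var rows = JSON.parse(response.getBody()); // assumes the API returns a JSON array
        for (var i = 0; i < rows.length; i++) {
            // Each addRow() stages one record in the import set table;
            // the transform map then moves it into the target CMDB table.
            import_set_table.addRow({
                u_name: rows[i].name,
                u_serial_number: rows[i].serial,
                u_model: rows[i].model
            });
        }

        more = (rows.length == PAGE_SIZE); // a short page means we reached the end
        offset += PAGE_SIZE;
    }
})(import_set_table);
```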
These are a few lessons from my experience.
Anvesh
08-07-2024 10:15 AM - edited 08-07-2024 11:02 AM
Hi @dataWrangler,
A better approach would be an import via IntegrationHub ETL, though I have no experience with data volumes this large or with how such a load could affect your instance.
For importing millions of records, I use a different architectural approach, described in How to import 4 million records in 3 hours.
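I can't vouch for the exact architecture in that article, but as a general illustration of the batched, parallel style of pushing records into an import set staging table from outside the instance, here is a sketch against the Import Set API. The instance URL, staging table name, credentials, batch size, and concurrency are placeholders, and the insertMultiple endpoint may not be available on older releases:

```javascript
// Rough illustration (not necessarily the article's architecture) of loading a
// ServiceNow import set staging table in batched, parallel requests via the
// Import Set API. Requires Node.js 18+ for the global fetch.
const INSTANCE = 'https://yourinstance.service-now.com'; // placeholder instance
const STAGING_TABLE = 'u_cmdb_asset_load';               // hypothetical staging table
const AUTH = 'Basic ' + Buffer.from('import.user:password').toString('base64');
const BATCH_SIZE = 500;
const CONCURRENCY = 4; // batches in flight at once

// insertMultiple stages a whole batch in one call; on releases without it,
// fall back to one POST per record at /api/now/import/<table>.
async function sendBatch(records) {
    const res = await fetch(`${INSTANCE}/api/now/import/${STAGING_TABLE}/insertMultiple`, {
        method: 'POST',
        headers: { 'Authorization': AUTH, 'Content-Type': 'application/json' },
        body: JSON.stringify({ records: records })
    });
    if (!res.ok) throw new Error('Batch failed with HTTP ' + res.status);
}

async function loadAll(allRecords) {
    // Slice the payload into batches, then send a limited number concurrently.
    const batches = [];
    for (let i = 0; i < allRecords.length; i += BATCH_SIZE) {
        batches.push(allRecords.slice(i, i + BATCH_SIZE));
    }
    for (let i = 0; i < batches.length; i += CONCURRENCY) {
        await Promise.all(batches.slice(i, i + CONCURRENCY).map(sendBatch));
    }
}
```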
If my answer has helped with your question, please mark it as the accepted solution and give it a thumbs up.
Thanks & Regards,
Sumanth Meda