Tuesday
I'm exploring efficient ways to transfer millions of rows of table data to an external system and would appreciate your insights. What approaches have you found effective for handling such large-scale data transfers without compromising performance? I'm currently considering REST APIs as a potential solution, but I'm open to other suggestions. If possible, please share your experiences or recommendations in detail.
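For context, the sketch below is roughly what I have in mind for the REST route: paging through the standard Table API with sysparm_offset/sysparm_limit so no single request has to carry millions of rows. The instance URL, table name, field list, and credentials are all placeholders.

```python
import requests

# Placeholder values -- swap in your own instance, table, and credentials.
INSTANCE = "https://example.service-now.com"
TABLE = "incident"
AUTH = ("api_user", "api_password")
PAGE_SIZE = 10000  # rows per request; tune to what the instance tolerates

def fetch_all_rows():
    """Page through the Table API using sysparm_offset/sysparm_limit."""
    offset = 0
    while True:
        resp = requests.get(
            f"{INSTANCE}/api/now/table/{TABLE}",
            params={
                "sysparm_limit": PAGE_SIZE,
                "sysparm_offset": offset,
                "sysparm_fields": "sys_id,number,short_description",
                "sysparm_exclude_reference_link": "true",
            },
            auth=AUTH,
            headers={"Accept": "application/json"},
            timeout=120,
        )
        resp.raise_for_status()
        rows = resp.json()["result"]
        if not rows:
            break  # past the last page
        yield from rows
        offset += PAGE_SIZE

for row in fetch_all_rows():
    pass  # hand each row off to the external system here
```

My worry is whether this stays performant at the millions-of-rows scale, which is why I'm open to other approaches.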
Thanks in advance!
Solved!
Tuesday
As mentioned, in our integration the third-party system sent the data to a MID Server, and we processed the file before importing it into the ServiceNow instance. We did this manually by splitting the CSV into 5 separate files, loading them into import set tables, and then transforming them into the target tables. This approach was for the initial data load; delta loads were handled by the MID Server sending data to the instance on a periodic basis, with records inserted or updated as needed.
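The splitting itself was manual in our case, but it is easy to script. A minimal sketch of the idea, assuming a plain CSV with a header row (the file name and chunk size here are illustrative, not what we actually used):

```python
import csv
import itertools

SOURCE = "full_export.csv"   # hypothetical path to the large export
ROWS_PER_FILE = 1_000_000    # chunk size; pick what your import sets handle well

def split_csv(source: str, rows_per_file: int) -> None:
    """Stream a large CSV into numbered chunk files, repeating the header
    in each, without loading the whole file into memory."""
    stem = source.rsplit(".", 1)[0]
    with open(source, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for i in itertools.count(1):
            chunk = list(itertools.islice(reader, rows_per_file))
            if not chunk:
                break
            with open(f"{stem}_part{i}.csv", "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(chunk)

split_csv(SOURCE, ROWS_PER_FILE)
```

Each chunk file keeps the original header, so it can be loaded into an import set table independently of the others.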
If you do not have an Integration Hub license, I would recommend going with an export set, as it is a tried and tested method and would work fine.
I hope this detailed information helps. If my response helped guide you or answer your query, please mark it helpful and accept the solution.
Thanks,
Bhuvan
Tuesday
Thanks Bhuvan, this is really insightful. It’s good to know that you’ve successfully handled ~4.5M records using the Import Set/MID Server approach.
Is the CSV splitting something that has to be done manually, or did you automate it with scripts/tools? Just wondering if there’s a recommended way to make that part less complex when dealing with millions of records.
Regarding the Data Stream option you mentioned: I believe it requires a separate IntegrationHub license.
Wednesday
Ok, thanks for the response. Much appreciated. 🙂