Bulk data transfer to External System

Jesufair J
Tera Contributor

I'm exploring efficient ways to transfer millions of rows of table data to an external system and would appreciate your insights. What approaches have you found effective for handling such large-scale data transfers without compromising performance? I'm currently considering REST APIs as a potential solution, but I'm open to other suggestions. If possible, please share your experiences or recommendations in detail.

Thanks in advance!

9 REPLIES

Thanks Palani.

I agree that calling REST APIs for millions of records could create performance issues, and adding wait times/batching is a good safeguard. Have you tried this approach yourself? Does it hold up for data sets of that size?
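For context, something like the sketch below is roughly what I have in mind on the consuming side: pulling pages from the Table API with an offset/limit and pausing between batches. The instance URL, credentials, table name, and batch size are placeholders I made up for illustration, and this is an untested sketch rather than a proven approach at multi-million-row scale.

```python
import time
import requests

# Placeholder values -- swap in your own instance, credentials, and table.
INSTANCE = "https://your-instance.service-now.com"
TABLE = "u_my_large_table"
AUTH = ("integration.user", "password")

BATCH_SIZE = 1000      # rows per request
PAUSE_SECONDS = 2      # wait time between batches to ease load on the instance

def pull_all_rows():
    """Yield every row of the table, one paginated request at a time."""
    offset = 0
    while True:
        resp = requests.get(
            f"{INSTANCE}/api/now/table/{TABLE}",
            auth=AUTH,
            params={
                "sysparm_limit": BATCH_SIZE,
                "sysparm_offset": offset,
                "sysparm_exclude_reference_link": "true",
            },
            headers={"Accept": "application/json"},
            timeout=120,
        )
        resp.raise_for_status()
        rows = resp.json().get("result", [])
        if not rows:
            break                      # no more records to fetch
        yield from rows
        offset += BATCH_SIZE
        time.sleep(PAUSE_SECONDS)      # throttle so the instance isn't hammered

if __name__ == "__main__":
    total = sum(1 for _ in pull_all_rows())
    print(f"Pulled {total} rows")
```

My other worry is that offset-based pagination tends to slow down at very deep offsets, so a cursor on sys_created_on or sys_id might be the safer pattern at this volume.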

I’ll also explore the Instance Data Replication option you mentioned to see if it fits our use case. Feel free to share any resources if you have them handy. 🙂

Thanks Rafael. One quick question: is XML export/import practical for millions of rows, or is it generally better suited for smaller datasets and configuration data? My understanding is that ServiceNow can run into memory and performance issues when exporting or importing multi-million-row XML files, and that XML export/import is primarily used for configuration data, updates, or smaller datasets. Is that not the case?

Bhuvan
Kilo Patron

@Jesufair J

We have done an import of ~4.5 million records from a third-party system into ServiceNow using an import set via MID Server. An export set would work similarly: you export the file from ServiceNow to the MID Server, and the third party then picks it up from there.

https://www.servicenow.com/docs/bundle/zurich-integrate-applications/page/administer/export-sets/con...

This operation takes hours, and I would recommend testing the maximum number of records that can be exported in a single transaction. We tested bulk import using several methods, and on analysis we noticed that once the record count crossed ~950,000 we ran into issues, so we split the CSV file into chunks of at most 900,000 records and imported those. Do a similar exercise in your development environment to simulate the number of records to be exported, and schedule the activity during off-peak hours so the export set does not hit resource bottlenecks.

Another option you can try is a data stream. Typically we use data streams to process large numbers of records coming from an external system into ServiceNow; see whether it works the other way around, with the third party consuming data from ServiceNow.

https://www.servicenow.com/community/developer-blog/process-large-data-sets-smoothly-with-flow-desig...

https://www.servicenow.com/docs/bundle/zurich-integrate-applications/page/administer/integrationhub/...

As per community guidelines, you can accept more than one answer as an accepted solution. If my response helped answer your query, please mark it as helpful and accept it as a solution.

Thanks,

Bhuvan

Thanks Bhuvan, this is really insightful. It’s good to know that you’ve successfully handled ~4.5M records using the Import Set/MID Server approach.

Is the CSV splitting something that has to be done manually, or did you automate it with scripts/tools? Just wondering if there’s a recommended way to make that part less complex when dealing with millions of records.
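In case it helps to compare notes, below is the kind of split script I would imagine automating that step with. The 900,000-row chunk size just mirrors the limit you mentioned, and the file names are made-up placeholders; it's only an illustrative sketch, not something I've run against a real export.

```python
import csv
from pathlib import Path

SOURCE_FILE = Path("export_full.csv")   # placeholder name for the full export
CHUNK_ROWS = 900_000                     # max data rows per output file

def split_csv(source: Path, chunk_rows: int) -> None:
    """Split a large CSV into chunks, repeating the header row in each chunk."""
    with source.open(newline="", encoding="utf-8") as infile:
        reader = csv.reader(infile)
        header = next(reader)
        chunk_index, row_count, writer, outfile = 0, 0, None, None
        for row in reader:
            if row_count % chunk_rows == 0:
                # Start a new chunk file and write the header into it.
                if outfile:
                    outfile.close()
                chunk_index += 1
                outfile = open(f"{source.stem}_part{chunk_index}.csv",
                               "w", newline="", encoding="utf-8")
                writer = csv.writer(outfile)
                writer.writerow(header)
            writer.writerow(row)
            row_count += 1
        if outfile:
            outfile.close()
    print(f"Wrote {chunk_index} chunk(s), {row_count} data rows total")

if __name__ == "__main__":
    split_csv(SOURCE_FILE, CHUNK_ROWS)
```

A command-line `split -l` would also chop the file, but repeating the header row in every chunk, which CSV import sources usually expect, is where a small script like this earns its keep.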

On the Data Stream option you mentioned, I believe it requires a separate IntegrationHub license.