Bulk data transfer to an external system

Jesufair J
Tera Contributor

I'm exploring efficient ways to transfer millions of rows of table data to an external system and would appreciate your insights. What approaches have you found effective for handling such large-scale data transfers without compromising performance? I'm currently considering REST APIs as a potential solution, but I'm open to other suggestions. If possible, please share your experiences or recommendations in detail.

Thanks in advance!


Ankur Bawiskar
Tera Patron

@Jesufair J 

You can use the Export Set feature and place the file (Excel, CSV, etc.) on a MID Server.

Ask the customer's team to pick up that file from the MID Server location.

You can schedule your export and configure the table, fields, frequency, etc. for the push.

Export sets 

OR

Expose a Scripted REST API endpoint and have the third party pull the data based on your filter conditions for the table.
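For illustration, a minimal sketch of such a scripted resource with offset/limit pagination; the table ('incident'), the fields, and the 'active=true' filter are placeholders for your actual requirements:

```javascript
// Scripted REST Resource (GET): returns one page of records per call.
// The external system keeps incrementing 'offset' until 'count' < 'limit'.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var limit = parseInt(request.queryParams.limit || '1000', 10);
    var offset = parseInt(request.queryParams.offset || '0', 10);

    var gr = new GlideRecord('incident');    // placeholder table
    gr.addEncodedQuery('active=true');       // placeholder filter
    gr.orderBy('sys_created_on');            // stable ordering for paging
    gr.chooseWindow(offset, offset + limit); // fetch only this window
    gr.query();

    var rows = [];
    while (gr.next()) {
        rows.push({
            sys_id: gr.getUniqueValue(),
            number: gr.getValue('number'),
            short_description: gr.getValue('short_description')
        });
    }

    // Returning an object serializes it as the JSON response body
    return { offset: offset, limit: limit, count: rows.length, records: rows };
})(request, response);
```

This keeps each response small and lets the third party control the pace of the pull.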

If my response helped, please mark it correct and close the thread so that it benefits future readers.

Regards,
Ankur
✨ Certified Technical Architect  ||  ✨ 9x ServiceNow MVP  ||  ✨ ServiceNow Community Leader

Thanks, Ankur, for the reply.

For the Export Set via MID Server option, since you mentioned exporting as a CSV or Excel file, is it practical to handle millions of rows in a single file? Or would you recommend splitting it into smaller files for better performance and reliability?

In your experience with the Scripted REST API approach, how does it scale for very large datasets (millions of rows)? Would you recommend streaming/pagination over returning bulk payloads to avoid timeouts?

palanikumar
Mega Sage

Hi,

Running a REST API export for millions of records may impact platform performance. You should add a wait time between batches (e.g., every 1,000 records) to avoid overloading the system.
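For example, the consuming side could page through the standard Table API with a pause between batches. A rough sketch (Node.js 18+; the instance URL, credentials, table, and filter are placeholders):

```javascript
// Hypothetical consumer: pulls 1,000 records per call via the Table API,
// pausing between batches to reduce load on the instance.
const BASE = 'https://<instance>.service-now.com/api/now/table/incident'; // placeholder
const AUTH = 'Basic ' + Buffer.from('user:password').toString('base64');  // placeholder
const LIMIT = 1000;
const PAUSE_MS = 2000; // wait time between batches

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function pullAll() {
    let offset = 0;
    for (;;) {
        const url = `${BASE}?sysparm_limit=${LIMIT}&sysparm_offset=${offset}` +
                    '&sysparm_query=active=true^ORDERBYsys_created_on'; // placeholder filter
        const res = await fetch(url, {
            headers: { Authorization: AUTH, Accept: 'application/json' }
        });
        if (!res.ok) throw new Error('HTTP ' + res.status);
        const { result } = await res.json();

        // ...hand this batch to the external system here...
        console.log(`Fetched ${result.length} records at offset ${offset}`);

        if (result.length < LIMIT) break; // last page reached
        offset += LIMIT;
        await sleep(PAUSE_MS);            // throttle between batches
    }
}

pullAll().catch(console.error);
```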

There are also features like Instance Data Replication that are designed to avoid such performance issues. You can check whether that fits your use case.

Thank you,
Palani

Thanks, Palani.

I agree that running REST APIs for millions of records could create performance issues, and adding wait times/batching is a good safeguard. Have you tried this approach yourself? Does it work well for large data sets?

I’ll also explore the Instance Data Replication option you mentioned to see whether it fits our use case. Feel free to share any resources if you have them handy. 🙂