Loading a large XML file goes to 2% and dies. How can I increase memory for XML importing?

Cory Miller
Giga Expert


8 Replies

The SN Nerd
Giga Sage

Consider using a Data Source and Import Sets for importing large sets of data.



If you are importing ServiceNow objects, consider using Update Sets.



User sessions have transaction timeouts.
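Since the failure mode here is a session transaction timeout, one workaround in this spirit is to push the data from outside the browser session in small batches, for example against the Import Set REST API (`POST /api/now/import/{staging_table}`). A rough sketch only — the instance name, staging table, and credentials are placeholders, not values from this thread:

```python
def batches(rows, size):
    """Yield the rows in consecutive batches of at most `size` each."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def load_rows(rows, instance, staging_table, auth, batch_size=500):
    """POST each row to the Import Set REST API, batch by batch.

    `instance`, `staging_table`, and `auth` are placeholders and must
    match your own instance and import set staging table.
    Requires the third-party `requests` package.
    """
    import requests
    url = f"https://{instance}.service-now.com/api/now/import/{staging_table}"
    for batch in batches(rows, batch_size):
        for row in batch:
            # Each POST creates one staging-table row; transform maps
            # then move it to the target table.
            r = requests.post(url, auth=auth, json=row,
                              headers={"Accept": "application/json"})
            r.raise_for_status()
```

Posting row by row is slow, but no single request runs long enough to hit a transaction timeout, and each completed batch is a natural checkpoint if the load has to be resumed.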



ServiceNow Nerd
ServiceNow Developer MVP 2020-2022
ServiceNow Community MVP 2019-2022

This is data only, not an update set.


I have imported data through various Excel spreadsheets, scrubbed the data, and exported the table data to an XML file to re-import into our Production instance. This particular table has 507,000 records and is 1.6 GB.
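One way to keep this export/re-import workflow is to split the 1.6 GB file into several smaller files and import them one at a time. A sketch, assuming the file is a standard table export whose root is `<unload>` with one child element per record (the output file names are made up):

```python
import xml.etree.ElementTree as ET

def split_unload(src_path, records_per_file, out_prefix="part"):
    """Stream a large <unload> export and rewrite it as several smaller
    <unload> files of at most `records_per_file` records each.
    Returns the list of files written."""
    written, batch = [], []

    def flush():
        if not batch:
            return
        out_root = ET.Element("unload")
        out_root.extend(batch)
        path = f"{out_prefix}_{len(written):03d}.xml"
        ET.ElementTree(out_root).write(path, encoding="unicode")
        written.append(path)
        batch.clear()

    # iterparse streams the file, so the whole export never sits in memory
    it = ET.iterparse(src_path, events=("start", "end"))
    _, root = next(it)            # the opening <unload> tag
    depth = 1
    for event, elem in it:
        if event == "start":
            depth += 1
            continue
        depth -= 1
        if depth == 1:            # one complete record element just closed
            batch.append(elem)
            if len(batch) >= records_per_file:
                flush()
                root.clear()      # drop already-written records from memory
    flush()                       # write any remainder
    return written
```

Each resulting file is itself a valid `<unload>` document, so it can be loaded through the same Import XML mechanism, just in pieces small enough not to die mid-import.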


Aditya Telideva
ServiceNow Employee

Hi Cory,


Go to Remote Update Sets and see if you have one with the same name as your local update set. Verify that its number of updates matches your local update set. If all is good, then the solution is easy:


1. First, get the sys_id of the remote update set.


2. Then substitute that sys_id and your instance name into the following URL and open it:


https://INSTANCE-NAME.service-now.com/export_update_set.do?sysparm_sys_id=REMOTE-UPDATESET-SYSID&sys...
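Step 2 can be sketched as a small helper. The URL in the post is truncated after `&sys...`, so only the `sysparm_sys_id` parameter is shown here; both arguments are placeholders for your own instance name and the sys_id from step 1:

```python
def update_set_export_url(instance, remote_update_set_sys_id):
    """Build the export_update_set.do URL from step 2.

    `instance` and `remote_update_set_sys_id` are placeholders; the
    original post's query string is truncated, so only the
    sysparm_sys_id parameter is reproduced.
    """
    return (f"https://{instance}.service-now.com/export_update_set.do"
            f"?sysparm_sys_id={remote_update_set_sys_id}")
```

Opening the resulting URL in a browser session that is logged in to the instance downloads the remote update set as XML.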



If the remote update set is not available, let me know and I can provide another solution that basically emulates the Export XML script.


Thanks,


Aditya Telidevara

