loading large xml file goes to 2% and dies. How can I increase memory for xml importing?
11-08-2017 06:40 PM
Labels: Best Practices

11-08-2017 07:07 PM
Consider using a Data Source and Import Sets to import large sets of data. If you are importing ServiceNow configuration objects, consider using Update Sets instead. Note that user sessions have transaction timeouts, which is likely why the import dies partway through.
ServiceNow Nerd
ServiceNow Developer MVP 2020-2022
ServiceNow Community MVP 2019-2022
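As the reply above suggests, the practical fix is to avoid pushing everything through one long user-session transaction and instead load the data in small batches, e.g. via an Import Set or the REST Table API. Below is a minimal sketch of the batching idea in Python; the instance URL, credentials, and table name are placeholders, and the actual POST call is left commented out so the batching logic can be seen on its own.

```python
# Sketch: push records to ServiceNow in small batches via the REST Table API
# (POST /api/now/table/{table}), instead of one huge XML import that can hit
# the session transaction timeout. Instance, table, and credentials below are
# placeholders -- adjust for your environment.
import json
import urllib.request
from base64 import b64encode

INSTANCE = "https://your-instance.service-now.com"  # placeholder
TABLE = "u_my_table"                                # placeholder table name
AUTH = b64encode(b"admin:password").decode()        # placeholder credentials

def batches(records, size):
    """Yield successive chunks of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def post_record(record):
    """POST one record to the Table API."""
    req = urllib.request.Request(
        f"{INSTANCE}/api/now/table/{TABLE}",
        data=json.dumps(record).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {AUTH}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    rows = [{"u_name": f"row {n}"} for n in range(10)]
    for chunk in batches(rows, 4):
        print(len(chunk))  # each call stays small: 4, 4, 2
        # for rec in chunk: post_record(rec)  # uncomment against a real instance
```

Keeping each request small means no single transaction runs long enough to be killed by the session timeout, and a failed batch can be retried without redoing the whole load.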
11-09-2017 06:09 AM
This is data only, not an update set.
I imported the data through various Excel spreadsheets, scrubbed it, and exported the table data to an XML file to re-import into our Production instance. This particular table has 507,000 records, and the XML file is 1.6 GB.
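One way to get a 1.6 GB unload into the instance without the import dying is to split it into several smaller XML files and import each one. Below is a minimal sketch assuming the usual ServiceNow unload layout (an `<unload>` root with one child element per record); the `u_my_table` element name in the example is made up.

```python
# Sketch: split a large ServiceNow XML unload (<unload> root, one child
# element per record) into smaller XML documents that can each be imported
# on their own. The record element name used in the demo is a placeholder.
import xml.etree.ElementTree as ET

def split_unload(xml_text, chunk_size):
    """Return a list of XML strings, each holding at most chunk_size records."""
    root = ET.fromstring(xml_text)
    records = list(root)
    chunks = []
    for i in range(0, len(records), chunk_size):
        part = ET.Element(root.tag, root.attrib)  # keep <unload> attributes
        part.extend(records[i:i + chunk_size])
        chunks.append(ET.tostring(part, encoding="unicode"))
    return chunks

if __name__ == "__main__":
    sample = "<unload>" + "".join(
        f"<u_my_table><sys_id>{n}</sys_id></u_my_table>" for n in range(5)
    ) + "</unload>"
    parts = split_unload(sample, 2)
    print(len(parts))  # 5 records in chunks of 2 -> 3 files
```

Note this sketch parses the whole document into memory; for a file the size described here you would want `ET.iterparse` instead, writing records out as they stream past rather than holding them all at once.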
11-08-2017 11:09 PM
Hi Cory,
Go to Remote Update Sets and see if you have one with the same name as your local update set. Verify that the number of updates matches your local update set. If all is good, the solution is easy:
1. Get the sys_id of the remote update set.
2. Append that sys_id and your instance name to the following URL and launch it:
If the remote update set is not available, let me know and I can provide another solution that essentially emulates the Export XML script.
Thanks,
Aditya Telidevara