Exporting more than 250,000 XML records from a table (export limit)

cicgordy
Tera Guru

Hi all, 

I am trying to export over 250,000 XML records from a table. The export limit system property is set to a default maximum of 10,000 records. Can I change this and, if so, how do I raise the XML export limit to over 250,000? Will the system allow me to do this, or will it run into issues with such a high number?

Thanks 🙂


4 REPLIES

Mark Manders
Mega Patron

Even if you get this to work, you will run into issues when loading the records back into this (or another) instance because of the size of the XML. The best way forward is to export in batches; that makes sure you can actually use the files again.

But yes, you can change the default to a higher volume. Just be aware that these limits are in place for a reason: exporting volumes like this can cause performance issues.


Please mark any helpful or correct solutions as such. That helps others find their solutions.
Mark
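
For reference, the limit lives in a sys_properties record, so raising it comes down to updating that record (from the System Properties list, or via the API). Below is a minimal sketch over the REST Table API; the property name glide.ui.export.limit (with glide.xml.export.limit sometimes cited as an XML-specific override), the instance URL, and the credentials are assumptions to verify against your own instance.

```python
# Minimal sketch: look up the assumed export-limit property via the REST
# Table API and raise its value. Property name, instance URL, and
# credentials are placeholders -- verify them on your instance first.
import requests

INSTANCE = "https://yourinstance.service-now.com"  # hypothetical instance URL
AUTH = ("admin", "password")                       # placeholder credentials
PROP_NAME = "glide.ui.export.limit"                # assumed property name

# Find the sys_properties record that holds the export limit.
resp = requests.get(
    f"{INSTANCE}/api/now/table/sys_properties",
    params={"sysparm_query": f"name={PROP_NAME}",
            "sysparm_fields": "sys_id,name,value"},
    auth=AUTH,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
records = resp.json()["result"]

if records:
    # Raise the limit; keep the performance warning above in mind.
    prop_sys_id = records[0]["sys_id"]
    update = requests.patch(
        f"{INSTANCE}/api/now/table/sys_properties/{prop_sys_id}",
        json={"value": "260000"},
        auth=AUTH,
        headers={"Accept": "application/json"},
    )
    update.raise_for_status()
    print("Export limit now:", update.json()["result"]["value"])
else:
    print(f"{PROP_NAME} not found; it may need to be created on the instance")
```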

cicgordy
Tera Guru

Hi Mark, thanks for the quick response; understood. When you say the best way is to move them in batches, do you mean through an update set? If so, will I have to select and add all 250,000 records to the update set manually? Thanks

 

Mark Manders
Mega Patron

No, I mean splitting the records up so you end up with several XML files. You should also check the size of each XML once you download it. We once had a customer who attached a lot of video evidence to the stories they were testing. Exporting the stories for clone preparation was no issue (the stories were kept on DEV only and their number was not that high), but because of the attachments the exports became too large to load back into the instance. It took a lot of time to split the XML, get the sizes right, and eventually import everything again.

 


Please mark any helpful or correct solutions as such. That helps others find their solutions.
Mark
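
To make the batching concrete, here is a rough sketch (an assumption-laden example, not an official procedure) using ServiceNow's export-via-URL mechanism with non-overlapping creation-date windows, so each chunk is written to its own XML file in the usual re-importable unload format. The instance URL, credentials, table name, and date windows are placeholders; this assumes URL export and basic authentication are permitted on the instance, and each window has to stay under the export limit.

```python
# Sketch of a chunked XML export: one export-via-URL call per creation-date
# window, each saved to its own file. All names and dates are placeholders.
from urllib.parse import quote
import requests

INSTANCE = "https://yourinstance.service-now.com"  # hypothetical instance URL
AUTH = ("admin", "password")                       # placeholder credentials
TABLE = "incident"                                 # example table

# Non-overlapping windows that together must cover the whole table.
windows = [
    ("2023-01-01 00:00:00", "2023-07-01 00:00:00"),
    ("2023-07-01 00:00:00", "2024-01-01 00:00:00"),
    ("2024-01-01 00:00:00", "2024-07-01 00:00:00"),
]

for i, (start, end) in enumerate(windows, start=1):
    # Encoded query: created on/after the window start and before the window end.
    query = f"sys_created_on>={start}^sys_created_on<{end}"
    url = f"{INSTANCE}/{TABLE}_list.do?XML&sysparm_query={quote(query)}"
    resp = requests.get(url, auth=AUTH, timeout=600)
    resp.raise_for_status()
    with open(f"{TABLE}_part_{i:02d}.xml", "wb") as f:
        f.write(resp.content)
    print(f"wrote {TABLE}_part_{i:02d}.xml")
```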

cicgordy
Tera Guru

Another question: how do you make sure you don't miss any records when splitting them up? The system selects the first 10,000 records to export, so how do I then export the next batches correctly without overwriting or missing any records? Hope that makes sense. Thanks
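
One way to gain confidence that a set of chunk queries covers every record exactly once is to count the matches for each query with the REST Aggregate API and compare the sum to the table's total row count; if the queries are non-overlapping and the numbers add up, nothing was skipped or exported twice. The sketch below uses the same placeholder instance, credentials, and table as the export example above, with date-window queries of the same form.

```python
# Sanity check for chunked exports: the per-chunk counts must sum to the
# table total. Instance, credentials, table, and queries are placeholders.
import requests

INSTANCE = "https://yourinstance.service-now.com"  # hypothetical instance URL
AUTH = ("admin", "password")                       # placeholder credentials
TABLE = "incident"                                 # example table

def count(query=""):
    """Return the number of records matching an encoded query."""
    resp = requests.get(
        f"{INSTANCE}/api/now/stats/{TABLE}",
        params={"sysparm_count": "true", "sysparm_query": query},
        auth=AUTH,
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    return int(resp.json()["result"]["stats"]["count"])

chunk_queries = [
    "sys_created_on>=2023-01-01 00:00:00^sys_created_on<2024-01-01 00:00:00",
    "sys_created_on>=2024-01-01 00:00:00^sys_created_on<2025-01-01 00:00:00",
]

total = count()                          # all records in the table
per_chunk = [count(q) for q in chunk_queries]
print("table total:", total)
print("chunk sum:  ", sum(per_chunk), per_chunk)
# Counts can drift if records are created while exporting, so run this
# check close to the export itself.
assert sum(per_chunk) == total, "chunks miss or double-count some records"
```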