Copying Large Data Tables Across Environments

jmiskey
Kilo Sage

We have a custom table in our Production environment.  It is in a Scoped Application, extends the Task table, and has quite a number of fields (128, to be exact).  

 

I need to copy the current production data from this table down to our lower environments, so I can test out some scripts that need to be run against it.  There are currently just about 1400 records in this table.  If I export them to an XML file, the file is about 175,000 KB and has about 195,000 rows in it (it took my computer over 10 minutes just to open the file so I could get a row count!).
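
For reference, the same XML can also be pulled straight from a URL of the form below (this is the standard list export; the instance, table, and query are placeholders), which skips the UI export dialog entirely:

    https://<instance>.service-now.com/<table_name>_list.do?XML&sysparm_query=<encoded_query>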

 

So I tried going to this table in our DEV environment, right-clicking, and selecting Import XML.  After a few minutes, I got a pop-up saying the operation was taking longer than expected and asking if I would like to wait, and I selected "Yes".  A few minutes later I was able to move around again.  However, I do not see all the records I tried to import.  I thought maybe the import was still running in the background, so I waited a while.  It has now been over a half-hour, and the new records are still not there.  And I do not know of any good way to check whether it is still running, or whether it hit any errors.
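
The best I have come up with so far is re-running a quick row count from Scripts - Background on the Dev instance to see whether the number is still creeping up.  A minimal sketch, with the table name as a placeholder:

    // Run in Scripts - Background on the Dev instance; re-run a few
    // minutes apart to see whether the imported row count is growing.
    var ga = new GlideAggregate('x_acme_my_table'); // placeholder table name
    ga.addAggregate('COUNT');
    ga.query();
    if (ga.next())
        gs.info('Current row count: ' + ga.getAggregate('COUNT'));

The System Log around the import window is presumably the other place to look for errors, but I was hoping there is something more direct.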

 

Am I going about this in the best manner?  The surest way would probably be to do a clone down from Prod to Dev, but we recently did that, and there are a lot of new projects being worked on in Dev, so we are not really at a point where we can do another one right now.  Is there an option to just copy/clone down a single table?

 

Just wondering what our options are for getting a fresh copy of our data down to Dev.

 

Thanks

7 Replies

Dr Atul G- LNG
Tera Patron

This might be helpful:

https://www.servicenow.com/community/developer-forum/most-efficient-way-to-transfer-tables-between-t...

 

Precision Bridge is a good way to do this.


The only thing I see "new/different" in that response is the reference to Precision Bridge, and I don't think that is going to be a viable solution for us.  It takes months to get new software/applications approved within our company, and I really don't have that kind of time.  I am not sure that they would approve something like that to copy over just one table anyway.

 

Thanks anyway.  Maybe I will see if I can limit the export based on Updated Date (export just the records updated since the last clone).  That file might be small enough for the export/import method to work.
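
If I go that route, I can sanity-check the filter from Scripts - Background before pulling the file.  A sketch, with the table name and last-clone date as placeholders:

    // Count the records updated since the (assumed) last clone date
    // before exporting them.
    var cutoff = '2025-01-01 00:00:00'; // placeholder last-clone date
    var ga = new GlideAggregate('x_acme_my_table'); // placeholder table name
    ga.addQuery('sys_updated_on', '>', cutoff);
    ga.addAggregate('COUNT');
    ga.query();
    if (ga.next())
        gs.info('Updated since ' + cutoff + ': ' + ga.getAggregate('COUNT') + ' records');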

Hi @jmiskey 

Thanks for the update. 

 

Got the point.

 

Yes, limit the records, or divide the export using different filter conditions so that at most 10k records go in each batch.
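
If a date filter does not split things evenly, another option is chunking by the leading character of sys_id.  A sketch (the table name is a placeholder) that prints the count behind each of the 16 chunk queries; each printed query can then be pasted into the list filter and exported one at a time:

    // Print the record count behind each sys_id-prefix chunk; export
    // each chunk separately using the printed encoded query.
    var table = 'x_acme_my_table'; // placeholder table name
    '0123456789abcdef'.split('').forEach(function (p) {
        var ga = new GlideAggregate(table);
        ga.addEncodedQuery('sys_idSTARTSWITH' + p);
        ga.addAggregate('COUNT');
        ga.query();
        if (ga.next())
            gs.info('sys_idSTARTSWITH' + p + ' -> ' + ga.getAggregate('COUNT'));
    });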


Hmmm... for some reason, that did not have the intended result.  Adding that criterion cut the number of records being exported from 1403 to 95 (quite significant!).  However, when I exported those 95 records to an XML file, the resulting file is still 120,000 KB, compared to 175,000 KB for the original full export. 

 

Not sure why this is.  I would have expected a much bigger reduction in size!  Maybe these newer records have more data than the older ones, but I would not expect that much more.  So I am still stuck with a large file that it seems I am unable to do much with.

 

Back to the drawing board!
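
Before abandoning the XML route entirely, one thing I may try is measuring which fields actually carry the bulk.  If a few large HTML or text fields account for most of it, trimming or excluding them could shrink the export considerably.  A rough sketch (the table name is a placeholder): it samples 50 records and totals each field's stored length.  Note that journal and audit history live in separate tables, so this only measures what sits on the record itself.

    // Rough sizing: total the stored length of each field across a
    // 50-record sample to see which fields dominate the export.
    var sizes = {};
    var gr = new GlideRecord('x_acme_my_table'); // placeholder table name
    gr.setLimit(50);
    gr.query();
    while (gr.next()) {
        var fields = gr.getFields();
        for (var i = 0; i < fields.size(); i++) {
            var name = fields.get(i).getName();
            var val = gr.getValue(name);
            if (val)
                sizes[name] = (sizes[name] || 0) + val.length;
        }
    }
    for (var f in sizes)
        if (sizes[f] > 100000) // only report fields over ~100 KB in the sample
            gs.info(f + ': ' + sizes[f] + ' chars');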