Is it possible to import multiple XML files at once?

Dan51
Giga Contributor

Hello!

Before a clone I export my update sets to XML and save them in a folder, then once the clone has happened I import the XML files one by one. This seems like a waste of time, and the manual step adds a human layer that is prone to error.

Is there a way to automate this, either by picking up every file in a folder or by selecting multiple XML files to import so they run sequentially?

Thanks a lot!

1 ACCEPTED SOLUTION

Ankur Bawiskar
Tera Patron

Hi Dan,

Unfortunately you would need something custom for this, i.e.:

1) set up a MID Server

2) write a MID Server script include that fetches the XML files and uploads them to the instance

3) add code similar to what runs when applying an update set, i.e. preview, commit, etc. (a rough sketch of step 2 follows below)

So I think the best method would be to keep the human intervention and avoid any errors.
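For anyone who does want to attempt the custom route, here is a minimal, untested sketch of what step 2 might look like. Everything in it is an assumption: the folder path and the function name are placeholders, and it only covers collecting the XML payloads on the MID Server via the standard Java classes exposed through Packages.*. Loading, previewing and committing each payload on the instance (step 3) would still need its own logic.

// Hedged sketch only: hypothetical MID Server-side JavaScript that gathers the
// exported update set XML files from a local folder. The folder path and the
// function name are made up for illustration; only standard Java I/O is used.
function collectUpdateSetXml(folderPath) {
    var dir = new Packages.java.io.File(folderPath);
    var entries = dir.listFiles();
    var payloads = [];
    for (var i = 0; entries !== null && i < entries.length; i++) {
        var name = String(entries[i].getName());
        if (name.toLowerCase().lastIndexOf('.xml') !== name.length - 4)
            continue; // skip anything that is not an XML export
        // read the whole file as a UTF-8 string using plain Java NIO
        var bytes = Packages.java.nio.file.Files.readAllBytes(entries[i].toPath());
        payloads.push(String(new Packages.java.lang.String(bytes, 'UTF-8')));
    }
    return payloads; // each payload still has to be loaded, previewed and committed on the instance
}

var xmlPayloads = collectUpdateSetXml('/export/updatesets'); // hypothetical folder with the exports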


Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader


17 REPLIES

Vikas singh
ServiceNow Employee

Write a JUnit test and run it on the instance where you want to load the XML files.
Inside the JUnit test, put all your XML files in the resources folder.
Don't use the clear resource function in your code.

Santi Cots
Tera Contributor

Hi,

After exporting the desired records using Export XML, I use the following script to merge them into a single file that I can successfully import:

#!/bin/bash

# Sample use: ./xmlmerge.sh './*.xml' > all.xml

files=($1)          # expand the quoted glob into an array of files (bash arrays, hence bash not sh)
first=${files[0]}   # first file in the array

head -n 2 "$first"  # keep the XML declaration and the opening <unload> tag once
for file in "${files[@]}"; do
    tail -n +3 "$file" | grep -v '</unload>'   # append each file's records, skipping its header and closing tag
done
echo '</unload>'    # close the combined document with a single </unload>


Chris H2
Giga Guru

Hello Dan,

I came across your post while searching for a solution to a different problem. In case anyone else finds this thread, there is a very easy solution to your specific use case. Hopefully:

A) You solved this problem a very long time ago, and

B) This solution can help others who now come across your post with the same issue.

The solution is: create an Update Set to use as a batch base (e.g. "Dan's Pre-Clone Preserved Update Sets"), then populate the Parent field of all the Update Sets you would like to preserve with this batch parent. You can then export all of the Update Sets to a single XML file, and import them in a single action once the clone is complete (a script sketch for the re-parenting step follows below).

Documentation for this feature is here: https://docs.servicenow.com/bundle/tokyo-application-development/page/build/system-update-sets/hier-...
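If there are many update sets to re-parent, a background script can set the batch parent in bulk instead of editing each record by hand. This is only a sketch under assumptions: the parent sys_id and the name filter below are placeholders you would replace with your own values.

// Hedged sketch: bulk-assign a batch parent to the update sets you want to preserve.
// Run as a background script; replace the placeholder sys_id and query with your own.
var batchParentSysId = 'PARENT_UPDATE_SET_SYS_ID'; // sys_id of e.g. "Dan's Pre-Clone Preserved Update Sets"

var us = new GlideRecord('sys_update_set');
us.addQuery('state', 'complete');            // only completed update sets (assumed filter)
us.addQuery('name', 'STARTSWITH', 'Dan');    // placeholder filter for the sets to preserve
us.query();
var count = 0;
while (us.next()) {
    us.parent = batchParentSysId;            // the Parent field is what drives update set batching
    us.update();
    count++;
}
gs.info('Re-parented ' + count + ' update sets');

Once the Parent field is populated, exporting from the batch parent should produce the single XML file Chris describes.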


Kind regards,

Chris

Jason Siegrist
Giga Guru

I was hoping I could find a way to easily merge two XMLs together and add them as one ... In fact, I am looking to merge about 108 different XML data loads.