What is the most efficient approach for loading bulk data into ServiceNow?

Todd O
Tera Guru

Dear community,

I would like to architect an approach for loading massive amounts of data into ServiceNow (mostly custom tables). When I say massive, I mean some of the tables will have millions of rows, even exceeding 80 million rows. My data will NOT need transformation steps, as I'm loading it "as is" into the tables. The traditional approaches for loading data (e.g., data sources with a transform map, or web services) are taking way too long. It seems the transform mapping adds overhead that really slows down the load process. Does anyone have experience with, or recommendations for, the best approach to loading massive amounts of data?

Here are a few of my ideas, though I have yet to build a prototype for either. Keep in mind that I'm not concerned about coalescing for the initial load, since these are all new records.

1. Leverage a web service and send umpteen records in one payload: This would entail having a Huge-sized web service field (65,000 chars) and then sending as many JSON records as possible in one payload to the web service. I would then have an onBefore script that parses the JSON string into the individual records and does a direct insert into the table for each one. It would then set ignore = true, since I'm not leveraging the web service to do any transformation or inserting for me. (A sketch of this script follows after the list.)

2. Leverage a web service that raises events to defer the record inserts: This is very similar to the above approach, but instead of my onBefore script doing the inserting, it just raises umpteen events, one for each JSON object in the payload field, and hands the record data over to the event. That way, the web service can get its job done quickly and send back a response that it's finished right away. Then, the events will just chug on inserting records for as long as they need. We can even assign them to run in a specific queue to separate their processing. (A sketch of this variant also follows after the list.)
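
Purely as an illustration of idea #1, here is a minimal sketch of what that onBefore transform script might look like. It assumes a Huge string field named u_payload on the import set row and a custom target table named u_bulk_data whose columns match the JSON keys; both names are hypothetical.

// onBefore transform script (sketch)
var rows = JSON.parse(source.u_payload + ''); // coerce the GlideElement to a plain string
for (var i = 0; i < rows.length; i++) {
    var gr = new GlideRecord('u_bulk_data'); // hypothetical custom target table
    gr.initialize();
    for (var field in rows[i]) {
        gr.setValue(field, rows[i][field]);
    }
    gr.setWorkflow(false);   // skip business rules and workflows for speed
    gr.autoSysFields(false); // skip sys_created/sys_updated bookkeeping on the initial load
    gr.insert();
}
ignore = true; // the transform itself should not insert a row

setWorkflow(false) and autoSysFields(false) trade rule execution and sys_ field bookkeeping for raw insert speed, which seems acceptable for an initial "as is" load.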
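
And a sketch of the event-based variant in idea #2, assuming a custom event named u_bulk.insert_row registered in the Event Registry, plus the same hypothetical u_payload field and u_bulk_data table. One caveat: one event per record means one sysevent row per record, so at tens of millions of rows it may be worth batching several records per event.

// onBefore transform script (sketch): queue one event per record, then bail out
var rows = JSON.parse(source.u_payload + '');
for (var i = 0; i < rows.length; i++) {
    // parm1 carries the serialized record; the last argument names a dedicated event queue
    gs.eventQueue('u_bulk.insert_row', source, JSON.stringify(rows[i]), '', 'u_bulk_load');
}
ignore = true;

// Script Action registered for u_bulk.insert_row (sketch)
var row = JSON.parse(event.parm1);
var gr = new GlideRecord('u_bulk_data');
gr.initialize();
for (var field in row) {
    gr.setValue(field, row[field]);
}
gr.setWorkflow(false);
gr.autoSysFields(false);
gr.insert();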

Other thoughts, feedback, suggestions, or concerns about these approaches, or any other ideas, are very much appreciated. Thank you in advance.

Todd


5 REPLIES

DrewW
Mega Sage

For the transform maps you are using, did you turn off the running of business rules?



Also, with the transform map, which part is slow? The loading of the data or the actual transform part? I have seen our system do up to 200 or 300 records per second for something simple like you are describing. But I have also seen it take a really long time to load the data before it even starts the transform, due to the format.



Are your custom tables audited? If so, turn that off until the data is loaded.
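
Auditing is controlled by the collection-level dictionary entry for the table. As a rough sketch (assuming a hypothetical custom table named u_bulk_data), a background script like this could toggle it:

// Background script (sketch): disable auditing on the table before the load
var dict = new GlideRecord('sys_dictionary');
dict.addQuery('name', 'u_bulk_data'); // hypothetical custom table
dict.addNullQuery('element');         // the collection-level row has an empty element
dict.query();
if (dict.next()) {
    dict.setValue('audit', false);
    dict.update();
}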



The issue I see with your web service ideas is that the system will only allow one transaction per user to be running at a time. So I think you will run into that, or you will use up all the available semaphores your instance has while you load the data.



Is it such a bad thing that the data load takes a while running in the background? If you break it up into chunks (if that's even possible), you could run several in the background at a time and not impact performance that much.
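
As one illustration of the chunked background approach (all table and field names hypothetical): land the raw rows in a staging table first, then let a Scheduled Script Execution drain it one chunk per run.

// Scheduled Script Execution (sketch): process one chunk per run until the staging table is drained
var CHUNK_SIZE = 10000;
var staging = new GlideRecord('u_bulk_staging'); // hypothetical staging table, one JSON record per row
staging.addQuery('u_processed', false);
staging.setLimit(CHUNK_SIZE);
staging.query();
while (staging.next()) {
    var row = JSON.parse(staging.u_payload + '');
    var gr = new GlideRecord('u_bulk_data'); // hypothetical target table
    gr.initialize();
    for (var field in row) {
        gr.setValue(field, row[field]);
    }
    gr.setWorkflow(false);
    gr.autoSysFields(false);
    gr.insert();
    staging.u_processed = true; // mark the staging row so it is not picked up again
    staging.update();
}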


bryanbarnard
ServiceNow Employee

You definitely want to turn business rules off in the transform map, as well as any auditing on the target tables, as long as you don't need it.



Is this a one-time bulk load, or will it be a regularly recurring process?


It's a bulk load, but there will be some subsequent delta loads. I'm getting good performance now by following the suggestions here. Thanks, Bryan.


Hi Drew,


Thank you for the response. I took a bare-bones approach by 1) putting my data into CSV, 2) turning off business rules, 3) avoiding any onBefore/onAfter scripting, and 4) making all my target fields in the target table simple String types. I then loaded 10,000 records as a trial and found that it loads about 400 records per second. This is a significant improvement, and I thank you for the suggestions.



Thanks again and I'll keep this thread posted with additional information that I uncover.


Todd