Best practice for inserting a huge number of records

Makoto Uchiyam1
Tera Contributor

Hi experts,

 

I would like to insert records into another table based on information from one table in a ServiceNow instance. More than 10 million records need to be created, and with a simple script like the one below it would take more than 100 hours in total. Please advise on how to speed this up.

I couldn't find a GlideRecord API to insert multiple records at once.

 

var grInput = new GlideRecord("<Input Table>");
grInput.addQuery(<Condition>);
grInput.query();
while (grInput.next()) {
    var grOutput = new GlideRecord("<Output Table>");
    grOutput.initialize();
    // Copy field values from the source record
    grOutput.xxx = grInput.xxx;
    grOutput.yyy = grInput.yyy;
    grOutput.zzz = grInput.zzz;
    grOutput.insert();
}
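For reference, a rough sketch of the same loop with the usual per-insert overhead switches. setWorkflow(false) and autoSysFields(false) are standard GlideRecord methods that skip business rules/engines and the automatic sys_ field maintenance respectively; they trade side effects and auditing for speed, so whether they are acceptable here is only an assumption:

var grInput = new GlideRecord("<Input Table>");
grInput.addQuery(<Condition>);
grInput.query();
while (grInput.next()) {
    var grOutput = new GlideRecord("<Output Table>");
    grOutput.initialize();
    grOutput.setWorkflow(false);   // skip business rules, workflows and other engines
    grOutput.autoSysFields(false); // skip automatic sys_created_on/by etc. updates
    grOutput.xxx = grInput.xxx;
    grOutput.yyy = grInput.yyy;
    grOutput.zzz = grInput.zzz;
    grOutput.insert();
}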

 Regards,


5 REPLIES

Markus Kraus
Kilo Sage

I'd suggest you split the source data into exclusive, non-overlapping blocks and fire up the script in multiple sessions (e.g. by logging in with different browser profiles / different browsers / incognito mode).

This way you can have the script run in "parallel".
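For illustration, a minimal sketch of what one such block could look like, using GlideRecord's orderBy() and chooseWindow() so that each session only processes its own row window. The window bounds, tables and fields are placeholders, and it assumes the source data does not change while the copies are running:

var START = 0;       // first row of this session's block, e.g. 0, 1000000, 2000000, ...
var END = 1000000;   // upper bound of this session's row window

var grInput = new GlideRecord("<Input Table>");
grInput.addQuery(<Condition>);
grInput.orderBy("sys_id");        // deterministic order so the windows do not overlap
grInput.chooseWindow(START, END); // fetch only this session's rows
grInput.query();
while (grInput.next()) {
    var grOutput = new GlideRecord("<Output Table>");
    grOutput.initialize();
    grOutput.xxx = grInput.xxx;
    grOutput.yyy = grInput.yyy;
    grOutput.zzz = grInput.zzz;
    grOutput.insert();
}

Each session then runs the same script with different START/END values.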

Hi Markus,

 

Thanks for your fast reply!

I understand your suggestion, but it involves a lot of manual work.

I want to reduce manual work as much as possible. I plan to use "Scripts - Background" because I rely on its rollback function. Do you know how to run "Scripts - Background" itself in batch mode via scripting?

Unfortunately (at least to my knowledge) there is no "batch mode" for background scripts. Especially because of the rollback function, I would advise splitting the source data.

Note: As long as you intend to use a background script (for "rollbackability"), there is no way to speed this up, as every transaction is executed in a single user session. The only workaround is, as I mentioned, to fire up the background scripts using multiple sessions.

 

Theoretically you could also try the following if you have an Integration Hub Professional subscription:

Create a Data Source that uses a Data Stream Source (this supports pagination) and then configure the data source to do the splitting for you.

Are you able to create such a data stream in Flow Designer? It would look something like this:

[Screenshot: Data Stream action in Flow Designer]
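If such a Data Stream action exists, it can also be consumed from a server-side script via the Integration Hub FlowAPI, so the copy script iterates the paginated stream instead of one huge GlideRecord query. This is a rough sketch only: the action name and item fields are made-up placeholders, and the availability of executeDataStreamAction() should be checked against the FlowAPI documentation for your release:

// Assumption: a Data Stream action named "global.read_input_table" has been
// built in Flow Designer and paginates the <Input Table> records.
var stream = sn_fd.FlowAPI.executeDataStreamAction("global.read_input_table", {});
try {
    while (stream.hasNext()) {
        var item = stream.next(); // one item from the current page
        var grOutput = new GlideRecord("<Output Table>");
        grOutput.initialize();
        grOutput.xxx = item.xxx;  // placeholder field mapping
        grOutput.insert();
    }
} finally {
    stream.close(); // always release the stream, even if an insert fails
}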

 

Thanks for your reply.

 

I understand there is no way to execute "Scripts - Background" in batch mode.

I'm interested in your proposal to use a Data Stream. It looks like the Data Stream action is available in our environment, but I have never used it.

 

1. Am I correct in my understanding that the data stream will meet the following requirements of mine?

・Records can be migrated from one table to another within a single instance.

・Data can be rolled back with some granularity when errors occur.

・All of the split migration sessions can be run as a batch.

 

2. This approach appears to require an external DB server. Is that correct?

 

Regards,