
Optimize fix script to run more efficiently

mkader
Kilo Guru

Hello,

I have a fix script that copies records from one table and inserts them into another. There are over 500,000 records to copy. The fix script works, but it takes a very long time to run and often times out, so I have to run it several times to process everything. This is a one-off task, so we want to do it with a fix script. Below is the script I have written.

var gr = new GlideRecord('u_table_1');
gr.query();
while (gr.next()) {
    var gr1 = new GlideRecord('u_table_2');
    gr1.addQuery('u_description', gr.u_description);
    gr1.addQuery('u_short_description', gr.u_short_description);
    gr1.addQuery('u_name', gr.u_name);
    gr1.query();
    if (gr1.next()) {
        // record already exists, do nothing
    } else {
        // if not found, then insert
        gr1.initialize();
        gr1.u_description = gr.u_description;
        gr1.u_short_description = gr.u_short_description;
        gr1.u_name = gr.u_name;
        gr1.insert();
    }
}

Is there a way to segment the fix script so that it runs and completes more quickly?

Thanks!

1 ACCEPTED SOLUTION

Try this. 

var gr = new GlideRecord('u_table_1');
gr.query();

while (gr.next()){
    var gr1 = new GlideRecord('u_table_2');
    gr1.addQuery('u_description',gr.u_description);
    gr1.addQuery('u_short_description',gr.u_short_description);
    gr1.addQuery('u_name',gr.u_name);
    gr1.setLimit(1);
    gr1.query();

    if(!gr1.hasNext()){
        gr1.initialize();
        gr1.u_description = gr.u_description;
        gr1.u_short_description = gr.u_short_description;
        gr1.u_name= gr.u_name;
        gr1.setWorkflow(false); //stop execution of Business rule or other scripts.
        gr1.insert();
    }
}

If it still stops after 5 minutes, try running it as a background script and check the logs for any other issues. Also, can you confirm that both tables are in the same scope?
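A further optimization worth noting: the script above still issues one query against u_table_2 for every one of the 500,000+ source rows. A sketch of an alternative (not from this thread, and untested) is to read the existing keys from u_table_2 once into an in-memory lookup object, then insert only rows whose key is missing. Be aware this trades queries for memory, which may matter at this volume:

```javascript
// Sketch (untested): build a lookup set of existing keys once,
// instead of querying u_table_2 for every source row.
var existing = {};
var t2 = new GlideRecord('u_table_2');
t2.query();
while (t2.next()) {
    var key = t2.getValue('u_name') + '|' + t2.getValue('u_short_description') + '|' + t2.getValue('u_description');
    existing[key] = true;
}

var src = new GlideRecord('u_table_1');
src.query();
while (src.next()) {
    var k = src.getValue('u_name') + '|' + src.getValue('u_short_description') + '|' + src.getValue('u_description');
    if (!existing[k]) {
        var ins = new GlideRecord('u_table_2');
        ins.initialize();
        ins.u_description = src.u_description;
        ins.u_short_description = src.u_short_description;
        ins.u_name = src.u_name;
        ins.setWorkflow(false); // skip business rules on insert
        ins.insert();
        existing[k] = true; // avoid inserting duplicates within this run
    }
}
```

This turns roughly 500,000 lookup queries into a single full scan of the target table, at the cost of holding one key string per target record in memory.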

Regards,
Muhammad


27 REPLIES

@Fabian Kunzke - I am currently running it in the background, but it still fails. This is the error I was getting 

asifnoor
Kilo Patron

Hi,

Copying 500,000 records will take time, and it is not recommended to do it all at once. I suggest you do this in chunks, maybe 1,000 per run, and run it as a scheduled job every 5 minutes.

Also, create a field (e.g. "copied") on your source table, and whenever you copy a record, update that field to true. Find the code below.

 

var gr = new GlideRecord('u_table_1');
gr.addQuery('copied', '!=', 'true'); // only records not yet copied
gr.setLimit(1000); // process 1,000 records per run
gr.query();
var gr1 = new GlideRecord('u_table_2');
while (gr.next()) {
    gr1.initialize();
    gr1.u_description = gr.u_description;
    gr1.u_short_description = gr.u_short_description;
    gr1.u_name = gr.u_name;
    gr1.insert();
    gr.copied = true; // mark the source record as copied
    gr.update();
}
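With this chunked approach, a quick way to monitor progress between scheduled runs is to count the uncopied source records. A sketch (untested) using GlideAggregate, assuming the same "copied" flag field as above:

```javascript
// Sketch: count how many source records are still uncopied,
// e.g. logged at the end of each scheduled run.
var agg = new GlideAggregate('u_table_1');
agg.addQuery('copied', '!=', 'true');
agg.addAggregate('COUNT');
agg.query();
if (agg.next()) {
    gs.info('Records left to copy: ' + agg.getAggregate('COUNT'));
}
```

Once this count reaches zero, the scheduled job can be deactivated.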

Mark the comment as a correct answer and also helpful if it helps.

Hi,

For a simple and efficient solution, kindly use the approach I have suggested. This way you don't need to worry about how many records are copied and how many are left, as we track them using a separate column. You can delete the column once the requirement is done.

Mark the comment as a correct answer and helpful if it helps.

Abdul Azeez
Mega Guru

Hi,

 

You can run a script when you know you have 500,000 records. I would suggest you include setLimit(20000), which processes 20K records per run.

 

Another simple way is to apply the query to the list, export it to Excel, and import it into your target table u_table_2 via an import set. Simple and easy to do 🙂

 

Let me know if you need help on Import sets

 

Please mark it as helpful or correct based on the impact

Abdul Azeez

@Abdul Azeez - If I use setLimit(20000), does that mean I will need multiple fix scripts? How can I loop through to have it run the next batch of records? Can you please provide an example?
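One common answer to this question (a sketch not taken from this thread, and untested) is to page through the source table by sys_id, persisting the last sys_id processed in a system property between runs; the property name below is hypothetical:

```javascript
// Sketch: process windows of 20,000 records ordered by sys_id,
// resuming from the last sys_id seen in the previous run.
// 'u_table_1.copy.last_sys_id' is a hypothetical property name.
var last = gs.getProperty('u_table_1.copy.last_sys_id', '');
var gr = new GlideRecord('u_table_1');
if (last) {
    gr.addQuery('sys_id', '>', last); // resume after the last processed record
}
gr.orderBy('sys_id');
gr.setLimit(20000);
gr.query();
while (gr.next()) {
    // ... insert into u_table_2 as in the original script ...
    last = gr.getUniqueValue();
}
if (last) {
    gs.setProperty('u_table_1.copy.last_sys_id', last); // checkpoint for the next run
}
```

Re-running the same script (manually or as a scheduled job) then picks up where the previous run stopped, with no changes to the source table's schema.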