Optimize fix script to run more efficiently

mkader
Kilo Guru

Hello,

I have a fix script that copies records from one table and inserts them into another table. There are over 500,000 records to copy. The fix script works fine; however, when I run it, it takes a very long time and will often time out. To fully execute the fix script, I have to run it several times. This is just a one-off, so we want to do it with a fix script. Below is the script I have written.

var gr = new GlideRecord('u_table_1');
gr.query();
while (gr.next()) {
    var gr1 = new GlideRecord('u_table_2');
    gr1.addQuery('u_description', gr.u_description);
    gr1.addQuery('u_short_description', gr.u_short_description);
    gr1.addQuery('u_name', gr.u_name);
    gr1.query();
    if (gr1.next()) {
        // record already exists, do nothing
    } else {
        // not found, so insert
        gr1.initialize();
        gr1.u_description = gr.u_description;
        gr1.u_short_description = gr.u_short_description;
        gr1.u_name = gr.u_name;
        gr1.insert();
    }
}

Is there a way to segment the fix script so that it runs and completes much more quickly?

Thanks!

1 ACCEPTED SOLUTION

Try this. 

var gr = new GlideRecord('u_table_1');
gr.query();

while (gr.next()){
    var gr1 = new GlideRecord('u_table_2');
    gr1.addQuery('u_description',gr.u_description);
    gr1.addQuery('u_short_description',gr.u_short_description);
    gr1.addQuery('u_name',gr.u_name);
    gr1.setLimit(1);
    gr1.query();

    if(!gr1.hasNext()){
        gr1.initialize();
        gr1.u_description = gr.u_description;
        gr1.u_short_description = gr.u_short_description;
        gr1.u_name = gr.u_name;
        gr1.setWorkflow(false); // skip business rules and other engines on this insert
        gr1.insert();
    }
}

If it still stops after a few minutes, try running it as a background script, then check the logs for any other errors. Also, can you confirm that both tables are in the same scope?
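The remaining cost in the script above is the per-row lookup against u_table_2. A minimal sketch of the same "insert if missing" logic in plain JavaScript, using hypothetical in-memory arrays in place of GlideRecord queries, shows the alternative: load the existing keys once into a map, then do a single pass over the source. The `'\u0000'` separator is an arbitrary choice for joining the three matched fields into one composite key.

```javascript
// Build a composite key from the three fields the script matches on.
function compositeKey(rec) {
    return [rec.u_name, rec.u_short_description, rec.u_description].join('\u0000');
}

// Return the source records that do not yet exist in the target.
// Loading the target keys once replaces the per-row query, which is
// the main cost when the source table has 500,000 rows.
function findMissing(sourceRecords, targetRecords) {
    var existing = {};
    targetRecords.forEach(function (rec) {
        existing[compositeKey(rec)] = true;
    });
    return sourceRecords.filter(function (rec) {
        return !existing[compositeKey(rec)];
    });
}
```

On the platform itself the same idea would mean one query over u_table_2 to build the key map, then one query over u_table_1 doing only inserts; whether the key map fits comfortably in memory depends on the data, so treat this as a sketch rather than a drop-in replacement.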

Regards,
Muhammad


27 REPLIES

Hi,

First of all, estimate how much time it will take to process 2 lakh (200,000) records.

Say 4 hours.

Then the best thing is to create a scheduled job that runs the same logic, repeating every 5 hours.

This way you don't need to go back and press the fix script button each time.

Disable the job after everything is copied.

I followed the same approach when I had to delete 15 lakh (1.5 million) records: I created a scheduled job and let it run.
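The back-of-the-envelope estimate suggested above can be sketched as a tiny helper. The records-per-second rate is a hypothetical number you would measure from a small trial run of the fix script, not something the thread provides.

```javascript
// Estimate total runtime in hours from a measured processing rate.
function estimateHours(recordCount, recordsPerSecond) {
    return recordCount / recordsPerSecond / 3600;
}

// e.g. 200,000 records at a measured ~14 records/second comes out
// just under 4 hours, matching the "say 4 hours" figure above.
var hours = estimateHours(200000, 14);
```

Measuring the rate on a few thousand records first, then sizing the scheduled-job interval from the estimate, avoids guessing at the repeat period.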

 

Thanks

Sudhanshu

MrMuhammad
Giga Sage

Hi @mkader,

Why don't you increase the transaction quota rule for fix scripts? Out of the box it is 4 hours (14,400 seconds), and once you reach the time limit the fix script stops and gives a timeout error.

As this is a one-time job, I would suggest increasing the maximum duration. That way you avoid running the job in chunks or creating multiple jobs.



Another approach we typically use is to run the job in chunks. For example, you can set the query on the table 1 GlideRecord to all records created between 08/2018 and 08/2019; in the next iteration, the query becomes 09/2019 to 08/2020, and so on. This, however, requires running the fix script multiple times or writing multiple jobs.

I would recommend going with the first option, which will save you a lot of effort and time.
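The date-window chunking described above can be sketched in plain JavaScript as a window generator. Each window could then drive a created-date range condition (e.g. on sys_created_on, assuming that is the field you chunk by) in a separate fix-script run or scheduled job; the window generation itself is shown here.

```javascript
// Convert (year, month) to a single month index so the window
// arithmetic stays simple, and back again.
function toIndex(year, month) { return year * 12 + (month - 1); }
function toYM(index) { return { year: Math.floor(index / 12), month: (index % 12) + 1 }; }

// Split the inclusive range [start, end] (month granularity) into
// consecutive windows of at most windowMonths months each.
function monthWindows(start, end, windowMonths) {
    var windows = [];
    var i = toIndex(start.year, start.month);
    var last = toIndex(end.year, end.month);
    while (i <= last) {
        var j = Math.min(i + windowMonths - 1, last);
        windows.push({ from: toYM(i), to: toYM(j) });
        i = j + 1;
    }
    return windows;
}
```

For the example in the post, 12-month windows starting at 08/2018 yield 08/2018 to 07/2019, then 08/2019 to 07/2020, and so on, with the last window clamped to the overall end date.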

 

Thanks & Regards,

Sharjeel


My fix script fails/times out after under 10 minutes, so I am not sure this is the right fix?

Have you tried running it in the background?


When you say complete, do you mean it took under 10 minutes to insert 500,000 records into table 2?

Regards,
Muhammad