11-20-2022 08:19 AM
Hello @basantsoni,
When it comes to bulk loading or bulk deletion, you can do it in three ways:
1) Background script
2) Scheduled job
3) Fix script
Background script: https://developer.servicenow.com/blog.do?p=/post/training-scriptsbg/
Refer to the link above to understand how useful a background script can be.
Scheduled job: You can run it on demand by clicking Execute Now, or you can have the script run at a specific time without any continuous monitoring (a minimal script sketch follows the list below).
You can automate the following kinds of tasks:
- Automatically generate and distribute a report
- Automatically generate and schedule an entity record, such as an incident, change item, or configuration item, from a template
- Run scheduled jobs from scripts or business rules
- Schedule jobs to run at the end of the month
- Schedule jobs to run on weekdays
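For illustration, here is a minimal sketch of what the script body of a Scheduled Script Execution might look like for a batched cleanup. The table, query conditions, state values, and batch size are assumptions for the example, not requirements:

```javascript
// Scheduled Script Execution body: runs server-side on the schedule you define.
// Assumed example: close incidents resolved more than 90 days ago, one batch per run.
var BATCH_SIZE = 1000;

var gr = new GlideRecord('incident');
gr.addQuery('state', 6);                         // 6 = Resolved (assumed out-of-box value)
gr.addQuery('resolved_at', '<', gs.daysAgo(90)); // resolved more than 90 days ago
gr.setLimit(BATCH_SIZE);                         // process only one chunk per scheduled run
gr.query();

var count = 0;
while (gr.next()) {
    gr.state = 7;          // 7 = Closed (assumed out-of-box value)
    gr.setWorkflow(false); // optionally skip business rules/notifications for bulk updates
    gr.update();
    count++;
}
gs.info('Scheduled cleanup closed ' + count + ' incidents in this run');
```

Because the job fires repeatedly on its schedule, each run handles one chunk and the backlog drains over time without manual monitoring.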
Fix scripts:
A fix script is server-side JavaScript code that runs after an application is installed or upgraded.
Include fix scripts to make changes that are necessary for the data integrity or product stability of an application.
After you transfer an application to another instance, you can manually run any custom fix scripts you created. You can also install or repair the associated application to automatically run its associated fix scripts. Upgrading the instance also runs fix scripts.
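As a sketch only, a fix script body is plain server-side JavaScript; the table, field, and default value below are hypothetical, chosen just to show the one-time data-integrity pattern:

```javascript
// Fix script body: one-time server-side code, run on demand after install/upgrade.
// Hypothetical example: backfill a field that older records are missing.
var gr = new GlideRecord('incident');
gr.addNullQuery('category');     // only records missing the assumed field
gr.query();
gs.info('Fix script will backfill ' + gr.getRowCount() + ' records');
while (gr.next()) {
    gr.category = 'inquiry';     // assumed default value for the example
    gr.setWorkflow(false);       // don't fire business rules for a pure data fix
    gr.autoSysFields(false);     // leave sys_updated_* fields untouched
    gr.update();
}
```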
One thing to keep in mind, whichever of these scripts you use: filtering the records is very important.
Say you are updating or deleting a few million records in the incident table. If you process everything in a single pass, it can take a very long time because the volume is so large.
If you instead split the records and run the script in batches, it takes less time, since you are deleting or updating the records in chunks rather than all at once.
So always delete, update, or insert records in chunks rather than all at once. And if you only need to update a few records, filter to just those records so you don't have to query the whole table: build an encoded query that returns only the records that should be updated or deleted, which avoids processing every record in the table.
This is why your scheduled job took less time: you used an encoded query.
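To make the chunking advice concrete, here is a minimal sketch of a batched delete driven by an encoded query; the query string and chunk size are assumptions to adapt to the real data set:

```javascript
// Delete matching records in chunks rather than all at once.
// The encoded query and chunk size are assumptions for the example.
var QUERY = 'active=false^sys_created_on<javascript:gs.daysAgoStart(365)';
var CHUNK = 1000;

var deleted;
do {
    deleted = 0;
    var gr = new GlideRecord('incident');
    gr.addEncodedQuery(QUERY); // touch only the records that actually need deleting
    gr.setLimit(CHUNK);        // one manageable chunk per pass
    gr.query();
    while (gr.next()) {
        gr.deleteRecord();
        deleted++;
    }
    gs.info('Deleted ' + deleted + ' records in this chunk');
} while (deleted === CHUNK);   // stop once a pass returns fewer than a full chunk
```

The same body works as a background script, a scheduled job, or a fix script; only the execution context changes.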
Hope this helps.
Please mark my answer correct if it helped you.
Thanks
11-20-2022 09:10 AM - edited 11-20-2022 09:11 AM
Hi @basantsoni,
A scheduled job (Scheduled Script Execution) always runs in the background and executes server-side code. As the name suggests, it can be scheduled as per your requirement and can be run on demand as well. It also offers a Run as field, so the script can execute as any user.
Fix scripts are server-side code that can run in the background or foreground as needed. They can only be executed on demand, and they always run as the logged-in user.
Please mark the answer as correct if it answered your query; it will be helpful for others looking for similar questions.
Thanks and Regards,
Saurabh Gupta