How to process a large number of records in batches in scheduled jobs

spimple
Tera Contributor

I am querying a user group and want to find the members who last logged in more than 60 days ago, then remove them, either through a scheduled job script or Flow Designer. But the catch is that I need to process the records in batches: if there are 5000 records, it should, for example, take batches of 1000 at a time.
How do I achieve this?
I need to query a group first, then all the users in it, then check whether each user has logged in within the last 60 days, and finally remove the ones who haven't. Please tell me how to approach this.

4 REPLIES

Gummagatta Hars
Mega Guru

Hi @spimple,

You can create a scheduled job that removes the inactive members from the group using a script. Note that setLimit() cannot be used together with deleteMultiple(). Below is a script that removes users from groups who have not logged in to ServiceNow in the last 60 days:

var grMember = new GlideRecord('sys_user_grmember');
// Memberships whose user last logged in more than 60 days ago
grMember.addEncodedQuery('user.last_loginRELATIVELT@dayofweek@ago@60');
// Add grMember.addQuery('group', '<group sys_id>') here to scope this to a single group
grMember.query();
grMember.deleteMultiple(); // deletes every matching membership in one operation
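
Since (as noted above) setLimit() does not restrict deleteMultiple(), one way to get true batching is to query a limited chunk at a time and delete row by row. Here is a rough sketch of that pattern; the group sys_id is a placeholder you would replace, and the batch size of 1000 comes from your example:

var BATCH_SIZE = 1000;
var deleted;
do {
    deleted = 0;
    var gr = new GlideRecord('sys_user_grmember');
    gr.addQuery('group', '<group sys_id>'); // placeholder: the target group
    gr.addEncodedQuery('user.last_loginRELATIVELT@dayofweek@ago@60');
    gr.setLimit(BATCH_SIZE); // fetch at most one batch per query
    gr.query();
    while (gr.next()) {
        gr.deleteRecord(); // remove this membership record
        deleted++;
    }
} while (deleted === BATCH_SIZE); // a full batch means more rows may remain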

Let me know if you have any further queries.

If the solution shared works for you, please mark it "Accept as Solution" and "Helpful".

Thanks,

Gummagatta Harshavardhana

Hello, yes, we can do that, but my question is about large numbers of records. Say there are 5000: if the script processes them all at once, there will be a performance issue, so I want to process the records in batches, e.g. 1000 first and then the remaining ones.

How do I achieve that?

David94
Tera Contributor

Hi spimple,
This is probably no longer helpful to you, but I will pop the answer here to help others in the future. You can use the setLimit() function to create batches for your script.

var grMember = new GlideRecord('sys_user_grmember');
grMember.addEncodedQuery('user.last_loginRELATIVELT@dayofweek@ago@60');
grMember.setLimit(1000); // cap this query at one batch of 1000 records
grMember.query();
grMember.deleteMultiple();
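
One caveat: as written, this handles at most one batch per execution, so a scheduled job would whittle the backlog down by 1000 per run. To clear everything in a single run, you could loop until the query comes back empty. A sketch of that idea is below, where MAX_BATCHES is a made-up safety cap, and I delete row by row because setLimit() is reported not to apply to deleteMultiple():

var BATCH = 1000;
var MAX_BATCHES = 50; // hypothetical safety cap to avoid an unbounded loop
for (var i = 0; i < MAX_BATCHES; i++) {
    var gr = new GlideRecord('sys_user_grmember');
    gr.addEncodedQuery('user.last_loginRELATIVELT@dayofweek@ago@60');
    gr.setLimit(BATCH); // one batch per query
    gr.query();
    if (!gr.hasNext()) {
        break; // nothing left to delete
    }
    while (gr.next()) {
        gr.deleteRecord(); // delete each membership individually
    }
}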

Thanks,
David

Hi David,

Will that only process 1000 records and then finish, or will it go on to delete five batches of 1000 records (following the example of a total of 5000 CIs)?

Thanks