How to delete more than 200,000 duplicate group members at a time
07-23-2025 06:20 AM - edited 07-23-2025 06:25 AM
I want to delete more than 200,000 duplicate group members across many groups at a time, using a background script without setLimit.
Is that possible...?
Thanks in advance
Nani
07-23-2025 07:21 AM
Hi @Nani18,
This might give you an idea.
If my response proves useful, please indicate its helpfulness by selecting "Accept as Solution" and "Helpful." This action benefits both the community and me.
Regards
Dr. Atul G. - Learn N Grow Together
ServiceNow Techno - Functional Trainer
LinkedIn: https://www.linkedin.com/in/dratulgrover
YouTube: https://www.youtube.com/@LearnNGrowTogetherwithAtulG
Topmate: https://topmate.io/atul_grover_lng [ Connect for 1-1 Session]
07-23-2025 06:37 AM
@Nani18 100% agree with Atul ☝️
Instead of deleting, consider archiving the data some time (a year or two) after deactivation.
/* If my response wasn’t a total disaster ↙️ ⭐ drop a Kudos or Accept as Solution ✅ ↘️ Cheers! */
07-23-2025 06:35 AM
Hi @Nani18
for this purpose, use Update jobs:
It is a no-code method with a rollback option.
- Select a table,
- give it conditions, and you will see how many records match that condition,
- select which fields shall be updated - set Active to false.
- It is better to deactivate, as that user can be referenced in other records (incidents, KB articles, etc.).
I don't recommend updating such a large number of records at once; do it in batches (thousands at a time, executed one after another) in the early morning or late at night.
I recommend unchecking these two options:
- Auto updating system fields
  - if true: it updates the "updated on" and "updated by" fields, which can hold useful detail; every updated record would then carry the execution time and "system" as the updater,
  - if false: the job runs but leaves these two fields untouched - recommended.
- Run BR and engines
  - unchecking it is the equivalent of setWorkflow(false);
  - unless necessary, I recommend unchecking this as well,
  - as user deactivation can trigger notifications or other recursive actions.
Alternatively, you can do it via a fix script, which also has a rollback option, or via a scheduled job.
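For illustration, here is a minimal fix-script sketch of the batched approach described above; the table, query, and batch size are placeholders and would need to be adapted to the actual clean-up condition.

```javascript
// Minimal sketch only - placeholder table, query and batch size; adapt before use.
var BATCH_SIZE = 1000; // records per run; re-run until nothing matches

var gr = new GlideRecord('sys_user');   // placeholder table
gr.addActiveQuery();                    // placeholder query - add your duplicate/clean-up condition here
gr.setLimit(BATCH_SIZE);                // process one batch per execution
gr.query();

var processed = 0;
while (gr.next()) {
    gr.setValue('active', false);
    gr.setWorkflow(false);   // same effect as unchecking "Run BR and engines"
    gr.autoSysFields(false); // same effect as unchecking "Auto updating system fields"
    gr.update();
    processed++;
}
gs.info('Deactivated ' + processed + ' record(s) in this batch');
```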
Let me know what you think about the Update jobs.
/* If my response wasn’t a total disaster ↙️ ⭐ drop a Kudos or Accept as Solution ✅ ↘️ Cheers! */
07-23-2025 08:46 AM
Check this blog to see how to find duplicates based on two columns (User + Group) in sys_user_grmember, and then validate and delete them.
I wrote this blog post 6 years ago:
Search for Duplicates & Delete Based on 2 Columns
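For reference, a rough background-script sketch of that two-column duplicate check on sys_user_grmember (not the blog's exact code): it groups memberships by user and group, keeps the oldest record of each pair, and deletes the rest. Test it on a sub-production instance first, and consider running it in chunks rather than over all 200,000+ records in one go.

```javascript
// Rough sketch - find user+group pairs with more than one membership and delete the extras.
var ga = new GlideAggregate('sys_user_grmember');
ga.addAggregate('COUNT');
ga.groupBy('user');
ga.groupBy('group');
ga.query();

var deleted = 0;
while (ga.next()) {
    if (parseInt(ga.getAggregate('COUNT'), 10) > 1) {
        var dup = new GlideRecord('sys_user_grmember');
        dup.addQuery('user', ga.getValue('user'));
        dup.addQuery('group', ga.getValue('group'));
        dup.orderBy('sys_created_on');
        dup.query();
        dup.next(); // keep the oldest membership
        while (dup.next()) {
            dup.deleteRecord();
            deleted++;
        }
    }
}
gs.info('Deleted ' + deleted + ' duplicate group membership record(s)');
```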
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
07-23-2025 11:36 PM
Hope you are doing well.
Did my reply answer your question?
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader