Query related to Fix script
10-21-2023 06:31 AM
Hi All,
I have a requirement to scan a table and create records in a new table. The record count is huge, close to 10 lakh (1 million).
I created a fix script and tested it on 500 records using setLimit(); the logic works fine. But when I ran it without setLimit() to update all the records, it failed after one or two hours because memory was fully depleted.
Can someone help me run this fix script for 10 lakh records?
1. Is it possible to run the script in one go?
2. If we run it in chunks, will it pick up the next records? e.g. if I update 500 records, will the next run start from record 501?
Thanks,
Tara Singh
10-21-2023 06:34 AM
Hello @Tarasingh26
Please refer to the link below for solving your query:
Please mark my solution as Accepted if you find it helpful.
Regards,
Samaksh

10-21-2023 06:35 AM
Hi,
10 lakh is a big number. I am unsure why you need to update so many records. However, my suggestion is to apply setLimit() for maybe 50k records at a time.
You can instead convert it to a scheduled job that updates 50k records and then repeats itself automatically every 30 minutes or so.
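A minimal sketch of such a scheduled script, runnable only inside a ServiceNow instance. The table names `u_source_table` and `u_target_table`, the field mapping, and the boolean flag `u_migrated` (which marks already-copied rows so each run resumes where the last one stopped) are all hypothetical placeholders:

```
// Scheduled Script Execution sketch: process one batch per run.
// u_source_table, u_target_table and the u_migrated flag are
// hypothetical names; substitute your real tables and fields.
var BATCH_SIZE = 50000;

var src = new GlideRecord('u_source_table');
src.addQuery('u_migrated', false);  // only rows not yet copied
src.setLimit(BATCH_SIZE);
src.query();

while (src.next()) {
    var tgt = new GlideRecord('u_target_table');
    tgt.initialize();
    tgt.setValue('u_name', src.getValue('name')); // map fields as needed
    tgt.insert();

    src.setValue('u_migrated', true); // mark done so reruns skip this row
    src.setWorkflow(false);           // optional: skip business rules for speed
    src.update();
}
```

Because each run selects only `u_migrated = false`, the job is naturally resumable: a rerun never touches rows that were already copied, and the migration is finished when a run finds no matching rows.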

10-21-2023 06:46 AM
@Tarasingh26 Here is a nice article on how batch processing can be done on such a huge number of records: https://snprotips.com/blog/2018/10/11/how-to-do-massive-slow-database-operations-without-slowing-dow...
Hope this helps.
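The batching pattern behind that approach boils down to a cursor that remembers where the last chunk ended, which also answers the "will the next run start from 501?" question. A plain-JavaScript sketch of just that resume logic (the record array and batch size are made up for illustration; in ServiceNow the driver loop would be a scheduled job re-triggering itself instead):

```javascript
// Sketch of cursor-based batching: each call processes one chunk and
// returns the new cursor, so the next call resumes exactly where the
// previous one stopped (record 501 after a 500-record batch).
function processChunk(records, cursor, batchSize, handler) {
    var end = Math.min(cursor + batchSize, records.length);
    for (var i = cursor; i < end; i++) {
        handler(records[i]); // per-record work goes here
    }
    return end; // equals records.length when everything is done
}

// Drive the batches to completion; returns the total processed count.
function processAll(records, batchSize, handler) {
    var cursor = 0;
    while (cursor < records.length) {
        cursor = processChunk(records, cursor, batchSize, handler);
    }
    return cursor;
}
```

The key design point is that the cursor (here a plain index; in ServiceNow typically a flag field or a stored sys_id) lives outside the batch, so an interrupted run loses at most one chunk of work.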
10-21-2023 06:51 AM
Hi @Tarasingh26 ,
Not sure what the requirement is that makes you scan this many records, but coming to the solution:
Executing a script for this huge number of records in a single run would not be a good idea.
Try to run it in batches by changing the filter, e.g. created between 1 Jan 2023 and 31 March 2023, then the next filter from 1st April onwards, and so on.
Make sure you perform this activity during off-business hours, so that any impact, such as instance slowness, will not be noticeable to others.
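One way to generate those created-date windows mechanically instead of hand-editing the filter before each run. This is a plain-JavaScript sketch; the 90-day window size and the date range are just examples:

```javascript
// Sketch: split a date range into fixed-size windows, each of which
// can be plugged into a "created between X and Y" query filter,
// one batch run per window.
function dateWindows(startISO, endISO, windowDays) {
    var DAY = 24 * 60 * 60 * 1000;
    var windows = [];
    var from = new Date(startISO).getTime();
    var end = new Date(endISO).getTime();
    while (from < end) {
        var to = Math.min(from + windowDays * DAY, end);
        windows.push([new Date(from).toISOString().slice(0, 10),
                      new Date(to).toISOString().slice(0, 10)]);
        from = to; // next window starts where this one ended
    }
    return windows;
}
```

For example, `dateWindows('2023-01-01', '2023-12-31', 90)` yields five windows covering the year, the first being 2023-01-01 to 2023-04-01. Adjacent windows share a boundary date, so use a half-open filter (created on or after the start, before the end) to avoid processing boundary records twice.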
Thanks,
Danish