‎11-08-2016 10:46 AM
Hi all.
Is there a better or more recommended way of dealing with large queries than plain GlideRecord searches? I wrote a custom script to query a table of tasks. After the initial run it will only deal with small numbers, but because I'm retroactively updating the records that were already created, the initial run has to go through basically every record in the table. Is there a recommended practice for working through large queries like this? Do I just have to break it up and manually run it multiple times on each piece until I get all the old records? Any suggestions are appreciated.
Thanks.
‎11-08-2016 11:05 AM
You could use setLimit to break it up, if you want. You could put it in a scheduled job that runs every night for a certain number of nights to reduce load on the system. You could add more specific conditions to shrink the result set. You could engage HI support to see if they are willing to run the update at the database level, which would be much quicker. There are options; it just depends on how long you're willing to wait.
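The setLimit-plus-scheduled-job approach above might look something like this server-side sketch. It only runs inside a ServiceNow background script or scheduled job; the table name and the `u_migrated` flag field are placeholders, not anything from the original thread:

```javascript
// Process at most BATCH_SIZE records per scheduled run, using a marker
// field (u_migrated, a hypothetical flag) so later runs skip records
// that earlier runs already handled.
var BATCH_SIZE = 1000;

var gr = new GlideRecord('task');      // placeholder table
gr.addQuery('u_migrated', false);      // hypothetical "not yet processed" flag
gr.setLimit(BATCH_SIZE);
gr.query();

while (gr.next()) {
    // ... perform the retroactive update here ...
    gr.u_migrated = true;              // mark done so the next run skips it
    gr.setWorkflow(false);             // optional: skip business rules/notifications
    gr.autoSysFields(false);           // optional: leave sys_updated_on etc. untouched
    gr.update();
}
```

Scheduling that script nightly chips away at the backlog a batch at a time instead of holding one long-running query open against the whole table.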
‎11-29-2016 08:38 AM
I just set up a scheduled job to run overnight. Actually, I set up several jobs, each with its own set of queries. It met my needs. When I asked the question I was curious whether there was some magical way of doing it I wasn't aware of, and I'm now confident the answer is no, not really. Just run it outside of working hours, so that when it slows everything down no one is around to notice.
‎11-29-2016 09:05 AM
I have seen this handled in a couple of ways. Option #1 is what I would recommend if possible.
1. Can you use the REST API to return a list of sys_ids for the records you are trying to query, then iterate through them using GlideRecord get() to fetch only those records and make whatever updates are needed?
2. Can you store your GlideRecord query results as sys_ids in a JS associative array? I have seen these used mainly in conjunction with transform jobs, but I could see them being useful in other places as well.
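Option #2 above can be sketched in ordinary JavaScript. In a real ServiceNow script the sys_ids would come from a GlideRecord query loop; literal values stand in for them here:

```javascript
// Build an associative array (a plain object) keyed by sys_id, so that
// membership checks during a later pass are constant-time lookups
// instead of a per-record query.
function buildSysIdSet(sysIds) {
    var set = {};
    for (var i = 0; i < sysIds.length; i++) {
        set[sysIds[i]] = true;
    }
    return set;
}

// Hypothetical sys_id values for illustration.
var seen = buildSysIdSet(['a1b2c3', 'd4e5f6']);

console.log(seen['a1b2c3'] === true);   // → true: record already captured
console.log('zzz' in seen);             // → false: record not in the set
```

The trade-off is memory: the whole sys_id list lives in the script's scope, which is fine for thousands of records but worth reconsidering for millions.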
-JB
‎11-29-2016 09:28 AM
Using get wouldn't really change anything, because the first run has to populate the new fields on every single record. It may be more effective for successive runs, where only a small portion of the records need to be updated; I haven't tried that. However, after the first run that changes every record, the following runs aren't nearly as slow anyway.
It was mostly the first run on the system that caused concern (such as when it was moved from test to production).
The initial run may be 5,000 records; following runs are only 200-400. Even so, breaking it into smaller bits and running it overnight works just fine for me in this instance.
But you never know. Your response may help someone else down the line.