Query a date range in weeks
08-14-2012 03:53 PM
I have a table with years' worth of data that I want to summarize into another table. I'm trying to find a way to run a scheduled job that loops through each of the past 51 weeks and runs a query against the data created in that week.
Here's what I've tried (which sadly doesn't work):
for (var weekCount = 1; weekCount < 52; weekCount++) {
    variable = retrieveCount(weekCount);
}

function retrieveCount(weekCount) {
    var tuc = new GlideRecord('table');
    tuc.addQuery('sys_created_on', '>=', gs.weeksAgoStart(weekCount));
    tuc.addQuery('sys_created_on', '<', gs.weeksAgoEnd(weekCount));
    tuc.query();
    while (tuc.next()) {
        //do stuff
    }
}
Does anyone have any idea what I could use instead of weeksAgoStart / weeksAgoEnd to cycle through 52 weeks?
Thanks!

08-20-2012 09:57 AM
I wouldn't say that the way you are attempting to do this is necessarily inefficient or dumb at all. However, there is always more than one way to do something.
I am curious as to what the array problems are, though. If your script is good then you shouldn't find any issues with having a large array besides a potentially slow server (especially if it starts to cache to disk). I'm happy to have a look at your script if you want - by all means PM me. I can't imagine you have a huge amount of data in your array though, surely it will look something like this: array(array('user', 'table', 'week_start', 'inserts', 'updates') [,...])? If you estimate the actual memory size of this, it really is not very big if you have 1 'record' per user.
Personally, I would focus on getting a function (or function set) together that will correctly parse one week's worth of data. Looping through the weeks for the last year is secondary and really a trivial modification to that.
I can see why your next thought is to process by each user, but I don't think that should be necessary.
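To illustrate what I mean, here is a rough, untested sketch. As far as I know weeksAgoStart() / weeksAgoEnd() are not GlideSystem methods, but the documented daysAgoStart() helper can stand in for them; the table name and the work inside the loop are placeholders carried over from your snippet:

// Loop over the past 51 rolling 7-day windows, newest first
for (var weekCount = 1; weekCount < 52; weekCount++) {
    retrieveCount(weekCount);
}

function retrieveCount(weekCount) {
    var tuc = new GlideRecord('table'); // placeholder table name
    // the window runs from (weekCount * 7) days ago up to 7 days later
    tuc.addQuery('sys_created_on', '>=', gs.daysAgoStart(weekCount * 7));
    tuc.addQuery('sys_created_on', '<', gs.daysAgoStart((weekCount - 1) * 7));
    tuc.query();
    while (tuc.next()) {
        // do stuff with the records created in this window
    }
}

Note that these are rolling 7-day windows counted back from today, not calendar weeks; if you need calendar weeks you would build the boundaries with GlideDateTime arithmetic instead.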
11-16-2021 06:10 AM
Hope this helps. Some tips on this are below.
Description
ServiceNow recommends exporting large data sets with millions of records in chunks, to avoid performance implications and impact on other services running on the system. The procedure below explains how to export bulk data from ServiceNow via REST web service pagination.
Procedure
By default, ServiceNow returns a maximum of 10000 records from any REST call; this happens when the sysparm_limit parameter is omitted, because its default value is 10000. If you specify a higher value in the URL, you can get back a larger number of records.
Example: /api/now/table/incident?sysparm_limit=40000
The best practice for retrieving large numbers of records is to use pagination: get subsets of the records in separate calls. For example, first you get records 1 to 10000 in one call, then 10001 to 20000, then 20001 to 30000, and so on.
To do so, you can make use of the following parameters:
- sysparm_limit: how many records to return in one call (10000 by default)
- sysparm_offset: how many records to skip from the start of the result set, i.e. where the page begins
- sysparm_query=ORDERBYsys_created_on: sorts the records by created date/time, so that the offsets refer to a stable ordering between calls
In this case, the first call would be something like
/api/now/table/incident?sysparm_limit=10000&sysparm_offset=0&sysparm_query=ORDERBYsys_created_on
the second one
/api/now/table/incident?sysparm_limit=10000&sysparm_offset=10000&sysparm_query=ORDERBYsys_created_on
the third one
/api/now/table/incident?sysparm_limit=10000&sysparm_offset=20000&sysparm_query=ORDERBYsys_created_on
...and so on. You can put that in a loop, increasing sysparm_offset on each call, until all of the records have been read.
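As a rough illustration only (this is a plain Node.js sketch, not something ServiceNow ships; the instance URL, credentials, and table name are placeholders), the pagination loop could look like this:

// Hypothetical Node.js 18+ sketch: page through the Table API until a
// short page signals that the last records have been read.
const INSTANCE = 'https://<instance>.service-now.com';
const TABLE = 'incident';
const LIMIT = 10000;
const AUTH = 'Basic ' + Buffer.from('user:password').toString('base64'); // placeholder credentials

async function exportAll() {
    const rows = [];
    for (let offset = 0; ; offset += LIMIT) {
        const url = INSTANCE + '/api/now/table/' + TABLE +
            '?sysparm_limit=' + LIMIT +
            '&sysparm_offset=' + offset +
            '&sysparm_query=ORDERBYsys_created_on';
        const resp = await fetch(url, {
            headers: { 'Accept': 'application/json', 'Authorization': AUTH }
        });
        const page = (await resp.json()).result;
        rows.push(...page);
        if (page.length < LIMIT) break; // last (partial or empty) page reached
    }
    return rows;
}

exportAll().then(function (rows) {
    console.log('Exported ' + rows.length + ' records');
});

The loop stops as soon as a page comes back with fewer rows than sysparm_limit; the ORDERBYsys_created_on sort is what keeps the offsets pointing at a stable ordering between calls.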