‎07-10-2014 04:17 PM
I'm running into performance issues using GlideRecord and GlideAggregate to perform the equivalent of a relatively simple SQL query. Specifically, I'm trying to determine which reports haven't been run in a certain amount of time (such as a year). My GlideAggregate solution returns the required data, but scanning through an entire year's worth of report views takes about 30 minutes. When I use the gs.sql() command to perform a simple table join and filter, it takes 13 seconds. Unfortunately, I haven't found a way to actually return any data from the SQL query, since it just prints the result to the Background Script screen. I'm not getting the performance I need out of the GlideAggregate object because it doesn't take full advantage of the database, instead offloading a lot of the work onto my JavaScript code. Any suggestions would be appreciated! I'd prefer to go the SQL route if there's some way to actually get data out of it.
GlideAggregate Code (this returns all reports which have been used within x days):
function usedReports(days) {
    // Returns the sys_ids of all reports viewed within the last `days` days.
    var used = [];
    var lastRun = new GlideAggregate('report_view');
    lastRun.addAggregate('MAX', 'viewed');
    lastRun.groupBy('report');
    lastRun.addQuery('viewed', '>', gs.daysAgo(days));
    lastRun.query();
    while (lastRun.next()) {
        // getValue() returns the grouped reference value (the report sys_id)
        // as a string; pushing lastRun.report.sys_id would not, since
        // GlideAggregate does not support dot-walking on the result.
        used.push(lastRun.getValue('report'));
    }
    return used;
}
SQL Code (this shows all reports which haven't been run within x days):
function unusedReports(days) {
    // Note: gs.sql() only prints its result set to the Background Scripts
    // output; it does not return data to the calling script.
    var date = gs.daysAgo(days);
    gs.sql("SELECT A.title " +
           "FROM sys_report A " +
           "LEFT JOIN (SELECT * FROM report_view WHERE viewed > '" + date + "') B " +
           "ON A.sys_id = B.report " +
           "WHERE B.viewed IS NULL " +
           "ORDER BY A.title;");
}
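For comparison, the same "unused reports" result can be produced without SQL by combining the usedReports() list above with a query against sys_report, though it still pays the cost of the initial aggregate scan. A minimal sketch (it only runs inside ServiceNow, and assumes usedReports() from above is in scope):

```javascript
function unusedReports(days) {
    // sys_ids of reports viewed within the window (from usedReports() above).
    var used = usedReports(days);
    var unused = [];
    var report = new GlideRecord('sys_report');
    // Exclude every report whose sys_id appears in the "used" list.
    report.addQuery('sys_id', 'NOT IN', used.join(','));
    report.orderBy('title');
    report.query();
    while (report.next()) {
        unused.push(report.getValue('title'));
    }
    return unused;
}
```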
‎07-10-2014 04:34 PM
Have you considered putting a Business Rule on the report_view table? That way, every time a new view record comes in, you can store the most recent view time either in a custom field on the Reports table or in a custom table. This caches your results and makes your query near-instantaneous by offloading the work to the business rule.
Edit: Another alternative is a scheduled job, run daily, that runs smaller incremental queries (reports viewed yesterday) to update the field or cache table. This could be a nice balance between slamming the db with updates via business rule and a 30-minute query on a massive table.
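The business-rule approach could be sketched as an after-insert rule on report_view. This is a sketch, not a tested implementation: the u_last_viewed field on sys_report is a hypothetical custom field you would need to create, and the code only runs inside ServiceNow:

```javascript
// Business Rule on report_view — When: after, Insert: true
// Assumes a custom Date/Time field u_last_viewed on sys_report (hypothetical name).
(function executeRule(current, previous) {
    var report = new GlideRecord('sys_report');
    if (report.get(current.getValue('report'))) {
        // Cache the latest view time directly on the report record.
        report.setValue('u_last_viewed', current.getValue('viewed'));
        report.setWorkflow(false);   // don't trigger other business rules
        report.autoSysFields(false); // don't touch sys_updated_* fields
        report.update();
    }
})(current, previous);
```

With the field populated, "unused in the last year" becomes a single indexed filter on sys_report (u_last_viewed older than gs.daysAgo(365), or empty) instead of a 30-minute scan of report_view.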
‎07-10-2014 04:41 PM
That could work, with a bit of pre-processing to deal with all of the pre-existing report_view data (a table which contains nearly 30 million records). I could probably toss that into a script and run it as a one-time thing. I figure this'll be my best option since it doesn't rely on any weird workarounds. I'll let you know how it goes!
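The one-time backfill could reuse the GlideAggregate from the original post to seed the cache field from the existing report_view data. A sketch for a background script, assuming the same hypothetical u_last_viewed field; with nearly 30 million source rows it may need to be batched or run off-hours:

```javascript
// One-time backfill: copy MAX(viewed) per report onto sys_report.u_last_viewed.
// Run as a background script; u_last_viewed is a hypothetical custom field.
var agg = new GlideAggregate('report_view');
agg.addAggregate('MAX', 'viewed');
agg.groupBy('report');
agg.query();
while (agg.next()) {
    var report = new GlideRecord('sys_report');
    if (report.get(agg.getValue('report'))) {
        report.setValue('u_last_viewed', agg.getAggregate('MAX', 'viewed'));
        report.setWorkflow(false);   // skip business rules during the backfill
        report.autoSysFields(false); // preserve existing audit fields
        report.update();
    }
}
```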