setting journal field by script: activity records are batched and out of order

half_baked1
Kilo Contributor

I'm in the process of importing data from another ticketing system into SNC. The data is flat (Excel) and small in scope.

The comments in the source system are relational, so I'm querying the comments by parent ticket and sorting by date. When I gs.log the comments, they're all in order. On each iteration, I also set the journal field (incident.work_notes = comment) and then update the incident record.

The problem I'm seeing is that when the activity records are created, my updates get batched together, with 10 or so inserted comments per activity record. And within each activity record, my comments are not in order (see screenshot attached).

Here is the snippet where I set work_notes. gr is a GlideRecord of the comments I'm importing; cr is the incident record I'm updating. The records in gr are ordered by date, and the gs.log entries come out in the correct order, but the comments end up batched and out of order in the activity log.


while (gr.next()) {
    // Build the formatted comment string
    var c = '';
    c += '[Import PRIVATE: ';
    c += gr.u_createdby_name + ', ';
    c += gr.u_createddate + '] ';
    c += gr.u_commentbody;

    gs.log(c);

    // Now set the work_notes with the formatted string and save
    cr.work_notes = c;
    cr.update();
}


In this case, I do not need the activity timestamp to match the imported comment timestamp; I just need the comments added to work_notes so they show up in order. From my debug logs, I'm looping through the records in the correct order, but that order is not reflected in the activity log. It's almost as if journal field updates get queued up somewhere and processed out of order.

Has anyone else experienced this and possibly have a workaround?

I'd rather not do detailed updates on the underlying journal and history records, since that's more work than is appropriate for a one-time load.

(Note: in the screenshot below, I'm inserting the imported timestamp as text.)

4 REPLIES

aray
Giga Contributor

Haven't had this exact experience, but one workaround might be to build up a temporary variable and then do a single write at the end.

We do something like this in an application we built, and we use HTML to format things nicely and make the source record hyperlinkable.

var preface = '<i>Applied Rule: </i>';
var ruletext = '(' + rule.u_order + ') ' + rule.u_rule_type + ':' + rule.u_name;
var rulelink = 'u_event_rules.do?sys_id=' + rule.sys_id;
// must use a temporary history variable, so we only update current.u_history one time, at the end of applyrules
this.temp_history += preface + '<a href="' + rulelink + '">' + ruletext + '</a>\n';


May or may not work for your purpose.
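Adapted to your loop, a rough sketch (untested; reusing your gr and cr variables) might look like:

var buf = '';
while (gr.next()) {
    buf += '[Import PRIVATE: ' + gr.u_createdby_name + ', ';
    buf += gr.u_createddate + '] ' + gr.u_commentbody + '\n';
}
// A single journal write produces a single activity entry
cr.work_notes = buf;
cr.update();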


half_baked1
Kilo Contributor

Yes, that was my first choice (concatenate multiple comments and insert once). However, some of the [older] tickets we are importing have a high number of comments. My concern was hitting the upper field size limit, but also having records batched within activity entries, making them hard to read. I also think I would have problems interleaving comments and work notes: say I batch together three comments at 1p, 2p, and 3p, but then have a batch of work notes at 1:30p, 2:30p, and 3:30p. In that case, anyone reviewing the ticket would have to jump back and forth between two activity records to reconstruct the timeline of events.

What I *think* is happening is that I'm doing 1-10 updates per second to a journal field (updating the incident each time). There must be some async process that builds the activity listing and groups entries by time; since my updates are sub-second, I'm getting 1-10 comments or work_notes per activity entry.

What I did for now is call gs.sleep(1000) to delay one second between each journal field update. It's not the prettiest solution, and it means I'm only importing 3600 comments/work_notes per hour as part of the one-time import, but it works: comments and work_notes land as individual activity records and interleave properly.
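For reference, the adjusted loop looks roughly like this (same gr and cr as above; gs.sleep takes milliseconds):

while (gr.next()) {
    var c = '[Import PRIVATE: ' + gr.u_createdby_name + ', ';
    c += gr.u_createddate + '] ' + gr.u_commentbody;
    cr.work_notes = c;
    cr.update();
    gs.sleep(1000); // pause 1 second so each update lands in its own activity entry
}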

I'll expect a ticket from the locked semaphore police any time now... 🙂


I'm having the exact same issue: inserting into both the journal table and sys_audit. I added a 5-second delay, but they are still appearing completely jumbled.

Did you ever find a solution?
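In case it helps to compare notes, the direct inserts I'm describing look roughly like this (a sketch only; I'm assuming the journal table is sys_journal_field with the standard column names, and incidentSysId / commentText are placeholders for your own values):

// Sketch: direct insert into the journal table (assumed columns)
var j = new GlideRecord('sys_journal_field');
j.initialize();
j.name = 'incident';           // table the journal entry belongs to
j.element = 'work_notes';      // journal field name
j.element_id = incidentSysId;  // sys_id of the target incident (placeholder)
j.value = commentText;         // imported comment text (placeholder)
j.insert();

// Sketch: matching audit row (assumed columns)
var a = new GlideRecord('sys_audit');
a.initialize();
a.tablename = 'incident';
a.documentkey = incidentSysId;
a.fieldname = 'work_notes';
a.newvalue = commentText;
a.insert();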


If you don't mind using C# and web services, here is one way to do it:

How to import legacy incidents, comments and worknotes via c# & web services

Cheers,

- James