Event Processing
05-14-2012 03:10 PM
When we fire events with gs.eventQueue(), is there a way to remove duplicates? For example, if we have an event that fires every time a work note is added to an Incident, and 20 work notes are added within a few seconds, the event would fire 20 times. Is there a way to set up the event queue so that only the last event is processed and the first 19 are ignored/deleted? This would be consistent with the logic on the email table that prevents duplicate emails from being sent.
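For context, here is a minimal sketch of the kind of business rule that produces this burst of events; the event name 'incident.work_note.added' and the parameters are assumptions for illustration, not an out-of-box event:

```javascript
// onAfter update business rule on Incident (hypothetical), condition: work notes changed.
// Every saved work note queues one event, so 20 rapid updates queue 20 events.
(function executeRule(current, previous /*null when async*/) {
    // 'incident.work_note.added' is a made-up event name used only for this example
    gs.eventQueue('incident.work_note.added', current, gs.getUserID(), gs.getUserName());
})(current, previous);
```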

06-12-2014 02:25 PM
Err - my first thought would be a scheduled job that builds a GlideRecord query for the repeated events, then loops through and deletes the duplicates from the event table.
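Something along these lines could serve as that scheduled cleanup; the event name is a placeholder and it ignores the race with the event processor, so treat it as a rough sketch rather than a tested job:

```javascript
// Scheduled script (sketch): remove duplicate unprocessed events,
// keeping only the most recently created one per event name + source record.
var dup = new GlideAggregate('sysevent');
dup.addQuery('name', 'incident.work_note.added'); // placeholder event name
dup.addQuery('state', 'ready');                   // only events not yet picked up
dup.addAggregate('COUNT');
dup.groupBy('name');
dup.groupBy('instance');                          // instance = sys_id of the source record
dup.addHaving('COUNT', '>', 1);
dup.query();
while (dup.next()) {
    var ev = new GlideRecord('sysevent');
    ev.addQuery('name', dup.getValue('name'));
    ev.addQuery('instance', dup.getValue('instance'));
    ev.addQuery('state', 'ready');
    ev.orderByDesc('sys_created_on');
    ev.query();
    if (ev.next()) {            // keep the newest event
        while (ev.next()) {
            ev.deleteRecord();  // delete the older duplicates
        }
    }
}
```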
That said, I would think there has to be a more elegant solution: possibly having the business rule or script that puts the event in the queue check first (again via GlideRecord) for similar events already in the event queue table, based on day and incident number?
Looking at the Wiki, though, it seems like this is something that will be addressed in the near future with Event Management...
Event Management - ServiceNow Wiki
10-15-2017 04:40 PM
An onBefore business rule on the event table (sysevent) which enforces only one pending or processing event (the ready or processing states) will do the trick.
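A hedged sketch of what that rule might look like; the matching criteria (whether to compare parm1/parm2 as well) are an assumption and would depend on the event:

```javascript
// onBefore insert business rule on sysevent (sketch).
// Aborts the insert if the same event for the same record is already queued or processing.
(function executeRule(current, previous /*null when async*/) {
    var existing = new GlideRecord('sysevent');
    existing.addQuery('name', current.getValue('name'));
    existing.addQuery('instance', current.getValue('instance')); // same source record
    existing.addQuery('state', 'IN', 'ready,processing');
    existing.setLimit(1);
    existing.query();
    if (existing.hasNext()) {
        current.setAbortAction(true); // skip queuing the duplicate event
    }
})(current, previous);
```

Note that this keeps the first queued event and drops later ones; if only the last event should survive, the newer event would instead have to replace the older one.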
Thanks,
Berny
10-16-2017 12:20 AM
Why would the event be considered a duplicate? Are you adding the same work note 20 times? If not, then the events are not duplicates, even if they look like duplicates.