Duplicate records in parallel Rest API
12-29-2022 02:26 AM
Hi
I have a Scripted REST API that creates task records from other applications.
We have a unique key (the field on the task table is marked unique = true).
When several API requests arrive in parallel with the same unique key, more than one record gets created with that key.
We have a validation that checks whether the unique key already exists, but because the requests run in parallel, this validation doesn't work.
I'd be happy to find a new way to solve this issue.
Thanks
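The failure mode described above can be modelled outside ServiceNow: "check then insert" is not atomic, so two parallel requests can both pass the existence check before either insert commits. A minimal sketch in plain JavaScript (this is not Glide code; the `records` array and the artificial delay are stand-ins for the task table and database latency):

```javascript
// Simplified model of the race: the duplicate check and the insert
// are separate steps, so parallel requests can interleave between them.
const records = [];

async function createTask(uniqueKey) {
  const exists = records.some(r => r.key === uniqueKey); // validation step
  await new Promise(resolve => setTimeout(resolve, 10)); // latency window
  if (!exists) {
    records.push({ key: uniqueKey });                    // insert step
    return true;
  }
  return false; // rejected as a duplicate
}

async function main() {
  // Two parallel requests with the same key: both pass the check
  // before either inserts, so both rows are created.
  await Promise.all([createTask('KEY-1'), createTask('KEY-1')]);
  console.log(records.length); // 2, not 1
}

main();
```

This is why a script-level validation alone cannot close the window; the deduplication has to happen at a layer that serializes the writes, such as a coalescing transform map.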
12-30-2022 03:27 AM
Consider using the Import Set API instead of a Scripted REST API. You can load the data into an import (staging) table and have an associated transform map (or Robust Transformer) take care of loading it into your target table. The transform map can use your unique key as the coalesce field and simply ignore duplicates.
Check this great post by @Kieran Anson for further details and examples.
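From the calling application's side, the switch mostly means posting to the Import Set API endpoint instead of the custom one. A hedged sketch of such a call from Node.js (the instance URL, staging table name `u_task_import`, and field names are placeholders, not values from this thread):

```javascript
// Hypothetical sketch: posting one record to a ServiceNow Import Set API
// staging table. Instance, table, and field names are assumptions.
const INSTANCE = process.env.SN_INSTANCE || 'https://example.service-now.com';
const STAGING_TABLE = 'u_task_import'; // assumed staging table name

function buildImportRequest(record) {
  return {
    url: `${INSTANCE}/api/now/import/${STAGING_TABLE}`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Accept: 'application/json',
        // Authorization header omitted; use Basic auth or OAuth as appropriate.
      },
      body: JSON.stringify(record),
    },
  };
}

const req = buildImportRequest({
  u_unique_key: 'KEY-1',
  u_short_description: 'Example task',
});
console.log(req.url);
// To actually send: const res = await fetch(req.url, req.options);
```

Because the transform map coalesces on the unique key, parallel posts with the same key end up matching the same target record instead of creating duplicates, and the Import Set API response reports what happened to each transformed row.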
12-31-2022 11:00 PM
Thank you for the answer. I'm not sure that an import set will be effective here.
I need to send the response with the new incident's sys_id.
01-01-2023 05:34 AM
You can actually use an onComplete Transform Script to add any value you want to the response object, including the sys_id of the newly created record. You could also add an onBefore transform script that returns a custom error message when there is already a match on your unique key and therefore nothing is inserted. Check the Custom Response Message section of Kieran's post shared above for some examples.
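To illustrate the shape of that idea: in ServiceNow web service import sets, transform scripts receive platform objects (`source`, `target`, `response`, and so on) that are only available inside the instance. The sketch below mocks those objects so it can run anywhere; the exact response property names are assumptions for illustration, not verified platform API, so refer to Kieran's post for working code.

```javascript
// Illustrative sketch only: mimics an onComplete transform script that
// echoes the target record's sys_id back in the web service response.
// In ServiceNow, `target` and `response` are supplied by the platform;
// here they are plain mocked objects.
function addSysIdToResponse(target, response) {
  response.sys_id = String(target.sys_id);     // expose the record's sys_id
  response.status_message = 'Record processed'; // assumed property name
}

// Mocked platform objects so the shape can be demonstrated outside ServiceNow:
const target = { sys_id: 'a1b2c3' };
const response = {};
addSysIdToResponse(target, response);
console.log(JSON.stringify(response));
```

The point is that the caller still gets the sys_id it needs in the response, whether the coalesce matched an existing record or a new one was inserted.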