Import set data getting deleted immediately
07-26-2023 07:12 AM
Hello,
I have a staging table "x_caci_cisco_dna_cisco_dna_inventory_det" that is part of the Cisco DNA scoped application available on the ServiceNow Store.
This staging table has a field called "u_device_details" which stores the JSON data coming from DNA.
When I post data from Postman to the staging table via web service, the transform map processes the data and pushes it into the main table.
The data on the staging table is lost immediately once the records are processed, which I don't want to happen. I need the JSON data to remain on the staging table.
The transform map has two scripts, with order 100 and 200, as attached.
I investigated on my side and came across Scheduled Script Execution > Import Set Deleter, but that script deletes the data after 7 days, not immediately.
How can I stop the data from getting deleted from the staging table?
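For reference, this is roughly what the POST from Postman amounts to. The endpoint follows the standard Import Set API; the instance name, credentials, and sample payload below are placeholders, not real values:

```javascript
// Rough equivalent of the Postman request (instance name, credentials,
// and the sample payload are placeholders).
// POSTing to the Import Set API stages one row on the staging table,
// which then triggers the transform map.
fetch("https://<instance>.service-now.com/api/now/import/x_caci_cisco_dna_cisco_dna_inventory_det", {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        "Accept": "application/json",
        "Authorization": "Basic " + btoa("user:password") // placeholder credentials
    },
    body: JSON.stringify({
        // the JSON from DNA goes into u_device_details as a string
        u_device_details: JSON.stringify({ hostname: "switch-01", serialNumber: "FXS1234" }) // sample payload
    })
})
.then(function (res) { return res.json(); })
.then(function (data) { console.log(data); });
```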
07-26-2023 07:46 AM
Hi @Amit31,
Hope you are doing well.
Ideally this should not happen unless the system is configured that way.
Could you please check whether any transform map script or business rule is deleting the staging rows? Something like the sketch below is what to look for.
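A purely illustrative sketch of the kind of script that would cause this. The table name matches your setup, but the rest is an assumption, not code taken from your instance:

```javascript
// Illustrative only: an onComplete transform script (or an "after"
// business rule on the staging table) shaped like this would wipe
// the staged rows as soon as the transform finishes.
(function runTransformScript(source, map, log, target) {
    var row = new GlideRecord('x_caci_cisco_dna_cisco_dna_inventory_det');
    row.addQuery('sys_import_set', source.getValue('sys_import_set')); // rows of the current import set
    row.deleteMultiple(); // the JSON in u_device_details is lost here
})(source, map, log, target);
```

If you find anything of that shape, disabling it should keep the data on the staging table.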
Please mark this response as correct or helpful if it assisted you with your question.
Regards,
Harshal
07-26-2023 09:25 AM - edited 07-26-2023 09:27 AM
Hello @Amit31,
To stop the data from getting deleted from the staging table in ServiceNow, you'll need to change the behavior of the 'Import Set Deleter' scheduled job. The Import Set Deleter is responsible for clearing processed data out of staging tables; by default, it retains the data for 7 days before deletion.
There are three ways to achieve this:
- To prevent the script from deleting the records at all, you can comment out the code responsible for deleting records in the scheduled job and the corresponding Script Include. The data will then never be removed from the staging table after processing. [**Harmful for instance health**]
- If you want to retain the records for longer but still need some cleanup, you can modify the script to delete only records older than a certain age, such as 30 days or more (see the sketch after this list).
- (This is just my thought; I am not sure whether it is possible.) Update the scheduled job and Script Include so that they skip your staging table x_caci_cisco_dna_cisco_dna_inventory_det and work the same for the rest (also covered in the sketch below).
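A minimal sketch of what the second and third options could look like, assuming the cleanup logic works off the sys_import_set table. The actual out-of-box Import Set Deleter script differs, so treat this as a pattern to adapt, not a drop-in replacement:

```javascript
// Sketch only: age-based cleanup that exempts one staging table.
// RETENTION_DAYS and SKIP_TABLE are illustrative names, and the
// query shape is an assumption about how the deleter might work.
var RETENTION_DAYS = 30; // option 2: keep data for 30 days instead of 7
var SKIP_TABLE = 'x_caci_cisco_dna_cisco_dna_inventory_det'; // option 3: exempt this table

var importSet = new GlideRecord('sys_import_set');
importSet.addQuery('sys_created_on', '<', gs.daysAgoStart(RETENTION_DAYS));
importSet.addQuery('table_name', '!=', SKIP_TABLE); // leave the Cisco DNA staging data alone
importSet.query();
while (importSet.next()) {
    // Delete the staged rows belonging to this import set,
    // then the import set record itself.
    var rows = new GlideRecord(importSet.getValue('table_name'));
    rows.addQuery('sys_import_set', importSet.getUniqueValue());
    rows.deleteMultiple();
    importSet.deleteRecord();
}
```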
Please note that modifying scripts in ServiceNow can have significant impacts on the system's behavior, so make sure to test thoroughly in a non-production environment before implementing changes in your production instance.
If this helped you in any way, please hit the like button or mark it helpful. Also, don't forget to accept it as a solution so it can help others find the correct answer.
Thanks,
Prasad