Seeking best practice for table clean up - wf_history

SandySeegers
Giga Expert

I'm tinkering with the upgrade to Geneva in a sub-prod instance and working through some of the useful reports they now provide when you upgrade. One of the reports is "Schema Changes to Clone Excluded Tables". One of the tables in the list is wf_history. I'm trying to minimize any "game-day" surprises, so I checked the size of every table on the list.

It turns out that wf_history has a whopping 4.2 million records! That can't be good for performance in general, and I don't want the upgrade to waste a lot of time altering that table. After more poking around, I see that most of the underlying workflow-related tables have been piling up records since day one.
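
In case it helps anyone doing the same check, a row count like that can be pulled from Scripts - Background with a GlideAggregate query along these lines (just a sketch; swap in whichever clone-excluded table you want to size up):

// Count the rows in wf_history from Scripts - Background.
// Change the table name to check any of the other clone-excluded tables.
var agg = new GlideAggregate('wf_history');
agg.addAggregate('COUNT');
agg.query();
if (agg.next()) {
    gs.print('wf_history row count: ' + agg.getAggregate('COUNT'));
}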

So I'm wondering what other folks have done to keep this beast under control. Will archiving the parent records take care of these? Or maybe a sledgehammer approach with a few table cleaner rules?

I appreciate your thoughts in advance.

Sandy

5 REPLIES

ghsrikanth
Tera Guru

You can add the table to the table cleaner list:


Introduction to Managing Data - ServiceNow Wiki


Section 3.3


https://<yourInstanceName>.service-now.com/sys_auto_flush_list.do
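
If you prefer to create the rule from a script instead of the form, something along the lines of the sketch below is roughly what it takes. The field names (tablename, matchfield, age, conditions, cascade_delete) and the 90-day retention are assumptions worth verifying against the sys_auto_flush dictionary on your own instance before running anything.

// Hedged sketch: create a table cleaner entry for wf_history.
// Verify the sys_auto_flush field names on your instance; the retention period is only an example.
var rule = new GlideRecord('sys_auto_flush');
rule.initialize();
rule.tablename = 'wf_history';       // table to clean up
rule.matchfield = 'sys_created_on';  // date field the age is measured against
rule.age = 60 * 60 * 24 * 90;        // age in seconds (~90 days)
rule.conditions = '';                // optional encoded query to narrow the deletes
rule.cascade_delete = false;         // whether referencing records are deleted too
rule.insert();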



Hopefully it helps


We currently don't have any table cleaner rules set up for any of the workflow tables.  



I understand how to use the table cleaner.   I'm just curious how people are using it for workflow tables.   Do you have any rules set up for wf_transition_history, wf_history, wf_executing or wf_log?  



Thx


We currently have table cleaner rules set up to delete workflow contexts that are 18 months old and inactive.
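
If anyone wants to size up a similar rule before enabling it, a quick count along these lines (a sketch reusing the same 18-month / inactive criteria) shows how many contexts it would touch:

// Count inactive workflow contexts older than roughly 18 months.
// Assumes "old" means last updated more than 18 months ago; adjust the date field if your rule differs.
var agg = new GlideAggregate('wf_context');
agg.addQuery('active', false);
agg.addQuery('sys_updated_on', '<', gs.monthsAgo(18));
agg.addAggregate('COUNT');
agg.query();
if (agg.next()) {
    gs.print('wf_context records matching the cleanup criteria: ' + agg.getAggregate('COUNT'));
}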


Yes, it's always better to set up the rules in the table cleaner (sys_auto_flush) and set the age after which the records will be deleted.



(Attached screenshot: Screen Shot 2016-03-29 at 8.30.08 PM.png)