Creating tasks in series such that completion of one triggers creation of another

Dennis R
Tera Guru

Hey all, I've been beating my head against the wall trying to implement this for several days, and I'm convinced it's because I'm going up against a best practice. I'm hoping I can get some thoughts on the best way to implement this.

We have a need to create a process workflow around decommissioning servers from our datacenter. The process has around 15 steps that have to be executed in a specific sequence.

For example, the process starts off with a request to our enterprise monitoring team to disable monitoring on a server that is being decommissioned. The next step is for the network team to take the server out of the load balancer. Those steps have to happen in that specific order, because if the server is taken out of the load balancer prior to the monitoring team taking it out of monitoring, they'll get alerts that transactions aren't being processed by the server that is being decommissioned.

We already have several catalog items that create requested item tickets to the respective group that needs to handle each part of the process. What I need is an overarching workflow to handle the sequence that these requested items need to be created in, along with change requests that need to be submitted as the process reaches certain phases, notifications that need to be sent out, and so on.

But I'm hitting a brick wall in how to implement this.

Our first inclination was to just create request tasks. The problem with that is that as I mentioned above, most of these steps already exist as requested items that are created by catalog items. It seems extraordinarily inefficient to me to have, for example, seven requested items submitted via the catalog item to disable monitoring, and four request tasks submitted via the decommission process. Reporting would quickly become a nightmare, and teams would undoubtedly get confused over why they're having to process these requests as two different ticket types.

Okay, so my next idea was that the server decommission process should be stored as a request record (not requested item). The request should have a workflow attached to it that drives creation of the requested items for each team as they're needed. I cobbled together a record producer to create the request. I don't want to use an order guide because, like I said above, there are around 15 different things that need to happen. I don't want a user to have to sit there and tab through 15 catalog item screens when about the only thing that's actually required is a server name, location, and IP address (which can almost always be pulled from the CI data, but I digress...). Also, I don't see any way for an order guide to implement the sequential nature of the request that we need. You can't, for example, have an order guide wait to create the request to run a full system backup until after the request to stop the services is complete.

I definitely want the tasks to be created sequentially. If I create them all up front, I run into three issues: 1) Teams will get confused because there will be requests sitting in their queue that they're not supposed to work on yet, 2) reporting becomes a nightmare because a lot of reports are based on when tasks are created, and 3) it seems a bit crazy to create all the tasks up front when, if something changes along the way and the decommission request has to be cancelled, every pre-generated task would have to be cancelled as well.

So anyway, I created a record producer and modified the out-of-the-box request workflow to kick off a sub-workflow that I had planned on creating the individual requested item tasks and managing what gets kicked off and when. But then I read this in the documentation:

Note: If a workflow contains a Create Task activity that has executed on the current record, additional Create Task activities in the workflow or in subflows cannot create new task records. To create additional tasks in this situation, add a Run Script workflow activity that creates a task with a script.
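For what it's worth, a Run Script activity along those lines might look like the sketch below, using the legacy global Cart script include to generate a requested item rather than inserting into sc_req_item directly. The catalog item sys_id, variable names, and the scratchpad key are all placeholders for illustration, not anything from the actual decomm process:

```
// Run Script activity body (server-side ServiceNow script).
// Assumes the legacy global Cart script include is available.
var cart = new Cart();

// addItem returns the cart item record for the given catalog item;
// the sys_id here is a placeholder
var item = cart.addItem('<sys_id of Disable Monitoring catalog item>');

// copy the decomm request's inputs onto the new item
// (variable names are hypothetical)
cart.setVariable(item, 'server_name', current.variables.server_name);
cart.setVariable(item, 'ip_address', current.variables.ip_address);

// placeOrder() generates the REQ/RITM pair and returns the request record
var req = cart.placeOrder();

// remember what we spawned so later activities can watch for it
workflow.scratchpad.monitoring_request = req.sys_id.toString();
```

Going through the Cart keeps the spawned items identical to ones ordered from the catalog, which matters for the reporting concern above.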

@#$*!, seriously?

Okay, so I create some Run Script activities to create the request items. Problem is, unlike the Create Task activity, I can't get the things to run serially doing it this way. Once the task is created by the script, it just keeps blowing right on through the workflow.

I've tried a couple of options to implement waiting until one task completes before another is created. My first thought was to insert a Wait for WF Event activity, and create a business rule on the requested item table to post an "I'm done!" event back to the workflow. That works, but the problem is that if I have two requested items running simultaneously (such as a request item to disable a network switch port and a request item to deprovision storage that aren't dependent on each other), how do I know which task completed? The Wait for WF Event activity doesn't take any parameters, so there's no way to differentiate one "I'm done!" message from another without creating a unique event message for each step in the process. Of course, then you run into issues where the business rule that fires when a requested item is closed has to determine which specific message should be sent, probably in a giant switch or if-else-else-... statement, which is less than ideal.
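One way to get per-step event names without the giant switch is to derive the event name from the item itself, so a single business rule covers every step. The sketch below assumes each Wait for WF Event activity is configured with an event name matching its catalog item (e.g. "decomm.disable_monitoring"), and that there's some link back from the spawned item to the decomm request; the field used for that link here is hypothetical:

```
// After-update business rule on sc_req_item
// Condition: state changes to Closed Complete
(function executeRule(current, previous) {
    // one event name per step, computed from the catalog item,
    // e.g. "decomm.disable_monitoring"
    var eventName = 'decomm.' + current.cat_item.name.toString()
        .toLowerCase().replace(/\s+/g, '_');

    // find the workflow context(s) running on the decomm request;
    // current.request.parent is a hypothetical link back to it
    var ctx = new GlideRecord('wf_context');
    ctx.addQuery('id', current.request.parent);
    ctx.query();
    while (ctx.next()) {
        // Workflow.broadcastEvent fires the named event into the context,
        // releasing any Wait for WF Event activity listening for that name
        new Workflow().broadcastEvent(ctx.sys_id.toString(), eventName);
    }
})(current, previous);
```

Since the event name carries the "which step finished" information, the same rule serves parallel steps without any per-step branching.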

So then I came up with the idea of storing the running tasks in a variable associated with the workflow. When a task completes, the business rule would check all running contexts for a workflow variable containing that requested item, and remove it. Then maybe I could use that in a Wait for Condition activity to allow the workflow to proceed. (Maybe? Frankly, I don't even know if that would work, whether the condition would be re-checked when a workflow variable changes, but I'm grasping at straws here.) But then that idea came to a halt because I can't seem to access workflow variables from business rules, and I can't even update them within the....
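A variant of that idea that sidesteps workflow variables entirely is to track the outstanding items in a field on the decomm request itself; Wait for Condition activities re-evaluate when the record the workflow is attached to is updated, so updating the request wakes the workflow up. The field name and the reference back to the decomm request below are hypothetical, just to show the shape:

```
// After-update business rule on sc_req_item (state -> Closed Complete).
// Assumes a custom field u_open_decomm_items on sc_request holding a
// comma-separated list of outstanding RITM sys_ids.
(function executeRule(current, previous) {
    var req = new GlideRecord('sc_request');
    // u_decomm_request is a hypothetical reference back to the decomm request
    if (!req.get(current.u_decomm_request)) {
        return;
    }

    // drop this item from the outstanding list
    var open = (req.u_open_decomm_items + '').split(',');
    var idx = open.indexOf(current.sys_id.toString());
    if (idx > -1) {
        open.splice(idx, 1);
        req.u_open_decomm_items = open.join(',');
        // updating the request causes its workflow's Wait for Condition
        // activities to re-evaluate
        req.update();
    }
})(current, previous);
```

The Wait for Condition activity on the request's workflow would then just check "u_open_decomm_items is empty" (or contains/doesn't contain a particular step) before proceeding.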


I'm quickly running out of ideas, and I can't seem to find any documentation on how to implement a serialized task workflow. At this point, I'm contemplating even more esoteric solutions like creating my own workflow activity that duplicates the Create Task functionality, but that leaves off the existing task check. But we're getting into the realm of stuff that's really hard to implement and nigh impossible to maintain.

I can't believe that this is such an odd requirement that someone hasn't run across it and solved it before.

Anyone have any advice or help I can use to make this happen? Am I just missing something obvious, some best practice that I'm diligently working around when I should be following it?

Help!

9 REPLIES

randrews
Tera Guru

ok a LOT of this depends on how you have the items built out for the individual process...



there are a couple of possibilities and all involve using a catalog item, which is realistically your best bet... and understand server decom is the most complicated workflow in our system right now...



first if the items are built in such a way that you can run them as a subflow and they will work within decom... set up the variables in the decom workflow to pass into the items.. then SEQUENTIALLY call them in the order you need by using the workflow block in the decom workflow.. the workflow will stop as each item is called and not proceed until the child workflow finishes....


second.. create subflows for each process you will need to call and call them sequentially


third.. <and this is the path we went down> create a brand new workflow just for decom.. and create the tasks in it... personally i like this method since it sets all of your approvals etc in the decom workflow...



let me know if that helps at all...



as far as reporting goes.. USE STAGES in the workflow.. add some custom ones to this workflow



so for example our decom workflow has...



predecommissioning   <we turn off backups and alerts.. run new backups etc>


pending power off           <ready to turn the server off.. but haven't done so yet>


wait period       <we leave the server installed and powered down for a set period of time so the BUs can scream if an app goes down that was unanticipated or not communicated>


Decommissioning stage     <remove all firewall rules.. NATs, DNS settings etc and tear down the physical/virtual server>


post decommissioning stage   <clean up of CMDB/rack for the server>


So just to wrap my brain around these suggestions...



The first and second sound like you're saying that if I have a server decomm workflow attached to the request, then I should create a requested item (for example) to disable the monitoring, and include the disable-monitoring workflow as a subworkflow on the request's workflow? Is that possible in a practical way, attaching a subworkflow to a requested item?



This is where I get a bit frustrated:


<and this is the path we went down> create a brand new workflow just for decom.. and create the tasks in it... personally i like this method since it sets all of your approvals etc in the decom workflow...


That's what I've been trying to accomplish.   I have a Decomm workflow that attaches to the request generated by the record producer. And I do plan on adding all of the approvals to that workflow before the first task is kicked off.   But what I can't get to work is creating the tasks to execute serially after that.


i wouldn't create it as a request workflow.. but as a catalog item.... that way your normal catalog item workflows work exactly as designed no need to create anything special except the workflow for the catalog item... as long as you string the catalog tasks sequentially one won't trigger till the one before it is closed...



as far as including one workflow in another yes that is VERY possible and fairly easy to setup...



http://wiki.servicenow.com/index.php?title=Using_Subflows#gsc.tab=0



there are also options for creating new items and attaching them to the same request.. i discarded that idea because of your need for the items to run sequentially... making it do so can be tricky if it isn't one item.



did that answer your questions?


as long as you string the catalog tasks sequentially one won't trigger till the one before it is closed...



That's part of my issue, though. I don't want to create these as catalog tasks; they're already in the system as separate catalog items. I suppose I could create the server decommission as a requested item instead of a request, but I don't know, that just didn't seem to make much sense to me, since there's a definite parent-child relationship there. Even if I do it that way, would it be easier to spawn serialized requested items from a requested item workflow than to spawn them from a request workflow? I guess I'm not really seeing the difference. (But I heavily emphasize that I don't say that in the spirit of "You're wrong!" It's more in the spirit of "I don't understand how these things fit together, and I want to use best practices as much as possible.")



I know how to include a subworkflow, I was just thinking that it would be a little weird in that normally the catalog item creates the workflow and attaches it to the requested item. In this case, I'd have the subworkflow statically plunked down on the parent workflow, and I'd have to manually attach it to the requested item myself. I don't know if there would be any issues with the requested item itself trying to create and attach its own workflow; I don't know off the top of my head when precisely the workflow context gets created and attached. I suppose I should go look that up...