Run async script include from UI Action

m_23
Giga Contributor

Hello,

Ran into a bit of a performance issue today, and the cause seems to be a UI Action calling a long-running script include, resulting in the UI Action timing out.

We are working on a scoped application and would like to get the UI Action to process the script include asynchronously to avoid performance impact to the client session.   Any suggestions on how this can be achieved would be greatly appreciated!

Thanks!

Matt

1 ACCEPTED SOLUTION

I would suggest using an async business rule if it is an integration. UI actions are mainly for instant actions. You should be able to do a GlideAjax call from the UI action if you mark the UI action as client-side. But the best practice is to use an async business rule, which will improve performance significantly.
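As a rough sketch of the GlideAjax route mentioned above: mark the UI action as Client, and have its Onclick script call a client-callable script include. The names `LongRunningUtil` and `processRecord` below are made up for illustration; this only works if the script include is marked client callable and does not block on the heavy work itself.

```javascript
// Client-side "Onclick" script of the UI action (names are illustrative).
function startProcessing() {
    var ga = new GlideAjax('LongRunningUtil'); // hypothetical client-callable script include
    ga.addParam('sysparm_name', 'processRecord');
    ga.addParam('sysparm_record_id', g_form.getUniqueValue());
    // getXMLAnswer is asynchronous, so the browser session is not blocked.
    ga.getXMLAnswer(function(answer) {
        g_form.addInfoMessage('Processing started: ' + answer);
    });
}

// Server side (hypothetical): the script include extends AbstractAjaxProcessor.
var LongRunningUtil = Class.create();
LongRunningUtil.prototype = Object.extendsObject(AbstractAjaxProcessor, {
    processRecord: function() {
        var recordId = this.getParameter('sysparm_record_id');
        // Hand the heavy work off here (e.g. queue an event) rather than doing it inline,
        // otherwise the AJAX call itself can still time out.
        return 'queued';
    },
    type: 'LongRunningUtil'
});
```

Note that a GlideAjax call still runs synchronously on the server side for the duration of the script include method, so the method itself should only hand off the work, not perform it.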



Please mark this response as correct or helpful if it assisted you with your question.


8 REPLIES 8

Community Alums
Not applicable

You could take the code from your UI Action and put into a scheduled script job. Then the code in your UI Action could be changed to execute that scheduled script asynchronously:

var job = new GlideRecord("sysauto"); // scheduled job table
if (job.get("SYS_ID_OF_SCHEDULED_JOB")) {
    SncTriggerSynchronizer.executeNow(job); // run the job now, asynchronously
}
gs.addInfoMessage("Let user know what is happening.");
current.setRedirectURL(current); // redirect back to the current record

Let me know if this helps, cheers.

Fabian Kunzke
Kilo Sage

Hi,

 

I may be a little late on this, but I have to disagree with the accepted solution. In my opinion, a business rule should only be used if there is an actual change to a record. Otherwise you will have to add random "execute this" flags all over your data model, which is a horrendous practice (even though I have done it in the past as well).

I recently had a use case where a user needed to initiate a REST call to import and update assembly line information stored in SAP. The interface needed the user's name as, highly simplified, an authentication token. I wanted an explicit user interaction, but the interaction had to be asynchronous due to the runtime of the interface.

To solve this you could indeed set a flag that in turn triggers an async business rule, but again this would turn your data model into flag city. Instead, you can use an event.

Using events requires you to create a new event registry entry (which is fairly simple) and then define a script action. The script action is what resembles a business rule: when the event is fired, the script action executes to process it. All you have to do then is create the UI action and have it fire the registered event. With that, the work triggered by the event runs asynchronously from the user interaction.

The downside is that the event queue does not run within the user's context; the current record, however, can be passed along. You may therefore have to adjust the script include logic.
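A minimal sketch of the event route described above. The event name `x_myapp.process_record` and the script include call are made-up examples; the event must first exist in the event registry, and a script action must be registered against it.

```javascript
// UI action (server side): queue the event against the current record.
// parm1/parm2 are free-form string parameters; here parm1 carries the user's sys_id,
// since the script action will not run in the user's session context.
gs.eventQueue('x_myapp.process_record', current, gs.getUserID(), '');
gs.addInfoMessage('Processing has been started in the background.');
action.setRedirectURL(current);

// Script action registered for "x_myapp.process_record":
// "current" is the record the event was queued against, event.parm1 holds the user sys_id.
// new LongRunningUtil().processRecord(current, event.parm1); // hypothetical script include
```

This keeps the data model clean: no trigger flags on the table, and the long-running logic lives in the script include, with the script action as a thin dispatcher.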

Alternatively, if the user context is needed, you can trigger a workflow and/or flow from the UI action itself. Within the workflow you can then call the script include function.
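As a sketch of the flow variant, a flow can be started in the background from a server-side UI action via the FlowAPI. The flow name `x_myapp.process_record_flow` and its `current` input are illustrative; the flow itself would call the script include logic.

```javascript
// UI action (server side): start a flow asynchronously so the user session is not blocked.
try {
    var inputs = {};
    inputs['current'] = current; // flow input of type record (illustrative name)
    sn_fd.FlowAPI.getRunner()
        .flow('x_myapp.process_record_flow') // scoped flow name (made up)
        .inBackground()                      // queue for async execution
        .withInputs(inputs)
        .run();
    gs.addInfoMessage('Processing has been started.');
} catch (e) {
    gs.error('Unable to start flow: ' + e);
}
action.setRedirectURL(current);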

This gives you several options for letting the user interact with the system without depending on a record update as the trigger. This is especially true with regard to triggering a scheduled job, which is exactly the use case the event registry is meant for instead.

I hope this helps and guides you to a, in my opinion, cleaner approach to your task.

Regards

Fabian

I agree with Fabian. I wouldn't use a business rule here unless you in fact want the UI Action to change details of the record, which would then cause a business rule to fire.

If you don't need to pass variables (e.g. the sys_id of the record) to the script, another option is to use a Scheduled Script Execution (sysauto_script) and fire that from the UI Action. More than likely, though, you will need to pass variables, and should use an Event and Script Action.

Actually, and I may be late to the party here, by now I dislike either of these. Neither is right or wrong, but it just shows how many ways of "getting there" exist by now. After another year of experience, I'd much rather trigger a flow execution.

Reasons:

Using events can work well, but you will need to take care of event handling yourself.

Using a sysauto_script is counterintuitive to me, as it should only be a script executed on a schedule. Further, at least according to my own best practices (I wrote an article about it), scheduled jobs should only call business logic, not contain it. So it would, again in my opinion, only be a workaround.

Flows, however, can add a bit of system overhead compared to events, as their execution is left to the system. But as long as they are not executed 24/7 (at which point it should probably be a scheduled job), a flow is fine, especially in an API context, where IntegrationHub can help out.

NOTE: These approaches are highly subjective, and I personally have not put in the effort to prove any performance or "upgradeability" claims. They work. But I do believe that clean design and separation of concerns help with extensibility and maintainability. In short: if I have an API (or any piece of business logic) triggered by a UI action, the last place I would check for errors is a scheduled job.

Yes, it does work, but I would go as far as deeming it even worse for maintainability compared to the business rule (which is more or less the designated place for triggering business logic).
That said, nothing beats a bit of documentation. Create a knowledge base article in the instance, link it to the user story and the technical components, and then it comes down to personal preference. In the end, I would always be as unintrusive as possible toward the ServiceNow architecture, and with that I stand by my own reasoning regarding the "adding a field" approach.