Maik Skoddow
Tera Patron


 

Introduction

A popular question here in the community is how to automatically import an Excel file from an incoming email into ServiceNow. A frequently referenced answer is the article Loading data from an email attachment, but the pro-code solution approach described therein is an impassable challenge for many users. However, with the continuously enhanced Flow Designer, we have a tool available in ServiceNow that allows us to master even complex scenarios (almost) without code. For this reason, this article presents an automated, Flow-based solution for importing email attachments and is therefore aimed at Citizen Developers.

 

Prerequisites

To keep the Flow as simple as possible, it relies on an existing Scheduled Import Set. That means you have a working Data Source of type "File" with source "Attachment" as well as a Transform Map which maps the Excel columns to the respective fields of your target table. Test your Scheduled Import Set to make sure that it really works as intended. Afterwards, deactivate the Scheduled Import Set so that further import runs are not started accidentally.

 

Solution

Create a Flow with an "Inbound Email" trigger to catch the email. After checking whether it carries the expected Excel attachment, the Excel file is copied to the corresponding Data Source and the Scheduled Import Set is triggered to do the import job.

 

Overview

 


 

Detailed Steps

 


Choose "Inbound Email" as the trigger.

As a minimum condition I check for a certain subject, but feel free to add further conditions.

Please note:
Inbound Email Flows take precedence over Inbound Email Actions!


Load all email attachments into a list which can be used for iterating in the next step.


Iterate over all email attachments.


Check whether the file name of the current attachment has the required value.

Please note:
In the field "condition 1" you can see an "f(x)" icon. There I selected the inline function "To Lower Case" to make the filename comparison as robust as possible.
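The comparison performed by the "To Lower Case" inline function can be sketched in plain JavaScript. The expected file name import_data.xlsx is only an assumed example, not taken from the article:

```javascript
// Case-insensitive file name comparison, mirroring the "To Lower Case"
// inline function used in the Flow condition.
// The expected name passed in is an illustrative assumption.
function isExpectedAttachment(fileName, expectedName) {
  if (!fileName || !expectedName) {
    return false;
  }
  // Normalize both sides to lower case so "Import_Data.XLSX"
  // and "import_data.xlsx" are treated as equal
  return fileName.toLowerCase() === expectedName.toLowerCase();
}
```

With this normalization, a sender attaching Import_Data.XLSX instead of import_data.xlsx would still pass the check.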


Look up the Scheduled Import Set record by a given name.

Please note:
I prefer this approach over searching by Sys ID, because this way the Flow works identically in all instances.

Select via "dot-walking" the referenced Data Source for the "Delete Attachment" action. This ensures that there are no "file collisions" at the Data Source record.
Copy the Excel file from the Email Attachments table to the Data Source record.

Activate the Scheduled Import Set. Unfortunately, Scheduled Import Sets do not offer a run type "At once". For this reason the type "Daily" is set and the current time plus 20 seconds is chosen as the run time.

This is the only place where a small piece of source code comes into play, as the run time cannot be set via a configuration option. The following code is set as an inline script (see the video on YouTube for more information):

// Current date/time as the basis for the calculation
var gdtRunTime = new GlideDateTime();

// GlideTime is not time zone aware, so the DST offset must be
// added manually before adding the 20-second buffer (20,000 ms)
var gtAdd = new GlideTime(
              gdtRunTime.getNumericValue() +
              gdtRunTime.getDSTOffset() +
              20000
            );

return gtAdd;

Please note:
The type of the field "Time (run_time)" is "glide_time". This is the simplified version of a GlideDateTime which unfortunately knows nothing about time zones or daylight saving time offsets. For this reason the DST offset is added manually before the 20 seconds (20,000 milliseconds) are added.
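For illustration only, the arithmetic of the inline script can be reproduced in plain JavaScript, with numeric millisecond values standing in for the GlideDateTime/GlideTime objects (which exist only on the ServiceNow platform):

```javascript
// Run time = current time + DST offset + 20-second buffer (all in
// milliseconds), mirroring the GlideDateTime/GlideTime inline script.
function computeRunTimeMs(nowMs, dstOffsetMs) {
  var BUFFER_MS = 20000; // 20 seconds
  return nowMs + dstOffsetMs + BUFFER_MS;
}
```

For example, with a one-hour DST offset (3,600,000 ms) the computed run time ends up 1 hour and 20 seconds ahead of the raw timestamp, which is exactly what the glide_time field needs to fire shortly after the Flow runs.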

Deactivate the Scheduled Import Set after waiting 30 seconds to give the system the necessary time to start the import job in the background.

 

Enhancements

Although the solution presented here works very well, the Flow should still be extended and hardened for use in a productive environment. Below are some enhancements I can imagine; the list is not complete and can be extended as needed:

  • Check the email sender so that only authorized users can trigger the import.
  • If no email attachment is available, respond to the email sender with an email notification which points out the forgotten Excel file.
  • Abort the Flow in case no Scheduled Import Set record could be found.
  • Abort the Flow in case the Scheduled Import Set record has no referenced Data Source.
  • Check whether the Scheduled Import Set is inactive and abort the Flow if not. This ensures that many emails in quick succession do not start the import again and again (Please note: this is not a perfect approach. Instead, the status of the triggered background job should be checked, but this would require some additional pro-code parts).
  • Inform the email sender at the end of the Flow that the import has been triggered successfully.
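As a sketch for the first enhancement, a sender allowlist check could look like this in plain JavaScript; all addresses shown are purely hypothetical examples:

```javascript
// Allowlist check for the email sender; only listed addresses may
// trigger the import. The addresses used here are hypothetical.
function isAuthorizedSender(senderEmail, allowedSenders) {
  if (!senderEmail) {
    return false;
  }
  // Normalize to survive stray whitespace and case differences
  var normalized = senderEmail.trim().toLowerCase();
  return allowedSenders.some(function (allowed) {
    return allowed.toLowerCase() === normalized;
  });
}
```

In the Flow itself this would translate into an additional "If" step right after the trigger that compares the sender against a list of allowed addresses and ends the Flow otherwise.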

 

Comments
Elijah Blake
Kilo Contributor

Hello Maik!

This article is really helpful! I am running into one problem at the end on the 'Update Scheduled Data Import record'. I get a 'Scope does not have write access to table scheduled_import_set' error. 

I have tried enabling access to all application scopes on that table with write permissions and still getting the error.

Any suggestions?

 

Thanks!

Elijah

Maik Skoddow
Tera Patron

Hi

could it be that you accidentally have created the Flow in a scope other than "Global"?

Kind regards
Maik

Sowmya39
Tera Contributor

Hi,

 

The flow works only for the admin; when an end user or a user with the itil role sends the email, it does not work. Is there a solution for this?

 

Thanks

Sowmya

Chase Long
Tera Explorer

Try setting the flow to run as system instead of as the triggering user.

Community Alums
Not applicable

Hi Maik,

My flow is not getting triggered even though everything is run by System Administrator.

Do we need to add something more in trigger condition?

Thanks,

Anubha Datey

kshaw
Giga Guru

Hello Maik,

Thank you for this flow process. I have half of it working. It is stuck at the Delete Attachment step with the following errors which I have no idea what they mean.

I have added a screenshot of the error log and the config of the flow at the delete step.

2022-06-14 01:31:46 Error Flow Designer: Operation(6cbdb84e1b8451107d265393604bcb14.If$1.24bdb84e1b8451107d265393604bcb18.70bdb84e1b8451107d265393604bcb3e.0ec2a0170f973300dfd3694e68767e8b) failed with error: com.snc.process_flow.exception.OpException: Error: Unable to read the Record (sys_script_include.2ef3f17c9313130079b5925cf67ffb95.script; line 19) at com.snc.process_flow.operation.script.ScriptOperationBase.handleSc...

2022-06-14 01:31:46 Error JavaScript evaluation error on: (function execute(inputs, outputs) { outputs = new global.AttachmentActionsUtil().getAttachmentIds(inputs,outputs); new global.AttachmentActionsUtil().deleteAttachments(outputs.AllIds); })(inputs, outputs); : org.mozilla.javascript.JavaScriptException: Read Error: Unable to read the Record (sys_script_include.2ef3f17c9313130079b5925cf67ffb95.script; line 19): ...

2022-06-14 01:31:46 Error Root cause of JavaScriptException: java.lang.String JavaScript evaluation error on: (function execute(inputs, outputs) { outputs = new global.AttachmentActionsUtil().getAttachmentIds(inputs,outputs); new global.AttachmentActionsUtil().deleteAttachments(outputs.AllIds); })(inputs, outputs);
kshaw
Giga Guru

I had problems getting this to work properly.

Excel file was attached to the Data Source Record BUT the Scheduled Import was not executing.

It would execute manually OK so that confirmed that the output table, data source and schedule import record were all configured properly.

Problem turned out to be the Wait 30 seconds in Step 8.

Once I changed that to Wait 2 minutes then everything ran OK.

The wait needs to be long enough for the Data Source to be transformed. If you have a BIG excel file you may have to extend that wait time. Change until it works and then allow a little margin.

Maik Skoddow
Tera Patron

Thanks for that valuable feedback!

apjohn2
Mega Sage

This helped me solve a major milestone for my employer. Thank you very much @Maik Skoddow!

 

I'd like to stress the importance of the timer inside the script in step 8 that @kshaw noted. I too had to increase the time here (I went with 2 minutes as well), and the files I am dealing with are all very small--1 header row, 1 data row and about 10-12 columns. Yet I still had issues where the scheduled import wasn't running.

 

In my scenario, I am also pulling the data from the staging table back into the flow to feed into a catalog item submission action, so added another timer to ensure attachments process properly.

 

Mainly though wanted to share my appreciation to both of you for the help. 🙂

Maik Skoddow
Tera Patron

Thanks for your feedback @apjohn2 !

sekharksnow
Tera Contributor

Hi Maik,

Thank you, this is very helpful to us.

I have a question: in the scheduled job we used a time frame to run the job via the Flow. But is it possible to trigger the UI action "Execute Now" on the scheduled job by using this flow?

Julien
Tera Guru

Hi Maik,

thanks for your post. We have been re-using the concepts you presented and it works very well.

We're trying to extend this and have introduced a record producer which allows an admin user on our customer side to load an Excel file and process it.

Situation is this:

We have different types of data within an Excel file, all in different sheets (so let's say sheet1, sheet2, ...).

We have been using a few variables to determine which staging table and transform map should be used.

We tried using the same logic as in the load data interface which asks which sheet number should be loaded. That's the only part that doesn't work. It's consistently loading sheet1.

We have tried a few things but can't really figure out how to make it work. I tried looking around but couldn't find any documentation on the glideimportsetloader class.

Any idea?

Tamizh Selvan
Tera Explorer

Hi Maik,

 

Is this solution not compatible with Scoped Applications?

 

Thanks,

Tamizh

Version history
Last update:
‎08-22-2021 09:21 PM