Published on 08-22-2021 09:21 PM
Introduction
A popular question here in the community is how to automatically import an Excel file from an incoming email into ServiceNow. The article Loading data from an email attachment is often referenced as a possible answer, but the pro-code approach described there is an insurmountable hurdle for many users. With the continuously enhanced Flow Designer, however, ServiceNow offers a tool that lets us handle even complex scenarios (almost) without code. This article therefore presents an automated, Flow-based solution for importing email attachments and is aimed at the Citizen Developer.
Prerequisites
To keep the Flow as simple as possible, it relies on an existing Scheduled Import Set. That means you need a working Data Source of type "File" with source "Attachment", as well as a Transform Map that maps the Excel columns to the respective fields of your target table. Test your Scheduled Import Set to make sure it really works as intended. Afterwards, deactivate the Scheduled Import Set so that further imports are not started accidentally.
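If you prefer to do that last deactivation from a background script instead of the form, a minimal sketch could look like the following (the Scheduled Import Set name is a placeholder you would replace with your own):

```javascript
// Sketch: deactivate an existing Scheduled Import Set so that it is only
// started later by the Flow and never runs on its own schedule.
var gr = new GlideRecord('scheduled_import_set');
if (gr.get('name', 'My Excel Import')) { // placeholder name
    gr.setValue('active', false);
    gr.update();
}
```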
Solution
Create a Flow with an "Inbound Email" trigger to catch the email. After checking whether it contains the expected Excel attachment, the file is copied to the corresponding Data Source and the Scheduled Import Set is triggered to perform the import.
Overview
Detailed Steps
| Step | Description |
| --- | --- |
| Trigger | Choose "Inbound Email" as the trigger. As a minimum condition I look for a certain subject, but feel free to add additional conditions. |
| 1 | Load all email attachments into a list that can be used for iterating in the next step. |
| 2 | Iterate over all email attachments. |
| 3 | Check whether the file name of the current attachment has the required value. |
| 4 | Look up the Scheduled Import Set record by a given name. |
| 5 | Select the referenced Data Source via "dot-walking" for the "Delete Attachment" action. This ensures that there are no "file collisions" at the Data Source record. |
| 6 | Copy the Excel file from the Email Attachments table to the Data Source record. |
| 7 | Activate the Scheduled Import Set. Unfortunately, Scheduled Import Sets do not offer a run type "At once". For this reason the type "Daily" is set and the current time plus 20 seconds is chosen as the run time. This is the only place where a little source code comes into play, as the run time cannot be set via a configuration option; it is set as an inline script (see the sketch below this table and the video on YouTube for more information). |
| 8 | Deactivate the Scheduled Import Set after waiting 30 seconds to give the system the necessary time to start the import job in the background. |
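The inline script for step 7 is not reproduced in the extract above. The following is a minimal sketch of what such a script could look like, assuming the value is assigned to the Scheduled Import Set's run time field via an "Update Record" action; the exact format your field expects may differ, so treat this as an illustration rather than the original code. Flow Designer wraps inline scripts in a function, which is why a top-level return is used here.

```javascript
// Sketch: compute "now + 20 seconds" so the freshly activated "Daily"
// schedule fires almost immediately after the Flow turns it on.
var gdt = new GlideDateTime(); // current date/time in UTC
gdt.addSeconds(20);            // push the run time slightly into the future
return gdt.getValue();         // full date/time string; use only the time portion if your field requires it
```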
Enhancements
Although the solution presented here works very well, the Flow should still be extended and hardened before it is used in a production environment. Below are some enhancements I can imagine; the list is not complete and you can extend it as needed:
- Check the email sender so that only authorized users can trigger the import.
- If no email attachment is available, respond to the sender with an email notification pointing out the forgotten Excel file.
- Abort the Flow if no Scheduled Import Set record could be found.
- Abort the Flow if the Scheduled Import Set record has no referenced Data Source.
- Check whether the Scheduled Import Set is inactive and abort the Flow if it is not. This ensures that many emails in quick succession do not start the import again and again. (Please note: this is not a perfect approach; ideally the status of the triggered background job should be checked, but that would require some additional pro-code parts.) See the small sketch of the active check after this list.
- At the end of the Flow, inform the email sender that the import has been triggered successfully.
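For the "is the Scheduled Import Set still inactive?" guard mentioned above, a minimal sketch could look like the following, written in the style of a Flow Designer inline script (the platform wraps it in a function, hence the top-level return). In a real Flow you would more likely model this with a "Look Up Record" action plus a condition, and the record name is again just a placeholder:

```javascript
// Sketch: only allow the Flow to continue if the Scheduled Import Set
// is currently inactive (i.e. no previously triggered import is still pending).
var gr = new GlideRecord('scheduled_import_set');
gr.addQuery('name', 'My Excel Import'); // placeholder name
gr.query();
if (gr.next()) {
    return gr.getValue('active') === '0'; // '0' = inactive, safe to proceed
}
return false; // no record found: abort the Flow (see the enhancement above)
```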
Hello Maik!
This article is really helpful! I am running into one problem at the end, on the 'Update Scheduled Data Import record' step: I get a 'Scope does not have write access to table scheduled_import_set' error.
I have tried enabling access for all application scopes on that table with write permissions and am still getting the error.
Any suggestions?
Thanks!
Elijah
Hi
Could it be that you have accidentally created the Flow in a scope other than "Global"?
Kind regards
Maik
Hi,
The flow works only for admins; when an end user or a user with the itil role sends the email, it does not work. Is there a solution for this?
Thanks
Sowmya
Try setting the flow to run as system instead of as the triggering user.
Hi Maik,
My flow is not getting triggered even though everything is run as System Administrator.
Do we need to add something more to the trigger condition?
Thanks,
Anubha Datey
Hello Maik,
Thank you for this flow process. I have half of it working. It is stuck at the Delete Attachment step with the following errors, which I can't make sense of.
I have added a screenshot of the error log and the config of the flow at the delete step.
I had problems getting this to work properly.
The Excel file was attached to the Data Source record BUT the Scheduled Import was not executing.
It would execute manually OK, so that confirmed that the output table, data source, and scheduled import record were all configured properly.
The problem turned out to be the Wait 30 seconds in Step 8.
Once I changed that to Wait 2 minutes, everything ran OK.
The wait needs to be long enough for the Data Source to be transformed. If you have a BIG Excel file you may have to extend that wait time. Change it until it works and then allow a little margin.
Thanks for that valuable feedback!

This helped me solve a major milestone for my employer. Thank you very much @Maik Skoddow!
I'd like to stress the importance of the wait timer in step 8 that @kshaw noted. I too had to increase the time here (I went with 2 minutes as well), and the files I am dealing with are all very small: 1 header row, 1 data row, and about 10-12 columns. Yet I still had issues where the scheduled import wasn't running.
In my scenario, I am also pulling the data from the staging table back into the flow to feed into a catalog item submission action, so I added another timer to ensure the attachments are processed properly.
Mainly, though, I wanted to share my appreciation to both of you for the help. 🙂
Thanks for your feedback @apjohn2 !
Hi Maik,
Thank you, this is very helpful to us.
I have a question: in the scheduled job we use a time frame to run the job via the flow. But is it possible to trigger the scheduled job's "Execute Now" UI action from this flow?
Hi Maik,
thanks for your post. We have been re-using the concepts you presented and it works very well.
We're trying to extend this and have introduced a record producer which allows an admin user on our customer side to load an Excel file and have it processed.
Situation is this:
We have different types of data within one Excel file, all in different sheets (so let's say sheet1, sheet2, ...).
We use a few variables to determine which staging table and transform map should be used.
We tried using the same logic as in the load data interface, which asks which sheet number should be loaded. That's the only part that doesn't work: it consistently loads sheet1.
We have tried a few things but can't really figure out how to make it work. I tried looking around but couldn't find any doc on the glideimportsetloader class.
Any idea?
Hi Maik,
Is this solution not compatible with Scoped Applications?
Thanks,
Tamizh