Automation of data upload to tables via import set
02-02-2023 01:41 PM
Hi All,
We have a requirement where the customer shares an input Excel file in a format that I specify, and I need to upload that data into the time_card table.
We have shared the template and they are sending the data in that format.
I have created a staging table and transform maps (as per the requirement), and I want to reuse them for each import so I do not have to create a new staging table / transform map every time.
I want to automate the following, which needs to happen every Friday night:
1. Receiving the template file from the customer every week (maybe via email)
2. Uploading it into the existing staging table (custom_staging)
3. Moving the data from custom_staging to the time_card table using the existing custom_transform map
4. Sharing the import report with the customer via email
Please help me implement this.
Thanks,
Talya
02-02-2023 11:38 PM
@Talya, create an inbound email action. The steps below walk through the full setup.
Set up prerequisites to load data from an email attachment
- You must establish an import file/template that will always be used.
- The columns in the spreadsheet must remain the same, since adding, changing, or removing columns requires corresponding changes in ServiceNow.
- Your email needs to contain something unique so the instance knows the attachment should be processed.
- In other words, you don't want to try to import every email attachment sent to your ServiceNow instance. Options include keywords in the subject or body of the email, or emails from a specific address. Whatever you choose must be unique enough that the inbound email action matches only these emails.
- You will need to set up the import set table and transform maps.
- This can be done by manually importing the template as an administrator. Verify that the import generated records in your target table and that everything looks good. This post doesn't cover those steps, but once you can manually import the file, you can automate the process.
- Visit the useful ServiceNow Documentation on import sets, import sets key concepts, transform maps, file type data sources, and importing data using import sets.
- You will need to write down or copy/paste a few values once this is set up, for use in the script provided in this post (a background-script lookup sketch after this list shows one way to find them).
- Name of your import set table - You can get this by refreshing your browser so the application navigator is updated
- Navigate to System Import Sets > Import Set Tables and there you should see a table that matches what you called your import when loading in the file
- Click the module that matches your table name and, when the list shows, click any of the three-line (Additional Actions) icons beside the column headers, then Configure, and choose Table.
- When the table record opens, copy the Name value and paste it into a temporary text file.
- SysID of your transform map(s). This is the transform map that processes data in your import set table and drops it into your target table.
- Navigate to System Import Sets > Administration > Transform Maps and there you should see a record that matches what you typed in when manually importing your file.
- Right-click on that row and choose Copy sys_id
- Depending on your browser, the value may simply be copied to the clipboard; paste it into the temporary text file you used in the prior step.
- If multiple transform maps need to be leveraged, repeat the steps above to capture the additional SysIDs of the transform maps.
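If you prefer to look these values up with a script rather than through the UI, here is a minimal background-script sketch. It assumes your transform map(s) target the time_card table from the original question; adjust the target table name to match your own setup.

var tm = new GlideRecord('sys_transform_map');
tm.addQuery('target_table', 'time_card'); // assumption: your maps target time_card
tm.addActiveQuery();
tm.query();
while (tm.next()) {
    // The source_table value is the name of your import set (staging) table
    gs.print('Import set table: ' + tm.getValue('source_table'));
    gs.print('Transform map "' + tm.getValue('name') + '" sys_id: ' + tm.getUniqueValue());
}

Run this from Scripts - Background; the printed values are exactly what the inbound email action script below needs.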
Automate the processing of the inbound email with the attachment
Now that you have your email requirements established and your file set up for import, you can automate the processing of the inbound email and its attachment. This involves creating an inbound email action. To better understand how this works, review the documentation on inbound email actions, inbound email action variables, creating inbound email actions, inbound email action examples, and inbound email action ordering.
Set up your inbound email action
- Navigate to System Policy > Email > Inbound Actions
- Click New.
- Set the following values:
- Name: Give it a name that makes sense to you.
- Set the Target table to Data Source (sys_data_source). This is because we expect these emails to contain an Excel or CSV file and we need to generate a data source with that attachment that can then be imported.
- Set Active to true
- Set Stop processing to true since we don't want any other inbound email actions to process this email or file.
- In the When to run section/tab:
- You may consider changing the order to a very low or negative number so that other inbound actions don't process these emails.
- If you expect these emails to come from a specific address, you can set the From user.
- Set the condition based on the email uniqueness prerequisite above, for example Subject contains "file import" (see the condition script sketch after these setup steps). It needs to be something unique that will always appear in these inbound emails.
- In the Actions section/tab:
- Paste in the following script:
(function runAction(/*GlideRecord*/ current, /*GlideRecord*/ event, /*EmailWrapper*/ email, /*ScopedEmailLogger*/ logger, /*EmailClassifier*/ classifier) {

    var importSetTableName = "IMPORT SET TABLE NAME";
    var transformMapIDs = "SYS-ID(s) OF YOUR TRANSFORM MAP TO UTILIZE"; // Use a comma to specify multiple transform maps
    var applicationScope = "Global";

    // Create the data source record
    current.name = "File import from: " + email.from; // Feel free to rename this as appropriate
    current.import_set_table_name = importSetTableName;
    current.file_retrieval_method = "Attachment";
    current.type = "File";
    current.format = "Excel"; // For Excel files
    //current.format = "CSV"; // For CSV files
    current.header_row = 1;
    current.sheet_number = 1;
    current.sys_package.setDisplayValue(applicationScope);
    current.sys_scope.setDisplayValue(applicationScope);
    var dataSourceID = current.insert();

    /*
     * Schedule load of attachment
     *
     * This inbound email action generates an import data source, however the attachment isn't copied to the data source until
     * after the insert of the record. The import is scheduled to run 30 seconds later so the attachment has time to be copied.
     */
    new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);

})(current, event, email, logger, classifier);
- Set the importSetTableName and transformMapIDs variables at the top of the script to the values you captured in the prerequisites above.
- You can specify multiple transform maps in transformMapIDs by separating the sys_ids with commas (no spaces).
- If your file is in CSV format, comment out the current.format = "Excel" line and uncomment the CSV line.
- If this inbound action is part of a scoped application, or if you are loading data into a scoped application, change the applicationScope variable to match the scoped application name.
- Click Submit.
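As referenced in the condition step above, the Condition field of an inbound email action can also evaluate a script expression against the email object. A minimal sketch is shown below; the subject keyword is only an example, so replace it with whatever unique marker you agreed on with the customer.

// Condition sketch: process only emails whose subject contains the agreed marker.
// "weekly time card import" is an example value, not a required string.
email.subject.toLowerCase().indexOf("weekly time card import") > -1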
Set up your utility script include
Now we need to create the utility script include that is called by the inbound email action.
- Navigate to System UI > Script Includes
- Click New.
- Set the following values:
- Name: EmailFileImportUtils
- Accessible from: All application scopes - setting this to all scopes in case you want to use it from a scoped application
- Script: paste in the following:
var EmailFileImportUtils = Class.create();
EmailFileImportUtils.prototype = {

    initialize: function() {
    },

    scheduleImport: function(dataSourceID, transformMapIDs) {
        /*
         * Create scheduled job to process import
         *
         * The inbound email action will generate an import data source, however the attachment isn't copied to the data source until
         * after the insert of the record. The code below will create a scheduled job to process the import 30 seconds later
         * so that the attachment has time to be copied to the data source from the email.
         */
        var schRec = new GlideRecord("sys_trigger");
        schRec.name = "Load Data Source: " + dataSourceID;
        schRec.trigger_type = 0; // Run Once
        schRec.script = "new global.EmailFileImportUtils().loadImportSet('" + dataSourceID + "', '" + transformMapIDs + "')";
        var nextAction = new GlideDateTime();
        nextAction.addSeconds(30); // 30 seconds should be enough time, however this can be changed.
        schRec.next_action = nextAction;
        schRec.insert();
    },

    loadImportSet: function(dataSourceID, transformMapIDs) {
        // Get data source record
        var dataSource = new GlideRecord("sys_data_source");
        dataSource.get(dataSourceID);

        // If CSV and the header isn't on row 1, recreate the attachment with empty rows removed, and delete the original
        if (dataSource.getValue("format") == "CSV" && dataSource.getValue("header_row") > 1) {
            var attachmentRec = new GlideRecord("sys_attachment");
            attachmentRec.addQuery("table_sys_id", dataSource.getValue("sys_id"));
            attachmentRec.query();
            if (attachmentRec.next()) {
                var oldAttachmentID = attachmentRec.getValue("sys_id");
                var inputStream = new GlideSysAttachment().getContentStream(oldAttachmentID);
                var textReader = new GlideTextReader(inputStream);
                var ln = " ";
                var newLine = "";
                var lineCounter = 0;
                var headerRow = parseInt(dataSource.getValue("header_row"));
                while ((ln = textReader.readLine()) != null) {
                    lineCounter += 1;
                    if (lineCounter < headerRow) {
                        continue;
                    }
                    if (ln.length > 1) {
                        newLine += ln + "\n";
                    }
                }
                new GlideSysAttachment().write(dataSource, 'Changed ' + attachmentRec.getValue('file_name'), "text/csv", newLine);
                // Delete the original email attachment so that the new one is processed. If two remain, the API may get confused about which one to process.
                attachmentRec.deleteRecord();
            }
        }

        // Process data source file
        var loader = new GlideImportSetLoader();
        var importSetRec = loader.getImportSetGr(dataSource);
        var ranload = loader.loadImportSetTable(importSetRec, dataSource);
        importSetRec.state = "loaded";
        importSetRec.update();

        // Transform import set
        var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapIDs);
        transformWorker.setBackground(true);
        transformWorker.start();
    },

    type: 'EmailFileImportUtils'
};
- Click Submit.
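To sanity-check the script include before wiring up email, you can call it directly from Scripts - Background against an existing file-type data source. The sys_ids below are placeholders for your own records.

// Manually schedule a load/transform for an existing file-type data source.
var dataSourceID = "<sys_id of a sys_data_source record with an attached file>";
var transformMapIDs = "<sys_id of your transform map>";
new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);

After about 30 seconds you should see a new import set created for that data source.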
If the data load is part of a scoped application, or if you are loading data into a scoped table and changed the applicationScope variable in your inbound email action, then you will need to perform the following steps. If not, you can skip ahead to testing.
By default, the Data Sources table only allows records to be created from the Global scope. Since your scoped application needs to create a data source via the inbound email action, that access needs to be changed.
- Navigate to System Import Sets > Administration > Data Sources.
- Click the Additional Actions (three-line) icon beside Data Sources, then choose Configure and select Table.
- Go to the Application Access Section or tab and check the Can Create checkbox.
- Click Update.
Now test by sending an email that meets your inbound email action's conditions, with your file attached. Within a few minutes you should see data populated in your table. Keep in mind that the out-of-the-box scheduled job called Email Reader runs every two minutes to check for new inbound emails; it can be set to run more often, but that may cause system performance issues. Once your email is processed, it takes roughly another 30 seconds for the attachment to be loaded and transformed.
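If nothing shows up, a quick way to check whether the data source and import set were created is a background script like the one below. The name prefix matches what the inbound email action script sets; adjust it if you renamed the data source.

// Check the most recent data source created by the inbound email action
// and any import sets loaded from it.
var ds = new GlideRecord('sys_data_source');
ds.addQuery('name', 'STARTSWITH', 'File import from:');
ds.orderByDesc('sys_created_on');
ds.setLimit(1);
ds.query();
if (ds.next()) {
    gs.print('Latest data source: ' + ds.getValue('name') + ' (' + ds.getUniqueValue() + ')');
    var imp = new GlideRecord('sys_import_set');
    imp.addQuery('data_source', ds.getUniqueValue());
    imp.orderByDesc('sys_created_on');
    imp.query();
    while (imp.next()) {
        gs.print('Import set ' + imp.getValue('number') + ' state: ' + imp.getValue('state'));
    }
} else {
    gs.print('No data source found yet - check the received emails under System Mailboxes.');
}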
If you would like to set up another inbound email action to process a different file, simply repeat the inbound email action setup above with its own import set table and transform map(s). The script include does not need to be recreated.
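The original request also asks for an import report to be emailed back to the customer (step 4). One possible approach, assuming you register a custom event and a matching email notification (both names below are placeholders, not out-of-the-box objects), is to summarize the import set rows and queue the event, for example from a transform map onComplete script or a scheduled job that runs after the Friday import:

// Sketch: summarize an import set and queue an event for an email notification.
// 'custom.timecard.import.report' is a placeholder event name you would register yourself.
function queueImportReport(importSetID) {
    var importSet = new GlideRecord('sys_import_set');
    if (!importSet.get(importSetID))
        return;

    // Count staging rows by import state (inserted, updated, ignored, error, ...)
    var agg = new GlideAggregate(importSet.getValue('table_name'));
    agg.addQuery('sys_import_set', importSetID);
    agg.addAggregate('COUNT');
    agg.groupBy('sys_import_state');
    agg.query();

    var summary = [];
    while (agg.next()) {
        summary.push(agg.getValue('sys_import_state') + ': ' + agg.getAggregate('COUNT'));
    }

    // The notification tied to this event can include the summary in its message body.
    gs.eventQueue('custom.timecard.import.report', importSet, summary.join(', '), '');
}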
Bharath Chintala
02-03-2023 04:06 AM
Hi @Talya,
There is a good article on this:
Loading data from an email attachment - ServiceNow Community
Hope this helps.