John Tomko
Tera Expert

I recently needed to export data from ServiceNow to an Amazon S3 bucket. I probably could have set up an export set to push the file to a MID Server, and then written something to run on the MID Server to push that file to S3, but why do that when we have IntegrationHub?

My use case was to capture some information for RITMs created via a specific catalog item. I needed basic information about the RITM, as well as several of the variable name/value pairs. I was able to accomplish what I needed using two actions.


Action 1: Generate JSON file

Inputs: None (I could have made the action more flexible, but this is a very specific use case)

Step 1: Script Step

Inputs:

  • None

Outputs:

  • Output File (File Attachment)

For this step, I wrote a script to query the sc_req_item table and push the values I needed into an array. I created an output variable of type "File Attachment", ran my array through JSON.stringify, and put the result into the output variable.
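The post doesn't include the Step 1 script itself, so here is a minimal sketch of what it might look like. The catalog item sys_id and the variable names (requested_for, cost_center) are placeholders, and the File Attachment output is assumed to be named output_file:

```javascript
(function execute(inputs, outputs) {
    // Hypothetical catalog item sys_id -- substitute your own.
    var CAT_ITEM_SYS_ID = 'your_catalog_item_sys_id';

    var rows = [];
    var ritm = new GlideRecord('sc_req_item');
    ritm.addQuery('cat_item', CAT_ITEM_SYS_ID);
    ritm.query();
    while (ritm.next()) {
        rows.push({
            number: ritm.getValue('number'),
            opened_at: ritm.getValue('opened_at'),
            state: ritm.getDisplayValue('state'),
            // Catalog variables are reachable via ritm.variables.<variable_name>;
            // these two names are placeholders for your own variables.
            requested_for: ritm.variables.requested_for.getDisplayValue(),
            cost_center: ritm.variables.cost_center.getDisplayValue()
        });
    }

    // Serialize the array and hand it to the File Attachment output.
    outputs.output_file = JSON.stringify(rows);
})(inputs, outputs);
```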

Action 2: Upload Generated File to S3

This is a copy of the "Upload ServiceNow Attachment To S3" action from the Amazon S3 Spoke. The original is designed to upload a file from the attachments table, which isn't quite what we need here.

Inputs:

  • bucket_name (Dynamic Choice, pre-configured for the OOTB S3 action)
  • destination_folder (String, optional folder in the S3 bucket)
  • file_attachment (File Attachment, use the pill from the Generate JSON file action)
  • file_name (String, can be whatever you want the S3 object to be called)
  • file_extension (String, can be whatever you want, but since the previous action outputs JSON, I tend to use .JSON)
  • append_timestamp (True/False, used to determine whether to append a timestamp to the file name before the file extension)

Step 1: Pre-Processing

Inputs:

  • bucket_name (action > bucket_name)
  • fileName (action > file_name)
  • destination_folder (action > destination_folder)
  • file_extension (action > file_extension)
  • append_timestamp (action > append_timestamp)

Outputs: 

  • bucket_name (String)
  • bucket_region (String)
  • file (String)
  • destination_folder (String)

Script (basically the OOTB script, with some additional processing around generating the file name):

(function execute(inputs, outputs) {
    outputs.bucket_name = new AmazonS3Utils().ValidateBucketName(inputs.bucket_name);
    outputs.bucket_region = new AmazonS3Utils().getBucketRegion(outputs.bucket_name);

    var file = inputs.fileName;

    if (inputs.append_timestamp) {
        var now = new GlideDateTime();
        file += "-" + now.toString();
    }

    if (inputs.file_extension.substr(0, 1) != ".") {
        inputs.file_extension = "." + inputs.file_extension;
    }

    file += inputs.file_extension;

    // encodeURIComponent leaves ~ ! * ( ) ' unencoded, so escape them explicitly.
    // Global regexes replace every occurrence, not just the first.
    file = encodeURIComponent(file.trim());
    file = file.replace(/~/g, '%7E');
    file = file.replace(/!/g, '%21');
    file = file.replace(/\*/g, '%2A');
    file = file.replace(/\(/g, '%28');
    file = file.replace(/\)/g, '%29');
    outputs.file = file.replace(/'/g, '%27');

    var des = inputs.destination_folder.replace(/\/\//g, '/');
    des = encodeURIComponent(des.trim());
    des = des.replace(/~/g, '%7E');
    des = des.replace(/!/g, '%21');
    des = des.replace(/\*/g, '%2A');
    des = des.replace(/\(/g, '%28');
    des = des.replace(/\)/g, '%29');
    des = des.replace(/'/g, '%27');
    // Restore the folder separators that encodeURIComponent escaped as %2F.
    outputs.destination_folder = des.replace(/%2F/gi, '/');
})(inputs, outputs);
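Outside ServiceNow, the file-name handling in the script can be exercised as plain JavaScript. In this sketch the timestamp is passed in as a string, since GlideDateTime isn't available, and global regexes are used so repeated special characters are all escaped:

```javascript
// Standalone reproduction of the file-name logic (no Glide APIs).
function buildS3ObjectName(fileName, fileExtension, timestamp) {
    var file = fileName;
    if (timestamp) {
        file += "-" + timestamp;
    }
    if (fileExtension.substr(0, 1) !== ".") {
        fileExtension = "." + fileExtension;
    }
    file += fileExtension;

    // encodeURIComponent leaves ~ ! * ( ) ' alone, so escape them explicitly.
    file = encodeURIComponent(file.trim());
    return file
        .replace(/~/g, '%7E')
        .replace(/!/g, '%21')
        .replace(/\*/g, '%2A')
        .replace(/\(/g, '%28')
        .replace(/\)/g, '%29')
        .replace(/'/g, '%27');
}

buildS3ObjectName("ritm export", "json", "2021-08-19 13:53:00");
// → "ritm%20export-2021-08-19%2013%3A53%3A00.json"
```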

Step 2: Upload ServiceNow Attachment to S3

This step is identical to the OOTB S3 script, but make the following changes:

  • Request Type: Text
  • Request Body [Text]: (drag in the action > File Attachment pill)

Step 3: Post-Processing & Error Handling

(no changes to the OOTB S3 script)


Pretty simple and a nice way to export data without using a MID server or SFTP.

Comments
Prasun Sarkar
Tera Explorer

Do you have the script for step 1?

Version history
Last update: 08-19-2021 01:53 PM