Oscar Lopez
Mega Guru

ECC Queue

Leverage the ECC Queue to create an Output record that instructs the MID Server to run a command executing the PowerShell script.

(function execute(inputs, outputs) {
    /************************************************
     * Get the name of an active MID Server that has
     * the PowerShell capability
     *************************************************/
    var midName;
    var eccObj = new GlideRecord('ecc_agent_capability_m2m');
    eccObj.addEncodedQuery('capability.capabilityINPowerShell,ALL');
    eccObj.addQuery('agent.status', 'Up');
    eccObj.query();

    if (eccObj.next()) {
        midName = 'mid.server.' + eccObj.agent.name;
    }

    // Path and file name of the PS1 script, passed in as an action input
    var filePath = inputs.psname;

    /* *********************************************************************** */
    /* Create an output ECC Queue record to execute the PowerShell file on the */
    /* MID Server                                                              */
    /* *********************************************************************** */
    var ecc = new GlideRecord('ecc_queue');
    ecc.initialize();
    ecc.agent = midName;
    ecc.topic = 'Command';
    ecc.name = 'powershell ' + filePath;
    ecc.queue = 'output';
    ecc.state = 'ready';
    ecc.source = 'PowerShell';
    ecc.payload = '<?xml version="1.0" encoding="UTF-8"?><parameters><parameter name="skip_sensor" value="true"/></parameters>';
    ecc.insert();

})(inputs, outputs);
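(Optional) If you need to confirm that the MID Server actually processed the command, one approach, shown here only as a sketch and not part of the original action, is to look up the response the MID Server writes back to the ECC Queue: it creates an "input" record whose response_to field points at the sys_id of the output record created above. The placeholder sys_id and the logging calls below are illustrative assumptions.

// Sketch only: inspect the MID Server response for the output record created above.
// Replace the placeholder with the sys_id returned by ecc.insert().
var outputSysId = '<sys_id of the output ecc_queue record>';

var resp = new GlideRecord('ecc_queue');
resp.addQuery('queue', 'input');
resp.addQuery('response_to', outputSysId);
resp.query();

if (resp.next()) {
    // The payload of the input record carries the command output from the MID Server
    gs.info('ECC response state: ' + resp.state);
    gs.info('ECC response payload: ' + resp.payload);
}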

 

PowerShell

Use the PowerShell script to upload the CSV/Excel file to the ServiceNow platform via the REST API.

The file is uploaded to the Attachment table and associated with the Data Source record used for the import.

##############################################
#####         FILE UPLOAD                #####
##############################################

# User name and password of the integration account used for basic authentication (sample values shown)
$user = "integration"
$pass = "queretaro"

# Build auth header
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user, $pass)))

# Set proper headers
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add('Authorization',('Basic {0}' -f $base64AuthInfo))
$headers.Add('Accept','application/json')
$headers.Add('Content-Type','application/json')

# Specify endpoint uri
$uri = "https://dev99999.service-now.com/api/now/attachment/file?table_name=sys_data_source&table_sys_id=c6d53341474b01100d784168f36d436e&file_name=NewUsers.xlsx"

# Specify file to attach
$fileToAttach = "c:\report\NewUsers.xlsx"

# Specify HTTP method (POST for the Attachment API)
$method = "POST"

# Send HTTP request
$response = Invoke-WebRequest -Headers $headers -Method $method -Uri $uri -InFile $fileToAttach -ContentType 'multipart/form-data'

# Print response
$response.RawContent
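As written, the script prints the raw response but does not fail loudly if the upload is rejected (for example a 401 Unauthorized). A minimal variation, shown only as a sketch and reusing the variables defined above, wraps the call in try/catch so a failure is visible in the script output:

# Sketch: same request, but surface HTTP failures instead of ignoring them
try {
    $response = Invoke-WebRequest -Headers $headers -Method $method -Uri $uri -InFile $fileToAttach -ContentType 'multipart/form-data' -ErrorAction Stop
    Write-Output ("Upload succeeded with status {0}" -f $response.StatusCode)  # the Attachment API returns 201 Created on success
}
catch {
    Write-Output ("Upload failed: {0}" -f $_.Exception.Message)
    exit 1  # non-zero exit code so the failure is not silently swallowed
}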

 

Flow Designer

Create a Flow that runs on a schedule to import the file into the Data Source.

Use the custom action to run the PowerShell script, passing the path and file name of the PS1.


(Optional) - Clean Import Set table

As an optional step in the Flow, before importing the new file into the Data Source, you may want to clean up the Import Set table and also delete the file currently attached to the Data Source, which is no longer required (a sketch of this cleanup follows below).

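A minimal server-side sketch of that cleanup (for example in a script step of the Flow) could look like the following; the import set table name and the Data Source sys_id are placeholders you would replace with your own values.

// Sketch only: clear the staging (import set) table and remove the old attachment.
// 'u_imp_new_users' is a hypothetical import set table name; replace both placeholders.
var importSetTable = 'u_imp_new_users';
var dataSourceSysId = '<sys_id of the Data Source record>';

// 1. Delete the rows already loaded into the import set table
var imp = new GlideRecord(importSetTable);
imp.query();
imp.deleteMultiple();

// 2. Delete the file currently attached to the Data Source record
var attach = new GlideRecord('sys_attachment');
attach.addQuery('table_name', 'sys_data_source');
attach.addQuery('table_sys_id', dataSourceSysId);
attach.query();
attach.deleteMultiple();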

 

 

 By Oscar Lopez

@oslovanet

 

 

Comments
Amit Thakur1
Kilo Contributor

Thank you so much for the wonderful knowledge, Oscar.

I have one requirement: in the above case my file name keeps changing in the directory, e.g. "NewUsers_2022_09_08_12_12_001.csv", and I want to pick the latest file from the directory automatically.

Also, the previous attachment should be removed from the data source and the latest one attached.

Is this possible? Thanks in advance.

Regards

Amit

mihirr
Tera Expert

Thanks Oscar for this amazing post.

It was very helpful.

 

Regards,

Mihir Rathod

dj7887
Tera Expert

Hi @Oscar Lopez ,

 

I tried the same as you mentioned in the video, but when I execute it from ServiceNow Flow Designer it shows an access issue; please see the screenshot below.

 

(screenshot of the access error)

 

 

But when I tried to execute the script from the server (where I installed the MID Server) using PowerShell, the file was added to the data source.

Can you please help me here? What is missing?

 

Thanks.

 

Oscar Lopez
Mega Guru
Mega Guru

@Amit Thakur1 

One option is to write a PowerShell command to move or delete the processed file after it has been uploaded to SNOW.

In the next cycle, you could run a PowerShell command to get all *.csv files, sort them by most recent date, and pick the first file from the list; that gives you the latest file in the directory to process.
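Something along these lines (a rough sketch; the C:\report path and the processed sub-folder are only examples, adjust them to your setup):

# Sketch: pick the newest CSV in the directory
$latest = Get-ChildItem -Path 'C:\report' -Filter '*.csv' |
          Sort-Object LastWriteTime -Descending |
          Select-Object -First 1

Write-Output ("Latest file: {0}" -f $latest.FullName)

# After a successful upload, move the processed file out of the way
# (the destination folder must already exist)
Move-Item -Path $latest.FullName -Destination 'C:\report\processed'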

 

To remove the attachment on the Data Source record within SNOW, build another Flow that runs 2 to 4 hours after the first Flow. In this new Flow, write the logic to remove the file from the Data Source record.

Oscar Lopez
Mega Guru
Mega Guru

@dj7887 

You are having an access issue in the directories of the server.

Double-check the permissions of the account used by the MID Server to run PowerShell. Make sure the account has access to the directories where you have the data file and the PowerShell script.

Rijeesh Rathna1
Tera Contributor

Hi @Oscar Lopez ,

 

I tried doing the same process but it's not working for me. Before I troubleshoot further, I would like to know whether we need an IntegrationHub license to configure this?

 

My requirement is the same: I would like to copy an Excel file from the MID Server to the data source record on a daily basis. Can you please help me if you have an alternative way to achieve this?

 

Many thanks in advance.

 

Thanks,

Rijeesh Rathnakumar

Rohit7
Tera Contributor

@Oscar Lopez - 

How do we perform error handling in this case? The response always remains the same as below, whether the PowerShell script fails or executes successfully on the server host.

 

{
    "Action Status": {
        "code": 0,
        "message": "Success"
    }
}

Regards,

Rohit

 

sambhu
Tera Explorer

@Oscar Lopez 
Thanks for sharing this knowledge.

When we tried following your steps, SN returned the error copied below.

Invoke-WebRequest : The remote server returned an error: (401) Unauthorized.
At C:\ps\fileupload.ps1:28 char:13
+ $response = Invoke-WebRequest -Headers $headers -Method $method -Uri  ...
+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
    + FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand

Any insights into what might be causing this? We tried elevating the roles of the SN account used in the PowerShell script, but that didn't help.

 

Thanks in advance.

Sambhu Hari

ryadavalli
Tera Expert

Hey Oscar,

 

This is very helpful. Can we fetch the file from the MID Server and attach it to any record? For example, I want to create a task record and attach the file I picked from the MID Server.

Also, can we use any existing integration hub spokes for that?

 

Thanks,

Ravali

jparman
Tera Guru

Hi @dj7887 

 

Were you able to solve the issue? I am having the same.

Thanks!

Aman Rajawat
Tera Explorer

@Oscar Lopez My ECC output record is not getting processed and is stuck in "processing"; therefore, I am not getting any input record.

ChrisUSC84
Tera Contributor

Can someone elaborate or share their flow for removing the old file from the data source after it has been imported?

Aman Rajawat1
Tera Contributor

@ChrisUSC84 

Create a Scheduled Import to import the file from the data source to the target table, then execute a Post-Import Script.

Script
var dataSourceSysId = '____________________'; // give here the data source sys id
var attach = new GlideRecord('sys_attachment');
attach.addQuery('table_name', 'sys_data_source');
attach.addQuery('table_sys_id', dataSourceSysId);
attach.query();
attach.deleteMultiple(); // removes every attachment on the data source record

 

This will remove the attachment from the data source after the file has been imported.

Cedric Creton
Tera Expert

@sambhu, I have the same issue. Have you solved this at your side?

 

Many thanks in advance

 

Cedric

declannolan
Tera Contributor

Just curious...does the "Copy Files to this Instance" command that comes with the Flow Designer "SFTP Step" action effectively do the same thing as the functionality described above, and should this be the default approach for importing files from an external file server?
