How to handle multiple files being attached to a Data Source?

Chitra23
Tera Contributor

Hi,

I have a scenario where multiple CSV files are attached to a data source. The files are placed in an external folder, and a flow is scheduled to run daily at a particular time to pick up the files from that folder. There will be multiple files available, and the flow attaches all of them to this data source. I have also configured a scheduled import to transform the data. The problem is that only the first attachment is processed; the other files are never transformed. It would be great if someone could let me know how to handle this scenario. I'm thinking of creating another flow that gets triggered whenever a file is attached to the data source. I know that we can use "GlideImportSetTransformer" to run the transform map from a script step in an action, but I don't understand how to load the attachment data into an import set using a script.


4 REPLIES

Th_i Ho_ng
Tera Expert

Hi @Chitra23 , you need to write some script to do the following (a rough sketch follows this list):

1. From your main data source (the one that contains multiple files), create multiple data sources, each with a single attachment.

2. Loop over each copied data source and execute GlideImportSetTransformer against it.

3. Clean up the copied data sources in one of two ways:

a. A new Scheduled Job whose script deletes all the copied data sources.

b. Or, in the onComplete transform script of the transform map, trace back to the data source and delete it.
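
To make steps 1 and 2 more concrete, here is a minimal sketch (not production code). It assumes mainDataSourceId is the sys_id of the data source that receives all the CSV files, re-parents each sys_attachment onto its own copied data source, and uses the undocumented GlideImportSetLoader class that is commonly shared in community scripts; adjust the field names and values to your configuration and test it on a sub-production instance first.

// Sketch only: split the main data source's attachments into one data source per file,
// then load and transform each copy. Field names/values below are assumptions; adjust
// them to match your data source configuration.
var mainDataSourceId = '<sys_id of the main data source>';

var mainDs = new GlideRecord('sys_data_source');
if (mainDs.get(mainDataSourceId)) {
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_name', 'sys_data_source');
    att.addQuery('table_sys_id', mainDataSourceId);
    att.query();
    while (att.next()) {
        // 1. Copy the data source so each copy holds exactly one file
        var copyDs = new GlideRecord('sys_data_source');
        copyDs.initialize();
        copyDs.name = mainDs.getValue('name') + ' - ' + att.getValue('file_name');
        copyDs.import_set_table_name = mainDs.getValue('import_set_table_name');
        copyDs.type = mainDs.getValue('type');       // e.g. File
        copyDs.format = mainDs.getValue('format');   // e.g. CSV
        copyDs.file_retrieval_method = 'Attachment';
        copyDs.header_row = mainDs.getValue('header_row');
        var copyDsId = copyDs.insert();

        // Re-parent this one attachment onto the copied data source
        // (assumption: directly updating table_sys_id on sys_attachment is acceptable here)
        att.table_sys_id = copyDsId;
        att.update();

        // 2. Load the copy into an import set and run all transform maps.
        // GlideImportSetLoader is undocumented but widely used in community scripts;
        // verify its behaviour on your instance.
        var loader = new GlideImportSetLoader();
        var importSetGr = loader.getImportSetGr(copyDs);
        loader.loadImportSetTable(importSetGr, copyDs);
        importSetGr.state = 'loaded';
        importSetGr.update();

        var transformer = new GlideImportSetTransformer();
        transformer.transformAllMaps(importSetGr);

        // 3. Cleanup (option b): you could delete copyDs here once the transform
        // succeeds, or do it from the transform map's onComplete script instead.
    }
}

You could run this from a Scheduled Script Execution, or from a script step in your flow after all the files have been attached.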

Chitra23
Tera Contributor

Hi @Th_i Ho_ng , thanks for your reply.

As suggested, I'm creating a data source for each file (attachment).

I tried to create the import set using the script below:

var importSet = new GlideRecord('sys_import_set');
importSet.short_description = 'Import set table temporary name';
importSet.data_source = <data source record>;
importSet.table_name = "<import set staging table name>";
importSet.insert();

var transformer = new GlideImportSetTransformer();
transformer.transformAllMaps(importSet);

 

I can see that the import set record gets created, but it doesn't have any data in it (the correct import set table and data source are mapped). It would be great if you could help me with the script for loading the data from the attachment into this import set table.

Th_i Ho_ng
Tera Expert

That looks strange to me. Did you try to load the data and run the transform map manually with a single CSV (any one of them)? You need to make sure there is no problem with your CSV file.

Chitra23
Tera Contributor

Hi @Th_i Ho_ng ,

I'm able to load the data into the import set and perform the transform using a script.

The solution mentioned in the link below helped me:

https://www.servicenow.com/community/itsm-forum/how-to-automate-loading-of-data-into-import-set-and-...
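
For anyone who lands here later: the approach described in that post is, roughly, to load the data source's attachment into its import set table via the undocumented GlideImportSetLoader class and then run the transform maps. A minimal sketch of that pattern (not the post's exact code, so verify it on your instance):

// Rough sketch of the commonly shared community pattern for loading a data
// source's attachment into its staging table and transforming it by script.
// '<data source sys_id>' is a placeholder.
var dataSource = new GlideRecord('sys_data_source');
if (dataSource.get('<data source sys_id>')) {
    var loader = new GlideImportSetLoader();
    var importSetGr = loader.getImportSetGr(dataSource);  // creates the sys_import_set record
    loader.loadImportSetTable(importSetGr, dataSource);   // parses the attachment into the staging table
    importSetGr.state = 'loaded';
    importSetGr.update();

    var transformer = new GlideImportSetTransformer();
    transformer.transformAllMaps(importSetGr);            // runs every transform map for the staging table
}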