
Parse CSV attachment and insert data into staging table

Nancy19
Giga Contributor

Hello All,

I have a case where a user uploads multiple .csv files as attachments to a table, and I need to parse those .csv files and add the data to a staging table. After that, a transform map runs and adds the data to the actual table.

Can someone let me know how I can parse the CSV content and create records in the staging table using a script? I cannot use an import set with a data source because the user does not upload the file to a data source; the file is uploaded as an attachment.

 

Any help is appreciated!

Thanks,

Nancy


Sam Ogden
Tera Guru

Hi Nancy,

Goran Lundqvist did a couple of videos on YouTube with examples of uploading CSVs and getting at the data. The first shows uploading a CSV to a record producer, with a script to trigger the data source. The second uses Flow Designer to achieve the same, but with the ability to upload multiple CSVs, using a standard change item as the example.

Record producer example

Flow designer example

I hope these help with what you are trying to achieve.

Thanks

Sam

Ashutosh Munot1
Kilo Patron

Hi,

I can help you with this. Can you answer this question first?

1) Where will people upload the file?

 

Thanks,
Ashutosh

Hi Ashutosh,

The users will upload the files using an Upload Data UI action, and the files will be attached to a table record. Since attachments are stored in the sys_attachment table, I can query sys_attachment using the sys_id of my table record and the attachment ID, and fetch the file contents in a script.
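Roughly what I have in mind, as a minimal sketch: the table names (u_csv_upload, u_csv_staging) and field names (u_name, u_email) are placeholders, and the CSV split is naive (no quoted commas handled).

```javascript
// Sketch only: read each CSV attached to a u_csv_upload record and
// create one row per CSV line in the u_csv_staging table.
function parseCsvAttachments(recordSysId) {
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_name', 'u_csv_upload');      // placeholder source table
    att.addQuery('table_sys_id', recordSysId);
    att.addQuery('file_name', 'ENDSWITH', '.csv');
    att.query();

    while (att.next()) {
        // Read the attachment body as a string
        var csv = new GlideSysAttachment().getContent(att);
        var rows = csv.split(/\r?\n/);

        // Row 0 is assumed to be the header row
        for (var i = 1; i < rows.length; i++) {
            if (!rows[i])
                continue;
            var values = rows[i].split(',');          // naive split

            var stage = new GlideRecord('u_csv_staging');  // placeholder staging table
            stage.initialize();
            stage.setValue('u_name', values[0]);      // placeholder field mapping
            stage.setValue('u_email', values[1]);
            stage.insert();
        }
    }
}

// e.g. called from the Upload Data UI action:
parseCsvAttachments(current.getUniqueValue());
```

Note this only creates the staging records; the transform map would still need to be run separately (or the rows loaded through an import set) for the data to reach the target table.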

Hi Nancy,

I would recommend the following rather than parsing the CSV file with a script.

I assume the user will attach two or three uniquely named files to the table record.

1) Have an individual data source for each of those CSV files; if the user will be attaching 2 files, then 2 data sources should exist in the instance.

2) I also assume the file names are unique, in the sense that you can determine which attachment belongs to which data source.

3) Have the regular data source, transform map, and field maps for the target table.

4) Have a scheduled import and attach the data source you created to it; keep it active = false, since you will be triggering it from a script.

5) Have a UI action, such as "Start Data Process", on the table where the user attaches the files.

6) In this UI action script you need to do the following (see the sketch after this list):

a) copy the attachments from this table to the individual data sources

b) trigger both scheduled imports, if they are not dependent on each other
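In script form, steps a) and b) could look roughly like this. It is a sketch only: the data source names, scheduled import names, and file names are placeholders, and SncTriggerSynchronizer is a global-scope API commonly used to run a scheduled import on demand.

```javascript
// Sketch of the "Start Data Process" UI action script (global scope).
// 'current' is the record the user attached the CSV files to.

function copyCsvToDataSource(fileName, dataSourceName) {
    var ds = new GlideRecord('sys_data_source');
    if (!ds.get('name', dataSourceName))
        return;

    // Clear any previously attached file so the data source only holds the latest CSV
    var old = new GlideRecord('sys_attachment');
    old.addQuery('table_name', 'sys_data_source');
    old.addQuery('table_sys_id', ds.getUniqueValue());
    old.deleteMultiple();

    // Copy the matching attachment from the current record onto the data source
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_sys_id', current.getUniqueValue());
    att.addQuery('file_name', fileName);
    att.query();
    if (att.next()) {
        var content = new GlideSysAttachment().getContent(att);
        new GlideSysAttachment().write(ds, att.getValue('file_name'), att.getValue('content_type'), content);
    }
}

function runScheduledImport(importName) {
    var imp = new GlideRecord('scheduled_import_set');
    if (imp.get('name', importName))
        SncTriggerSynchronizer.executeNow(imp); // kicks off the (active=false) scheduled import now
}

// Placeholder file / data source / scheduled import names
copyCsvToDataSource('users.csv', 'CSV Users Data Source');
copyCsvToDataSource('assets.csv', 'CSV Assets Data Source');
runScheduledImport('CSV Users Scheduled Import');
runScheduledImport('CSV Assets Scheduled Import');
```

The scheduled imports then load and transform each data source as usual, so the existing transform maps take care of moving the data into the target table.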

Mark Correct if this solves your issue, and mark 👍 Helpful if you find my response useful.

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader