Integration
Hi Community,
I am working on a task where I receive a CSV file from an S3 bucket and store it on a staging table record as an attachment. I want to move the contents of this CSV file into the import set staging table. We receive one CSV file daily and want this move to run daily as well. What is the best approach?
Hello Balkrishna,
That's a common integration scenario. For a daily automated process, here are the best approaches you can consider:
Use an IntegrationHub ETL Data Source: This is often the most straightforward method. You can create an IntegrationHub Data Source of type "AWS S3" to connect to your bucket, then use an IntegrationHub ETL Import Set to define the mapping from the CSV file directly to your import set table. This can be scheduled to run daily.
Create a Scheduled Flow: You can build a flow in Flow Designer that is triggered on a daily schedule. The flow would:
Use an "AWS S3 - Get File" action to retrieve the CSV.
Use a "Transform Data" action or a "Run Script" step to parse the CSV content.
Insert the parsed data into your Import Set table using a "Create Record" action.
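For the "Run Script" step, a minimal parsing sketch might look like the following. The input name csv_text (the raw CSV body from the S3 action) and the output name rows are assumptions you would define on the script step yourself, and the parse is naive: it assumes no quoted fields containing embedded commas.

```javascript
(function execute(inputs, outputs) {
    // csv_text is an assumed script-step input holding the raw CSV body
    var lines = inputs.csv_text.split(/\r?\n/);
    var headers = lines[0].split(',');
    var rows = [];

    for (var i = 1; i < lines.length; i++) {
        if (!lines[i]) continue; // skip blank/trailing lines

        var values = lines[i].split(','); // naive split: no quoted-comma handling
        var row = {};
        for (var j = 0; j < headers.length; j++) {
            row[headers[j].trim()] = (values[j] || '').trim();
        }
        rows.push(row);
    }

    // rows is an assumed script-step output consumed by a later "Create Record" action
    outputs.rows = JSON.stringify(rows);
})(inputs, outputs);
```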
Use a Scheduled Script Execution: You can write a scheduled job script (optionally calling a Script Include) that uses the GlideSysAttachment API to read the CSV from the attachment and the GlideRecord API to insert the data into the import set table. This method offers the most control but requires custom scripting; a minimal sketch follows.
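Here is a rough sketch of what that scheduled script could look like. The table names u_csv_staging (where the attachment lands) and u_my_import_set (the target import set table) are placeholders, and it assumes a simple CSV with no quoted fields and import set columns named u_<header>.

```javascript
// Scheduled Script Execution: move the day's CSV attachment into the import set table.
// Table names below are placeholders; adjust them to your instance.
(function processDailyCsv() {
    // Grab the most recent staging record that received the CSV attachment
    var staging = new GlideRecord('u_csv_staging'); // placeholder staging table
    staging.orderByDesc('sys_created_on');
    staging.setLimit(1);
    staging.query();
    if (!staging.next())
        return;

    // Find the attachment on that record
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_name', 'u_csv_staging');
    att.addQuery('table_sys_id', staging.getUniqueValue());
    att.query();
    if (!att.next())
        return;

    // Read the attachment content as a string
    var content = new GlideSysAttachment().getContent(att);

    // Naive CSV parse: assumes no quoted fields with embedded commas
    var lines = content.split(/\r?\n/);
    var headers = lines[0].split(',');

    for (var i = 1; i < lines.length; i++) {
        if (!lines[i]) continue; // skip blank lines

        var values = lines[i].split(',');
        var row = new GlideRecord('u_my_import_set'); // placeholder import set table
        row.initialize();
        for (var j = 0; j < headers.length; j++) {
            // Assumes import set columns are named u_<csv header>
            row.setValue('u_' + headers[j].trim().toLowerCase(), (values[j] || '').trim());
        }
        row.insert();
    }
})();
```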
The IntegrationHub ETL approach is typically recommended as it is powerful, low-code, and purpose-built for this kind of data import task.
Hope this helps!
Thanks & Regards,
Muhammad Iftikhar
If my response helped, please mark it helpful & accept the solution so others can benefit as well.