‎09-11-2023 03:11 AM
Hello party-people,
I need to create a snapshot of a table for a customer. The table currently contains 400k+ records. The snapshot is triggered manually via the Service Portal, which starts a flow that collects all the records in the table and then duplicates them into a new table using a For Each loop. The flow currently takes about 8 hours to run. However, the customer would like a runtime of 2 hours, since the snapshot is to be taken multiple times per day.
Does anyone here have an idea how this could be realized with a shorter runtime, e.g. via script?
Solved! Go to Solution.
‎11-06-2023 06:40 AM
For all those who still have questions:
I did not copy the records within the instance, as that always exceeded the execution time limit, whether done via script or flow. Instead, the data was downloaded via the Table API in batches of 10k records to a MID Server and assembled and processed locally. In my customer's case, the data was converted with a Go program and imported into an SAP system. The download was achieved with a flow action that runs a PowerShell script.
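To illustrate the batched download, here is a minimal sketch in Go (the same language used for the conversion step) that builds the paginated Table API request URLs. The instance URL and table name are placeholder assumptions; the `sysparm_limit` / `sysparm_offset` parameters are the standard ServiceNow Table API pagination controls. Authentication, the actual HTTP calls, and error handling are omitted.

```go
package main

import "fmt"

// buildBatchURLs returns one Table API URL per 10k-record window,
// using sysparm_limit and sysparm_offset for pagination.
// instance and table are placeholders for your own values.
func buildBatchURLs(instance, table string, total, batch int) []string {
	urls := []string{}
	for offset := 0; offset < total; offset += batch {
		urls = append(urls, fmt.Sprintf(
			"%s/api/now/table/%s?sysparm_limit=%d&sysparm_offset=%d",
			instance, table, batch, offset))
	}
	return urls
}

func main() {
	// 400k records in batches of 10k -> 40 requests
	urls := buildBatchURLs("https://example.service-now.com", "u_snapshot_source", 400000, 10000)
	fmt.Println(len(urls), urls[0])
}
```

Each URL can then be fetched independently (and in parallel, within the instance's rate limits), which is what keeps the total runtime far below the in-instance flow approach.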