Cloning 400k+ rows of data

Leon Tutte
Giga Guru

Hello party-people,

I need to create a snapshot of a table for a customer. The table currently contains 400k+ records. The snapshot is triggered manually via the Service Portal, which starts a flow that collects all the records in the table, iterates over them with a For Each loop, and duplicates each record into a new table. The flow currently takes about 8 hours to run. However, the customer would like a runtime of 2 hours, as the snapshot is to be taken multiple times per day.

 

Does anyone here have an idea whether this could be implemented with a shorter runtime, for example via a script?


1 REPLY

Leon Tutte
Giga Guru

For all those who have questions,

I did not copy the data records within the instance, as that always exceeded the execution time, whether done via script or via flow. Instead, the data was downloaded via the Table API in packets of 10k records to a MID Server and then assembled and processed locally. In my customer's case, the data was converted with Go and imported into an SAP system. The download itself was handled by a flow action that runs a PowerShell script.
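
In our setup the actual download was a PowerShell script on the MID Server; purely as an illustration of the batching pattern, here is a minimal Go sketch that pages through a table via the Table API in 10k-record packets. The instance URL, table name, and the SN_USER/SN_PASS environment variables are placeholder assumptions, not the real configuration.

```go
// Minimal sketch (not the production script): download a ServiceNow table
// in 10k-record packets via the Table API and write each packet to disk
// for local processing.
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

const (
	instance  = "https://yourinstance.service-now.com" // placeholder instance
	table     = "u_snapshot_source"                    // placeholder table name
	batchSize = 10000                                  // 10k records per packet
)

type tableResponse struct {
	Result []map[string]interface{} `json:"result"`
}

func main() {
	client := &http.Client{}
	for offset := 0; ; offset += batchSize {
		url := fmt.Sprintf("%s/api/now/table/%s?sysparm_limit=%d&sysparm_offset=%d",
			instance, table, batchSize, offset)
		req, err := http.NewRequest("GET", url, nil)
		if err != nil {
			panic(err)
		}
		req.SetBasicAuth(os.Getenv("SN_USER"), os.Getenv("SN_PASS"))
		req.Header.Set("Accept", "application/json")

		resp, err := client.Do(req)
		if err != nil {
			panic(err)
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			panic(err)
		}

		var page tableResponse
		if err := json.Unmarshal(body, &page); err != nil {
			panic(err)
		}
		if len(page.Result) == 0 {
			break // empty result set: we have paged past the last record
		}

		// Persist each packet locally; the downstream conversion/import step
		// (Go -> SAP in our case) reads these files afterwards.
		fname := fmt.Sprintf("batch_%06d.json", offset/batchSize)
		if err := os.WriteFile(fname, body, 0o644); err != nil {
			panic(err)
		}
	}
}
```

Paging with sysparm_limit/sysparm_offset keeps each request small enough to stay well under the instance transaction limits, and the per-packet files can then be assembled and converted locally on the MID Server.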