How can we execute 20 scheduled data import jobs one after the other? We should only kick off the second one after the first one gets processed.
11-13-2018 05:29 PM
We have 20 download URLs, each containing 100k records. We are able to create the 20 data sources dynamically, download each file, and attach it to its data source (see the sketch below). FYI, the same transform map is used across all data sources.
So how can we execute each import set one after the other? We need to load the second import set only after the first one has been processed.
Any idea on how to implement this would be great.
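For reference, this is roughly how we create each data source and attach the downloaded file today (a simplified sketch; the names, the staging table, and the `fileContent` variable are placeholders for our actual values, and the choice values on sys_data_source may differ on your instance):

```javascript
// Simplified sketch of the dynamic data source creation described above.
// All names and the file payload below are placeholders, not real values.
var fileContent = '...CSV payload returned by the download URL...';

var dsGr = new GlideRecord('sys_data_source');
dsGr.initialize();
dsGr.name = 'Bulk import source 01';          // one data source per download URL, 20 in total
dsGr.import_set_table_name = 'u_bulk_import'; // shared staging table (same transform map)
dsGr.type = 'File';
dsGr.format = 'CSV';
dsGr.file_retrieval_method = 'Attachment';
dsGr.insert();

// Attach the downloaded file content to the data source record.
new GlideSysAttachment().write(dsGr, 'records_01.csv', 'text/csv', fileContent);
```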
- Labels: Integrations
11-14-2018 11:15 AM
Hi, I think you have a few options:
Scheduled jobs have a Run type of ‘After parent runs’, so you could create a job for each import and then cascade them via the parent functionality (see the sketch below).
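With 20 jobs, wiring that parent chain by hand is tedious, so a one-off background script could set it up. A minimal sketch, assuming your scheduled data imports live on scheduled_import_set, are named so they sort in the required order, and that ‘After parent runs’ is stored as the run_type value 'parent' with a reference field named parent (worth verifying on your instance):

```javascript
// One-off background script: chain scheduled imports so each runs after the previous one.
// Assumes the jobs are named so orderBy('name') returns them in the required sequence.
var previousId = null;
var jobGr = new GlideRecord('scheduled_import_set');
jobGr.addQuery('name', 'STARTSWITH', 'Bulk import');
jobGr.orderBy('name');
jobGr.query();
while (jobGr.next()) {
    if (previousId) {
        jobGr.run_type = 'parent';   // "After parent runs" (verify the choice value on your instance)
        jobGr.parent = previousId;   // previous job in the chain
        jobGr.update();
    }
    previousId = jobGr.getUniqueValue();
}
// The first job keeps its normal schedule and kicks off the whole chain.
```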
Alternatively, you might be able to add a condition to each job that validates the imports run in sequence.
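For that route, each scheduled import could be marked Conditional with a condition script that only evaluates to true once the previous job's import set has been processed. A rough sketch, assuming you can identify the previous import set by its data source name and that the sys_import_set state value is 'processed' (check the choice list on your instance):

```javascript
// Condition script on scheduled import N: only run once the import set
// created from the previous data source has been fully processed.
// 'Bulk import source 01' is a placeholder for the previous job's data source name.
var isProcessed = false;
var prevImport = new GlideRecord('sys_import_set');
prevImport.addQuery('data_source.name', 'Bulk import source 01');
prevImport.orderByDesc('sys_created_on');
prevImport.setLimit(1);
prevImport.query();
if (prevImport.next() && prevImport.getValue('state') == 'processed') {
    isProcessed = true;
}
isProcessed; // the condition script must evaluate to true for the job to run
```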
You could also look into creating one schedule and, within that schedule, scripting the imports yourself (or scripting the creation of the sys_trigger records); however, that might tie up a scheduler worker for an excessive period of time.
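If you do go the single-schedule route, the sketch below shows the idea: one scheduled script walks the 20 data sources in order and loads/transforms each one synchronously, so the next import only starts once the previous one has finished. It leans on the GlideImportSetLoader and GlideImportSetTransformer classes as they appear in community examples; they are not formally documented, so treat the method calls as assumptions and test on a sub-production instance first. Note the caveat above: this holds a scheduler worker for the entire run.

```javascript
// Single scheduled script: load and transform the 20 data sources strictly in order.
// GlideImportSetLoader / GlideImportSetTransformer are not formally documented;
// the calls below follow common community examples, so verify them on your instance.
var ds = new GlideRecord('sys_data_source');
ds.addQuery('name', 'STARTSWITH', 'Bulk import source');
ds.orderBy('name'); // relies on the data source names sorting in the required order
ds.query();
while (ds.next()) {
    // Stage the attached file into an import set (synchronous).
    var loader = new GlideImportSetLoader();
    var importSetGr = loader.getImportSetGr(ds);
    loader.loadImportSetTable(importSetGr, ds);

    // Run the (shared) transform maps against this import set (synchronous),
    // so the loop only moves on once this batch has been processed.
    var transformer = new GlideImportSetTransformer();
    transformer.transformAllMaps(importSetGr);

    gs.info('Finished import set ' + importSetGr.getUniqueValue() + ' from ' + ds.getValue('name'));
}
```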
I’d experiment with the OOB ‘After parent runs’ functionality first and see if it delivers your requirement.
Regards, Tony