Shamma Negi
Kilo Sage

Hi All,

 

Today I will cover Concurrent Import Sets.

 

Topic: Concurrent Import Sets

Month: November

 

With the help of concurrent import sets, we can break the entries into small chunks and process all the chunks simultaneously, as shown below:

 

[Image: ShammaNegi_0-1731592072285.png]
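Conceptually, the split-and-process-in-parallel pattern can be sketched as follows. This is plain Python (not ServiceNow code), and the chunk size and worker count are illustrative assumptions, not platform values:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_chunks(records, chunk_size):
    # Break the full record list into fixed-size chunks.
    return [records[i:i + chunk_size] for i in range(0, len(records), chunk_size)]

def process_chunk(chunk):
    # Placeholder for the per-chunk transform/load work.
    return len(chunk)

records = list(range(100_000))            # stand-in for 100,000 staged rows
chunks = split_into_chunks(records, 10_000)

# Process all chunks simultaneously, analogous to concurrent import
# sets running on multiple nodes.
with ThreadPoolExecutor(max_workers=4) as pool:
    processed = list(pool.map(process_chunk, chunks))

print(len(chunks), sum(processed))        # 10 chunks, 100000 records in total
```

The key point is that no chunk waits on another: each one is transformed independently, and the totals are combined at the end.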

 

 

 

How does it break the import set into small chunks?

 

[Image: ShammaNegi_1-1731592195973.png]

 

 

This breaks the import set into multiple import sets that run at the same time on different nodes in the backend, which processes the data faster than a single import set. For example, a standard import set processes 100,000 records as one import set, whereas concurrent import sets divide them into n smaller import sets.
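To make the "n import sets" concrete: with 100,000 records and a hypothetical chunk size of 10,000 (the actual size is determined by the platform, not chosen here), the split is just a ceiling division:

```python
import math

total_records = 100_000
chunk_size = 10_000   # assumed for illustration; the platform decides the real value

# Ceiling division so a partial final chunk still gets its own import set.
n_import_sets = math.ceil(total_records / chunk_size)
print(n_import_sets)  # 10
```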

 

Live update on concurrent import set processing speed: how much time could we save?

 

In our test, uploading 100,000 records took 3 hours with concurrent import sets. Keep in mind that this time does not include any data validation or any logic checked against existing data. With a single import set, the same upload took more than 10 hours.
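Using the figures above (3 hours concurrent versus a lower bound of 10 hours for a single import set), the rough savings work out as:

```python
single_hours = 10      # reported single-import-set time (a lower bound: "more than 10 hours")
concurrent_hours = 3   # reported concurrent import set time

speedup = single_hours / concurrent_hours
hours_saved = single_hours - concurrent_hours
print(round(speedup, 2), hours_saved)  # at least 3.33x faster, 7+ hours saved
```

Since the single-import-set run actually took more than 10 hours, these numbers understate the real savings.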

 

For more information, check out the documentation below on this topic:

 

Concurrent imports

 

Hope this helps you.

 

Regards,

Shamma