10-12-2023 12:34 PM - edited 10-12-2023 12:35 PM
Hi guys,
We have a requirement to import 6 million records into ServiceNow, and to achieve this we have split the records into files of 100,000 each. We tried importing one of these 100k files, but it takes around 3 hours to transfer the data from the import set table to the target table. How can we achieve a faster transfer from the import set table to the target table?
While researching this I came across a multithreading approach (rough sketch below). If you have any ideas on this, please share.
Thank you.
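One client-side reading of that multithreading idea is to split each 100k file into smaller chunks and POST the rows from several worker threads through ServiceNow's Import Set REST API (POST /api/now/import/{stagingTableName}). A minimal sketch, assuming a hypothetical staging table u_bulk_import_staging plus placeholder instance URL and credentials; whether it actually speeds things up depends on instance semaphores and transform-side contention, so treat it as an experiment rather than a fix:

```python
# Hedged sketch: parallel chunked inserts via the Import Set REST API.
# Assumptions (not from the thread): staging table u_bulk_import_staging,
# placeholder instance URL and credentials, records already parsed in memory.
import concurrent.futures

import requests

INSTANCE = "https://your-instance.service-now.com"  # placeholder
STAGING_TABLE = "u_bulk_import_staging"             # hypothetical staging table
AUTH = ("import.user", "password")                  # placeholder credentials


def post_record(record):
    """POST one row to the staging table; the platform then runs the
    table's transform map(s) to move it to the target table."""
    resp = requests.post(
        f"{INSTANCE}/api/now/import/{STAGING_TABLE}",
        json=record,
        auth=AUTH,
        headers={"Accept": "application/json"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()


def import_in_parallel(records, workers=8):
    """Fan the rows out across a small thread pool."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(post_record, records))


if __name__ == "__main__":
    sample = [{"u_name": f"row {i}", "u_value": str(i)} for i in range(100)]
    import_in_parallel(sample, workers=8)
```

Keep the worker count modest: each POST consumes an API semaphore on the instance, and too many concurrent transforms can contend on the target table.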

10-12-2023 12:56 PM
Have you looked at Concurrent Import Sets? They let the system process the data in chunks, with each node handling an equal share of the records.
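If memory serves, the feature is enabled from the Scheduled Data Import form (a Concurrent Import checkbox), after which the platform partitions the import set and transforms the partitions in parallel across nodes; the docs page on concurrent import sets covers the details. While a run is in flight you can watch the staging side from outside the instance. A minimal monitoring sketch via the Table API, with placeholder instance URL and credentials (verify the exact state values on your instance before relying on them):

```python
# Hedged sketch: poll import set state through the Table API while a
# (concurrent) import runs. Assumptions: placeholder instance/credentials;
# sys_import_set and its state field are standard, but confirm the exact
# state values ("loaded", "processed", ...) on your own instance.
import time

import requests

INSTANCE = "https://your-instance.service-now.com"  # placeholder
AUTH = ("import.user", "password")                  # placeholder credentials


def pending_import_sets():
    """Return import sets that have not finished transforming yet."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/sys_import_set",
        params={
            "sysparm_query": "state!=processed",
            "sysparm_fields": "number,state,sys_created_on",
        },
        auth=AUTH,
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]


while pending_import_sets():
    time.sleep(30)  # re-check every 30 seconds
print("All import sets processed.")
```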
10-12-2023 12:59 PM
Can you share any documentation or steps on how to check and implement this?

10-12-2023 01:17 PM
Google is your friend on that one. So is the search box on the ServiceNow Docs site.
10-12-2023 01:51 PM
Thank you for this.