10-04-2023 02:43 AM
Hi all,
I have a requirement where I am importing 100,000 records into ServiceNow using a data source. The CSV file I am importing is around 70 MB, and the transform from the import set table into the target table is taking almost 3 hours. How can I lower this time? Please suggest.
Thank you.
10-04-2023 02:54 AM
I would recommend running the import during off-business hours.
If possible, split the rows from the CSV into 3 separate CSV files,
then import the files a couple of hours apart.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
10-04-2023 06:08 AM
Thank you, Ankur. I will try this.
10-04-2023 06:24 AM
Thank you for marking my response as helpful.
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader

10-04-2023 02:56 AM
Hey,
You could schedule the import to run at night, when there is low traffic on the ServiceNow instance.
Also, do you have any onBefore scripts running on the transform map, or business rules on the target table, that may slow down the insertion of new records? It can help to reduce the number of lookups in those scripts (if there are any) and to write more efficient code.
In the end, however, this is a lot of data, which naturally needs some time to process.
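One common reason per-row transform scripts are slow is that they repeat the same database lookup for every row. The lookups can often be memoized so each distinct key is queried only once. A minimal sketch of that pattern in plain JavaScript (here `queryFn` stands in for whatever expensive per-row lookup the transform script performs, such as a GlideRecord query; the names are illustrative, not ServiceNow API):

```javascript
// Memoize an expensive per-row lookup so each distinct key is
// resolved only once, instead of once per imported row.
function makeCachedLookup(queryFn) {
  const cache = new Map();
  let queries = 0; // counts how often the real lookup actually runs

  function lookup(key) {
    if (!cache.has(key)) {
      queries++;                      // real lookup happens here
      cache.set(key, queryFn(key));
    }
    return cache.get(key);            // every repeat is a cache hit
  }

  lookup.queryCount = function () { return queries; };
  return lookup;
}
```

With 100,000 rows that reference only a few thousand distinct values (for example, user or department names), this turns 100,000 queries into a few thousand, which can cut a large share of the transform time.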