What is the maximum number of records that can be imported via an import set without affecting system performance?

sudharsanv36
Kilo Guru

For LDAP, ServiceNow recommends 1,000 records per import. Is it the same for normal imports?

1 ACCEPTED SOLUTION

Pradeep Sharma
ServiceNow Employee

Hello Sudharsan,



Importing very large data sets: this issue occurs when a very large data set is imported in a single job.


Symptoms: The import job takes a long time to complete.


How to avoid this: Break a very large data set into multiple, smaller jobs for faster results. Consider import sets under 100,000 records as a guideline. For example, importing 10 sets of 100,000 records completes faster than one import of 1 million records even though the total data imported is the same.
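The splitting step described above can be done before the files are staged for the data source. Below is a minimal Python sketch that breaks one large CSV into smaller files, each repeating the header row so every piece can be loaded as a standalone import set. The function name, the output naming scheme, and the 100,000-row default are illustrative assumptions, not part of any ServiceNow API:

```python
import csv
import os
from itertools import islice

def split_csv(src_path, out_dir, rows_per_file=100_000):
    """Split a large CSV into smaller files, each with its own header row,
    so they can be loaded as separate, smaller import sets."""
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header for every output file
        part = 0
        while True:
            rows = list(islice(reader, rows_per_file))
            if not rows:
                break
            part += 1
            out_path = os.path.join(out_dir, f"import_part_{part:03d}.csv")
            with open(out_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            parts.append(out_path)
    return parts
```

Each resulting file can then be attached to the data source and imported as its own job, following the under-100,000-records guideline above.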




http://wiki.servicenow.com/index.php?title=Import_Set_Performance_Best_Practices#gsc.tab=0




I hope this answers your question.



5 REPLIES 5

Tanaji Patil
Tera Guru

Hi Sudharsan,



The smaller the import set, the smaller the impact on performance.


It also depends on the traffic on your instance.



ServiceNow recommends 1,000 records per batch, which should not have any noticeable impact.


But if your instance is very busy, you can reduce the batch size to 500.


You should not increase the batch size beyond 1,000.



Thanks,


Tanaji


Hi @PradeepSharma, hi @TanajiPatil, thanks for sharing this info. I have a similar concern. We have large data sets to upload, spread across multiple files (about 1000K records). We fetch the file from a remote server directory path using the MID Server ECC queue, and it is then added to the data source.

1. If we split the data into multiple files (as suggested, 10,000 or 5,000 records per file), how should we handle the sequence?
2. What is the best way to load them without impacting performance?
3. We have to build parent:child relationships from this data. What happens if a child record file is loaded before its parent records are created in the platform?
4. We are trying to avoid any manual intervention: data is stored on the remote server, pulled via the MID Server ECC queue, and attached to the data source. Is using the MID Server a good idea here? Any other option would be great.

Thanks,
Sharad

The wiki link is not accessible: http://wiki.servicenow.com/index.php?title=Import_Set_Performance_Best_Practices#gsc.tab=0

sudharsanv36
Kilo Guru

Thanks, Pradeep & Tanaji.