Transforming a huge number of records using a transform map

thisisauniqueus
Giga Expert

Hi,

I have a requirement where I have to parse a huge number of records from Excel (approx. 50,000). I have loaded the records into the staging table and created a transform map. I just want to know whether there will be any performance implications on the instance. I suspect the transform map may time out during its execution. Your thoughts and comments are welcome.

Best Regards.

4 REPLIES

Steven Young
Tera Guru

We do large imports regularly. Usually we do them after hours, and the people who work those hours don't complain.

We do sometimes have to import a large amount of data (around 30k records), and I don't notice any slowness in the instance.

I would definitely recommend doing large imports after business hours, if at all possible. But like I said, we do some and don't have any issues.


Goran WitchDoc
ServiceNow Employee

I've read that at somewhere around 10K records there might start to be a performance issue. Do you have a dev instance to test on, perhaps? Or is it possible to split the Excel file into five parts?
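Splitting the file before loading it can be scripted outside the instance. A minimal sketch in Python, assuming the sheet is first exported to CSV (the function name and part count are illustrative, not a ServiceNow API):

```python
import csv
import io


def split_csv(source, num_parts):
    """Split CSV text into num_parts smaller CSV strings.

    The header row is repeated in every part so each one can be
    loaded into the import set on its own.
    """
    rows = list(csv.reader(io.StringIO(source)))
    header, data = rows[0], rows[1:]
    chunk_size = -(-len(data) // num_parts)  # ceiling division
    parts = []
    for i in range(0, len(data), chunk_size):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerows([header] + data[i:i + chunk_size])
        parts.append(buf.getvalue())
    return parts
```

Each returned string can be saved as its own file and imported (and transformed) separately, e.g. after business hours.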



/Göran


drjohnchun
Tera Guru

Is this a one-time task or recurring? Even if it's only one-time, I would consider breaking the records into smaller sets. It's not just the number of rows, but also the number of columns and the complexity of the transform maps (scripts, especially lookup scripts, can be slow).



Having said that, I've imported more than 50,000 rows for CI relationships with several columns as a one-time job in the past and didn't run into any issues.



Hope this helps.





John Chun, PhD PMP see John's LinkedIn profile

visit snowaid


ServiceNow Advocate

Winner of November 2016 Members' Choice Award


thisisauniqueus
Giga Expert

Thank you all for your time and valuable input. I will break the Excel file into five parts and run the imports after business hours.



Best Regards,


JS