Best practice to bulk upload over 100,000 records to a table

Vasudhevan2
Tera Contributor

Hi All,

 

I have a requirement to upload more than 100,000 records into the incident table, with the incident number as the unique value, and update another field. Only these two fields need to be on the template. I would like your expert opinion on the best way to achieve this. If an import set is the option, is there a limit I should set on the number of entries per import?


Dr Atul G- LNG
Tera Patron

Hi @Vasudhevan2 

To do this, break the record load into chunks. Don’t try to load too many records at once; aim for batches of 5,000–10,000 records per run.
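The batching idea above can be sketched in a few lines. This is a plain JavaScript illustration (not ServiceNow API code), and the 5,000-record batch size is simply the lower end of the range suggested above:

```javascript
// Split an array of records into fixed-size batches for staged import.
// The batch size of 5,000 follows the 5,000-10,000 guideline above.
function chunkRecords(records, batchSize) {
  if (batchSize <= 0) throw new Error("batchSize must be positive");
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// Example: 12,500 records in 5,000-record batches -> 3 batches of 5000, 5000, 2500.
const demo = Array.from({ length: 12500 }, (_, i) => ({ number: "INC" + i }));
const batches = chunkRecords(demo, 5000);
console.log(batches.length, batches[2].length); // 3 2500
```

Each batch would then become one import set run, so a failed run only needs that batch re-imported rather than the whole file.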

Make sure to:

  • Set up the import set correctly

  • Use a transform map with coalesce configured properly

  • If the Excel file contains any reference data, ensure it is mapped correctly; otherwise the load will fail.

Also, one more thing:
Why do you need to add so many incident records manually? Which environment is this for? Can’t you clone it instead?

*************************************************************************************************************
If my response proves useful, please indicate its helpfulness by selecting "Accept as Solution" and "Helpful." This action benefits both the community and me.

Regards
Dr. Atul G. - Learn N Grow Together
ServiceNow Techno - Functional Trainer
LinkedIn: https://www.linkedin.com/in/dratulgrover
YouTube: https://www.youtube.com/@LearnNGrowTogetherwithAtulG
Topmate: https://topmate.io/atul_grover_lng [ Connect for 1-1 Session]

****************************************************************************************************************

Hi Atul,

 

Thank you for the response.

 

This has to be done in Prod, and this is a new field being introduced. The idea is to populate it and use it for prediction with ML on new records. It is meant to segregate records based on the new field value and also be used for reporting.

Hi @Vasudhevan2 

So, in this case:

  • First, add this field only to open incidents. Add a filter (for example, on the "created" date) so the update doesn’t impact system load.

  • For the second batch, take only incidents in the “resolved” state.

  • For the last batch, take incidents that are closed.

If you can’t get the segments sorted out cleanly, test it in the dev environment first.

Download the data using this template, making sure the fields you need are in the view.

Update the data accordingly. (Note that by default you can’t export more than 10,000 records at once; you’ll need to raise the export limit first, typically via the glide.ui.export.limit property.)

Then upload the Excel sheet again.
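Before re-uploading, it is worth sanity-checking the edited sheet, since a blank or malformed incident number (the coalesce key) would insert a new record instead of updating an existing one. A minimal sketch in plain JavaScript, run outside ServiceNow; the column names "number" and "u_new_field" are placeholders for your actual template columns:

```javascript
// Validate parsed spreadsheet rows before re-importing.
// Each row should have a non-empty, well-formed "number" (the coalesce key)
// and a value for the new field ("u_new_field" is a placeholder name).
function validateRows(rows) {
  const errors = [];
  rows.forEach((row, i) => {
    if (!row.number || !/^INC\d+$/.test(row.number)) {
      errors.push("Row " + (i + 1) + ": missing or malformed incident number");
    }
    if (row.u_new_field === undefined || row.u_new_field === "") {
      errors.push("Row " + (i + 1) + ": new field value is empty");
    }
  });
  return errors;
}

const sample = [
  { number: "INC0010001", u_new_field: "A" },
  { number: "", u_new_field: "B" }, // bad: blank coalesce key
];
console.log(validateRows(sample)); // one error, for row 2
```

Running a check like this on the full file takes seconds and is far cheaper than cleaning up accidental inserts in Prod.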

 



Bhuvan
Kilo Patron

@Vasudhevan2 

 

Go through the post below, where I explained this in detail:

 

https://www.servicenow.com/community/developer-forum/bulk-data-transfer-to-external-system/m-p/33760...

 

At a high level, we imported ~4.5 million records as part of a third-party integration with ServiceNow. We found that an import set could handle ~950,000 records in a single load, after which import sets hung and could not process the data. We split the 4.5 million records into 5 CSV files and imported them in daily batches over 5 days, after office hours, for the initial data load, then established a weekly sync for delta updates via a MID Server.

 

I would recommend splitting it into 2 files and importing those; that should work. Make sure the coalesce field is part of the transform map, and load into an import set (staging) table, not directly into the incident table. Plan this activity during off-hours, as it can take a long time to complete with this many records. Since you have only 2 fields it may take less time, but test it thoroughly in the development environment first, then migrate.
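Splitting the source file in two, as suggested, can be scripted so that each part keeps the header row and can be loaded into the staging table on its own. A sketch in plain JavaScript, independent of ServiceNow; it assumes simple newline-delimited CSV, so adjust for your actual dialect and quoting:

```javascript
// Split CSV text into `parts` files, repeating the header line in each,
// so every part is a complete, independently importable file.
function splitCsv(csvText, parts) {
  const lines = csvText.trim().split("\n");
  const header = lines[0];
  const rows = lines.slice(1);
  const per = Math.ceil(rows.length / parts);
  const files = [];
  for (let i = 0; i < parts; i++) {
    const slice = rows.slice(i * per, (i + 1) * per);
    if (slice.length) files.push([header].concat(slice).join("\n"));
  }
  return files;
}

// Tiny demo with 4 data rows split into 2 files of 2 rows each.
const csv = "number,u_new_field\nINC1,A\nINC2,B\nINC3,C\nINC4,D";
const files = splitCsv(csv, 2);
console.log(files.length); // 2
console.log(files[1]); // number,u_new_field\nINC3,C\nINC4,D
```

For a 100,000-row file this yields two ~50,000-row files, each with the header intact, which matches the two-file import suggested above.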

 

I hope you appreciate the effort put into providing detailed information. As per community guidelines, you can accept more than one answer as an accepted solution. If my response helped guide you or answer your query, please mark it helpful and accept it as a solution.

 

Thanks,

Bhuvan