CSV file row size limit

trinadh2
Giga Guru

Hi All,

I have an issue with loading data: we are picking up a CSV file from a shared location. The raw file contains almost 16,000 records, but when we check the import sets, only around 3,000 records are picked up and placed in the staging table. Is there any row limit? I have checked the Import/Export module and couldn't find any limit there.

 

@Ankur Bawiskar, could you please help with this?

 

 

Thanks in advance.

1 ACCEPTED SOLUTION

@trinadh2 

So unless you replicate the issue in a lower instance (test/sub-prod), you cannot debug it.

 

Regards,
Ankur
✨ Certified Technical Architect  ||  ✨ 9x ServiceNow MVP  ||  ✨ ServiceNow Community Leader


5 REPLIES

Mohith Devatte
Tera Sage

Hello @trinadh2, this is because there is a limit on imports, which is controlled by a system property called com.glide.attachment.max_size.

 

A maximum of 1 GB of data can be imported at a time. This property restricts the overall size of the attachment and applies to all formats, including XML, JSON, etc.

The default value is 1024 MB, which is 1 GB.
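
If you want to confirm the current value on your instance, here is a minimal background script sketch (run in Scripts - Background); the property name is the one mentioned above, and 1024 MB is assumed as the fallback default:

// Read the attachment/import size limit (in MB), falling back to the documented default of 1024
var maxSizeMb = gs.getProperty('com.glide.attachment.max_size', '1024');
gs.info('Current import/attachment size limit: ' + maxSizeMb + ' MB');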

Hope this helps 

Mark my answer as correct if it helps you.

Thanks

 

Hi @Mohith Devatte,

Thank you for the response. I believe that property applies to attachments. Would the same property also apply to files that are picked up from a shared location?

Ankur Bawiskar
Tera Patron

@trinadh2 

16,000 records is not that huge; it should work fine.

Try loading the file manually and verify.
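
As a quick check, you could also count how many rows actually landed in the staging table for the latest import set and compare that with the row count in the CSV. A minimal background script sketch (u_csv_import_staging is only a placeholder; use your actual import set table name):

// Placeholder staging table name; replace with your actual import set table
var stagingTable = 'u_csv_import_staging';

// Find the most recent import set created against that table
var impSet = new GlideRecord('sys_import_set');
impSet.addQuery('table_name', stagingTable);
impSet.orderByDesc('sys_created_on');
impSet.setLimit(1);
impSet.query();

if (impSet.next()) {
    // Count the rows that were actually loaded into staging for this import set
    var ga = new GlideAggregate(stagingTable);
    ga.addQuery('sys_import_set', impSet.getUniqueValue());
    ga.addAggregate('COUNT');
    ga.query();
    if (ga.next()) {
        gs.info('Import set ' + impSet.getValue('number') + ' (state: ' + impSet.getValue('state') + ') loaded ' + ga.getAggregate('COUNT') + ' rows');
    }
}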

Regards,
Ankur
✨ Certified Technical Architect  ||  ✨ 9x ServiceNow MVP  ||  ✨ ServiceNow Community Leader

@Ankur Bawiskar 

We tried loading the data in our dev instance, and we were also able to execute the scheduled job; the file got picked up both manually and automatically, so it is working fine in dev. But we are seeing this issue in the production instance: even when we execute the scheduled job manually by clicking Execute Now, it does not pick up all the records.
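
For reference, one check we still plan to run in production is whether the file retrieved from the shared location is already truncated before the transform runs, i.e. whether the CSV attached to the data source is smaller than the original file. A rough background script sketch (the data source name 'CSV Shared Location' is only a placeholder):

// Placeholder data source name; replace with the actual data source record name
var ds = new GlideRecord('sys_data_source');
ds.addQuery('name', 'CSV Shared Location');
ds.query();

if (ds.next()) {
    // Look at the most recent CSV file attached to the data source after retrieval
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_name', 'sys_data_source');
    att.addQuery('table_sys_id', ds.getUniqueValue());
    att.orderByDesc('sys_created_on');
    att.setLimit(1);
    att.query();
    if (att.next()) {
        gs.info('Latest retrieved file: ' + att.getValue('file_name') + ', size: ' + att.getValue('size_bytes') + ' bytes');
    }
}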