
curl: (18) transfer closed with outstanding read data remaining

BineshJ
Giga Contributor

When I tried to read data from the table sys_ux_lib_asset, which contains around 13K records, I encountered the following error:

curl: (18) transfer closed with outstanding read data remaining.

 

Command:

 

curl --location \
  'https://subdomain.com/api/now/v2/table/sys_ux_lib_asset?sysparm_offset=1&sysparm_query=ORDERBY+sys_updated_on&sysparm_limit=500' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Basic ****' >> response.json

 

Response:

 

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 15.6M    0 15.6M    0     0   246k      0 --:--:--  0:01:05 --:--:--  254k
curl: (18) transfer closed with outstanding read data remaining

 

 

I suspect the issue is caused by the large amount of data being retrieved from the ServiceNow API.

Is there a way to read data from the table (using the ServiceNow API) without getting this error?

2 Replies

Tony Chatfield1
Kilo Patron

Hi, can you use pagination to query your data in smaller 'chunks'?
I can see sysparm_offset and sysparm_limit in your URL, so all you would need to do is wrap your code in a loop, set the limit to a smaller value, and increase the offset by your 'limit' on each iteration.
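As a sketch of that loop (the instance URL, table name, auth header, and the crude `"sys_id"`-based record count are placeholders/assumptions, not part of the original post):

```shell
#!/usr/bin/env bash
# Sketch of a paginated export. INSTANCE, TABLE, and the Basic auth
# header are placeholders -- substitute your own values.
INSTANCE="https://subdomain.com"
TABLE="sys_ux_lib_asset"

# Fetch pages of $1 records each (default 500) until a short page arrives,
# appending every page to response.json.
fetch_all() {
  local offset=0 limit="${1:-500}" page count
  while : ; do
    page=$(curl --silent --fail --location \
      "$INSTANCE/api/now/v2/table/$TABLE?sysparm_offset=$offset&sysparm_limit=$limit&sysparm_query=ORDERBY+sys_updated_on" \
      --header 'Accept: application/json' \
      --header 'Authorization: Basic ****') || break
    printf '%s\n' "$page" >> response.json
    # Crude record count: occurrences of "sys_id" in the page body.
    count=$(printf '%s' "$page" | grep -o '"sys_id"' | wc -l)
    if (( count < limit )); then
      break                       # short page = last page
    fi
    offset=$(( offset + limit ))
  done
}
```

Each page lands as one JSON document per line in response.json; a tool like jq can merge the result arrays afterwards.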

 

BineshJ
Giga Contributor

Hi,

Thank you for your response.

I am using pagination to query data from the sys_ux_lib_asset table, which comprises about 13K records, with sysparm_limit set to 500. However, I am encountering the issue mentioned previously.

Following your suggestion, I adjusted the sysparm_limit to 50 (smaller chunks) and was able to query the data without any issues.

My goal is to query data from various tables. However, determining the appropriate sysparm_limit for each table can be tedious since it may differ from one table to another.

Is there a way to query data using response size as the limit? I suspect the issue is due to retrieving large amounts of data from the ServiceNow API.

 

Thank you.
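One way to avoid hand-tuning sysparm_limit per table is to adapt it at runtime: start large and halve the page size whenever curl exits with code 18 (transfer closed with outstanding read data). This is only a sketch under assumptions -- the instance URL, table, auth header, and the `"sys_id"`-based record count are placeholders, and it assumes the truncation reproduces consistently for oversized pages:

```shell
#!/usr/bin/env bash
# Adaptive page size: halve sysparm_limit on curl exit code 18 instead of
# picking a per-table limit up front. INSTANCE/TABLE/auth are placeholders.
INSTANCE="https://subdomain.com"
TABLE="sys_ux_lib_asset"

fetch_adaptive() {
  local offset=0 limit="${1:-500}" page rc count
  while (( limit >= 1 )); do
    page=$(curl --silent --location \
      "$INSTANCE/api/now/v2/table/$TABLE?sysparm_offset=$offset&sysparm_limit=$limit" \
      --header 'Accept: application/json' \
      --header 'Authorization: Basic ****') && rc=0 || rc=$?
    if (( rc == 18 )); then
      limit=$(( limit / 2 ))      # response truncated: retry same offset, smaller page
      continue
    elif (( rc != 0 )); then
      return "$rc"                # any other curl failure: give up
    fi
    printf '%s\n' "$page" >> response.json
    count=$(printf '%s' "$page" | grep -o '"sys_id"' | wc -l)
    if (( count < limit )); then
      return 0                    # short page = last page, done
    fi
    offset=$(( offset + limit ))
  done
  return 1                        # limit shrank to zero without a clean page
}
```

Retrying the same offset with a smaller limit means no records are skipped when a page fails; the trade-off is extra round trips on tables with large rows.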