Inserting multiple rows with a single Import Set API call

Tamás Holics
Tera Contributor

Hello Everybody,

I have a question about using the Import Set API through REST + JSON web service calls. I can successfully use it to import single records into tables (via an import set table -> transform map, etc.).
 
However, when I try to create multiple records with a single Import Set API call using the JSON payload below, the Import Set API creates only one record, with the data from the first element of the JSON array.
 
 
[
  {
    "u_value": "100",
    "u_value_text": "Test 100",
    "u_ddic_ref": "T000-MANDT",
    "u_dynprofld": "S_SCREEN-MANDT",
    "u_short_text": "Client",
    "u_comment": "Comment 1",
    "u_index": "1",
    "u_subscreen_repid": "Z123",
    "u_subscreen_dynnr": "100",
    "u_sap_technical_details": "SAPT0003518"
  },
  {
    "u_value": "200",
    "u_value_text": "Test 200",
    "u_ddic_ref": "T000-MANDT",
    "u_dynprofld": "S_SCREEN-MANDT",
    "u_short_text": "Client",
    "u_comment": "Comment 2",
    "u_index": "2",
    "u_subscreen_repid": "Z123",
    "u_subscreen_dynnr": "100",
    "u_sap_technical_details": "SAPT0003518"
  }
]
 
Do I maybe need to format the JSON differently? Do I have to set something in the Import Set properties or on the Transform Map?
 
In some cases I'd need to send hundreds of records to the same table at once through the Import Set API, and obviously I don't want to issue hundreds of API calls.
 
Thanks in advance!
 
Best regards,
Tamás
 
1 ACCEPTED SOLUTION

Oleg
Mega Sage

Hi Tamás,

The short answer to your original question: you can't use the Import Set API to import more than one record in one REST call. If you examine the Import Set API documentation (see here), you will see that the POST request is a classic REST call that inserts one record into a table. Other similar questions on the community forum (see here, for example) confirm this. So if you need to import multiple records and want (or have) to use the Import Set API to send JSON data, you have to make one separate REST call per record. If you need to import 100 or 1000 records at once, that can be slow (or very slow).
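To illustrate the per-record limitation on the caller's side: an array of N records ends up as N separate POSTs to the Import Set endpoint. A minimal sketch, where the instance URL and staging table name are placeholders, and buildImportCalls only prepares the request descriptors (actually sending them with your HTTP client of choice is the slow part):

```javascript
// Sketch of the caller-side loop: the Import Set API takes one record per
// POST, so an array of N records becomes N calls. Instance URL and staging
// table name below are placeholders.
function buildImportCalls(instanceUrl, stagingTable, rows) {
    return rows.map(function (row) {
        return {
            method: 'POST',
            url: instanceUrl + '/api/now/import/' + stagingTable,
            headers: {
                'Content-Type': 'application/json',
                'Accept': 'application/json'
            },
            body: JSON.stringify(row) // one staging-table record per request
        };
    });
}
```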

In my answer I tried to guess what kind of data you are importing, and whether the Import Set API is really the best choice, or at least a good one, in your case.

It's important to understand the purpose of Import Sets, and their advantages and disadvantages. An Import Set is a good fit when you need to import some kind of information from an external source that may already exist in ServiceNow. For example, you want to import emails or users from your mail server, or import or update user information from Active Directory into ServiceNow. In that case, local users already created in ServiceNow should not be deleted, but their email or other properties should be updated, and new users created in Active Directory should be added. For mapping users you can use a unique property such as User ID (user_name), Employee Number (employee_number), or some other property that is the same in both Active Directory and ServiceNow. So if you need to detect whether an update or an insert is required during import, or need to detect collisions between data from multiple sources, then an Import Set is a very good choice.

On the other hand, the example data you included in your question looks like non-unique data. I suppose you don't want to detect whether similar data already exists in some table and update it instead of inserting; I suppose no data collisions need to be detected on import; I suppose you just need to insert hundreds of records into some table. If so, the best choice would be Scripted REST APIs: create a new REST endpoint to which you can send all the data (the information for all the hundreds of records) in one REST call. In the JavaScript code that runs when the endpoint is called, you can simply call GlideRecord's insert method in a loop to insert each record into the destination table. Inserting hundreds of records this way is much faster than hundreds of REST calls that each insert a single record.
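A minimal sketch of what such a Scripted REST resource script could look like. The table name u_my_import_table, the payload shape, and the response body are assumptions, not an existing API; the insert loop is factored into insertAll so the logic can be exercised with a stub outside an instance:

```javascript
// Sketch of the Scripted REST API resource described above. insertOne stands
// in for the GlideRecord insert; in ServiceNow it would wrap the platform call.
function insertAll(rows, insertOne) {
    var ids = [];
    for (var i = 0; i < rows.length; i++) {
        ids.push(insertOne(rows[i])); // one insert per posted record
    }
    return ids;
}

// Inside the Scripted REST resource script it would be wired up like this:
// (function process(request, response) {
//     var rows = request.body.data;            // the posted JSON array
//     var ids = insertAll(rows, function (row) {
//         var gr = new GlideRecord('u_my_import_table');
//         gr.initialize();
//         for (var field in row) {
//             gr.setValue(field, row[field]);  // copy each posted field
//         }
//         return gr.insert();                  // returns the new sys_id
//     });
//     response.setStatus(201);
//     return { inserted: ids.length };
// })(request, response);
```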

Regards
Oleg


13 REPLIES

Oleg
Mega Sage

Hello Tamás,

I'm not sure why you decided to use the Import Set API. As far as I know, the Import Set API provides only two simple methods, and only the POST method can practically be used, to import ONE record. Import Sets were introduced long ago, mostly for importing data from files, not for use via APIs. Moreover, Import Sets place relatively strict restrictions on the input data. For example, the size of one import set row is limited to 8126 bytes (see here).

To import multiple rows of data, I'd recommend using Scripted REST APIs.

Best regards
Oleg

Tamás Holics
Tera Contributor

Dear Oleg,

 

Thanks for the quick reply. In this case I guess I should write code like the one this blog post proposes: https://servicestartsnow.com/2017/10/15/rest-integration-design-part-ii-the-lifecycle-of-a-post/

 

Actually, I received the following advice about this approach:

"When you create records in import_set_row table and then later query for result, if the transformation is not complete, you might not get the accurate result (this case might be very rare). But with Import Set API being synchronous, the result will be accurate."
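For reference, a synchronous Import Set API POST responds only after the transform map has run, and reports the outcome per record. The shape below follows the documented response format; all names and values here are illustrative, not from an actual instance:

```json
{
  "import_set": "ISET0010001",
  "staging_table": "u_my_import_table",
  "result": [
    {
      "transform_map": "My Transform Map",
      "table": "u_target_table",
      "display_name": "number",
      "display_value": "REC0001001",
      "record_link": "https://instance.service-now.com/api/now/table/u_target_table/abc123",
      "status": "inserted",
      "sys_id": "abc123"
    }
  ]
}
```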

Is there a way to ensure that the transformation is complete when reading the record from the table?

 

Cheers,

Tamás


Dear Oleg,

Many thanks for the detailed answer; it is very helpful. The reason I have to use the import set concept is a company policy at one of our customers: I simply cannot write directly to tables, and I must not use the Table API. I thought I could use the Import Set API to send many records at once, but I now understand that this is not possible.

So now I have to write code in my Scripted REST API that still populates the import set tables, again due to company policy.

Cheers,

Tamás