
01-13-2022 04:30 AM - edited 03-04-2024 03:36 AM
Back Story
I’m often asked to enable technical users and engineers in other IT departments to integrate with the ServiceNow platform easily, without needing a complex project piece to design the integration. Most recently, this was with Microsoft Power Automate, where a pre-bundled connector existed and just required some credentials. With my security-aware hat on, I wanted to do some digging into what this connector was doing before providing any access, as a bit of the due diligence any developer should do before integrating.
Simply enough, this shipped connector was going to use the Table API, one we’re all familiar with and have likely provided access to before. I consider the Table API a bit of a slapdash approach, as there is no way for me to (easily) validate the data being inserted or updated via the API, and an unnerving amount of trust is placed in the integrated application to handle data correctness, error handling and so on.
Now you’re probably thinking, “but Kieran, you can just create a Scripted REST API (SRAPI) to deal with data validation, manipulation and upsertion into a table”. Yes, you can, and I have done that exact thing countless times... far too many times. Every time you create a SRAPI, you have to account for the customisation and the support needed for that custom endpoint. Make it too generalised and it won’t be functional for every person’s needs; make an endpoint per application/user and you quickly end up with 50+ SRAPIs doing similar things.
What is the Import Set API?
The Import Set API allows you to push data into an import set table and transform it either asynchronously or synchronously. The API will then either return the results or a sys_id for you to check the results in a separate call.
Import Set Table
To leverage the Import Set API, you need an import set table in place to push data to. How you create this table is up to you as there are a few options:
- Manually (as below) - Provides the benefit of specifying the field names
- Load Data - quick and easy if you have the key:value pair format in a spreadsheet already
- Web Service - Use the SOAP inbound web service creator to auto-create a related import set table
The above is in a scoped application where I have manually created a table that extends the import set row table. Using a scoped app allows me to use simple column names rather than ones with a u_ prefix, which makes life a bit easier by allowing more human-readable payloads later on.
If you want to use a different value as the key for the import field, you can add the import_attribute_name attribute to the dictionary entry. For example, instead of a field being u_email, you can use import_attribute_name=email to effectively rename it for import purposes.
For this example, the transform map for the table is kept super simple for demonstration purposes, but you can still leverage transform scripts with no issue. For example, your external system might not be able to present a sys_id value, but a field-level script can do the necessary GlideRecord lookup based on the inbound data.
Import Set API Access
The web services user account will need the following permissions to successfully use the Import Set API:
- import_transformer
- snc_platform_rest_api_access - Only needed if you have strict REST API security enabled. I recommend this!
- Access to the import table if you have specific write ACLs (Otherwise access is inherited from the sys_import_set_row table which uses the import_transformer role)
Single Record Import
To insert and transform a single record, a POST request with either a JSON or XML body can be sent to /api/now/import/{staging_table_name} and the data load will occur synchronously.
Endpoint: /api/now/import/x_295070_powerauto_incident
Payload:
{
    "short_description": "Disk Failure Occurred",
    "description": "Disk replacement required for server 123",
    "impact": 2,
    "urgency": 2,
    "contact_email": "mary.smith@example.com"
}
Response:
{
    "import_set": "ISET0010572",
    "staging_table": "x_295070_powerauto_incident",
    "result": [
        {
            "transform_map": "Incident Staging IMP TTM",
            "table": "incident",
            "display_name": "number",
            "display_value": "INC0011139",
            "record_link": "https://dev52040.service-now.com/api/now/table/incident/a9cdc1e72ff089106d30206df699b675",
            "status": "inserted",
            "sys_id": "a9cdc1e72ff089106d30206df699b675"
        }
    ]
}
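For client-side callers, this is just a plain JSON POST. Below is a minimal Python sketch of how an integrating system might construct the call; the instance URL and authentication handling are placeholders rather than anything prescribed by the platform.

```python
import json

def build_import_request(instance, staging_table, record):
    """Construct the URL, headers and JSON body for a single-record
    Import Set API POST. Authentication (e.g. basic auth with the
    web services account) is left to whatever HTTP client you use."""
    url = f"{instance}/api/now/import/{staging_table}"
    headers = {"Content-Type": "application/json", "Accept": "application/json"}
    body = json.dumps(record)
    return url, headers, body

# Hypothetical instance, with the staging table from the example above.
url, headers, body = build_import_request(
    "https://dev52040.service-now.com",
    "x_295070_powerauto_incident",
    {"short_description": "Disk Failure Occurred", "impact": 2, "urgency": 2},
)
```

Any HTTP client (requests, urllib, a Power Automate HTTP action) can then send the POST with these values.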
Multiple Record Import
With the Quebec release, the Import Set API got a bit of an upgrade: the ability to insert multiple records in a single request. This is done by adding /insertMultiple onto the URI. The other change is that the data will be transformed asynchronously.
Endpoint: /api/now/import/x_295070_powerauto_incident/insertMultiple
Payload:
{
    "records": [
        {
            "short_description": "Disk Failure Occurred",
            "description": "Disk replacement required for server 123",
            "impact": 2,
            "urgency": 2,
            "contact_email": "mary.smith@example.com"
        },
        {
            "short_description": "Disk Failure Occurred",
            "description": "Disk replacement required for server 856",
            "impact": 2,
            "urgency": 2,
            "contact_email": "sam.collins@example.com"
        }
    ]
}
Response:
{
    "import_set_id": "7b2399272f3001106d30206df699b68c",
    "multi_import_set_id": "0c3399272f3001106d30206df699b68d"
}
You’ll note we get a much simpler body here. If you want the /insertMultiple behaviour to be synchronous, you’ll need to create a record in the sys_rest_insert_multiple table and set the mode to synchronous. Note that this could result in long-running API calls and a knock-on performance impact. A better option is to use the GET Import Set API operation to get the result for that row, using the import_set_id field as the query parameter. Two quick API calls are more performant than a single call that holds a transaction for a long time. Remember, your instance can only handle so many API calls at a time, so the longer an API call runs, the higher the chance of hitting that concurrency limit.
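Given that concurrency point, it’s worth batching large data sets on the client side rather than sending one enormous payload. A small Python sketch of that idea (the batch size here is an illustrative choice, not a platform limit):

```python
def chunk_records(records, batch_size=100):
    """Split a record list into insertMultiple-sized batches."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

def build_insert_multiple_body(batch):
    """Wrap a batch in the {"records": [...]} envelope that the
    /insertMultiple endpoint expects."""
    return {"records": batch}

# 250 records at a batch size of 100 -> 3 POST bodies
batches = chunk_records([{"short_description": f"Alert {n}"} for n in range(250)])
```

Each batch then becomes one /insertMultiple POST, keeping individual calls short.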
Custom Response Message
One of the arguments for creating a SRAPI is the ability to set error messages and custom response payloads. The Import Set API takes care of this too, allowing you to extend the response per row.
To do this, use an onComplete transform script and set either status_message, error_message, or a custom value on the response object. The script below demonstrates using these variables:
(function runTransformScript(source, map, log, target /*undefined onStart*/) {
    status_message = "Imported Row Successfully";
    response.contact_sys_id = target.getValue('caller_id');
    response.category = target.getDisplayValue('category');
})(source, map, log, target);
Now the response looks like this:
{
    "import_set": "ISET0010574",
    "staging_table": "x_295070_powerauto_incident",
    "result": [
        {
            "transform_map": "Incident Staging IMP TTM",
            "table": "incident",
            "display_name": "number",
            "display_value": "INC0011144",
            "record_link": "https://dev52040.service-now.com/api/now/table/incident/18c815a32f7001106d30206df699b642",
            "status": "inserted",
            "sys_id": "18c815a32f7001106d30206df699b642",
            "status_message": "Imported Row Successfully", //Added by script
            "contact_sys_id": "8d7d05a72ff089106d30206df699b6b8", //Added by script
            "category": "Inquiry / Help" //Added by script
        }
    ]
}
The error_message variable can only be used in an onBefore script, as part of aborting the current row’s import:
(function runTransformScript(source, map, log, target /*undefined onStart*/) {
    error = true;
    error_message = "No Assignment Group";
})(source, map, log, target);
{
    "import_set": "ISET0010574",
    "staging_table": "x_295070_powerauto_incident",
    "result": [
        {
            "transform_map": "Incident Staging IMP TTM",
            "table": "incident",
            "status": "error",
            "error_message": "No Assignment Group; Target record not found"
        }
    ]
}
Import Set API Security - What I do
The following are the security measures I like to take when using the Import Set API to reduce any potential risk:
- Add a custom write ACL to the created import set table to restrict access to only specific users. Without this, anyone with the import_transformer role could write to any of the import tables, which might not be desired if you’re triggering a sensitive activity on one of the tables or providing third-party access.
- Add a rate limit rule for each table and user to reduce the potential for floods of records. This saves needing to do any checks in an onBefore script and returns an HTTP 429 error to the caller.
- Where possible, validate the data input in onBefore script(s) before allowing any inserts or updates.
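If you do add rate limit rules, it’s worth telling integrating teams to honour the HTTP 429 response rather than hammering the instance. A minimal sketch of an exponential backoff schedule a caller might use (the numbers are illustrative, not a platform recommendation):

```python
def backoff_delays(max_retries=5, base=1.0, cap=30.0):
    """Delays (in seconds) to wait between retries after an HTTP 429,
    doubling each attempt and capped to avoid unbounded waits."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(max_retries)]
```

A caller would sleep for delays[n] before retry n, giving the rate limit window time to reset.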
Takeaways
I personally love this API, even more so as of Quebec with the insert-multiple ability. It’s quick, secure and highly scalable, and it allows a lot of worries to be easily mitigated. It also allows for a fairly low-code approach, which is beneficial to newer ServiceNow professionals.
This API is my default go-to when architecting an integration that requires data inserts. It also works with robust transform maps which makes the ordeal of dealing with CI data via APIs a little bit sweeter.
If you have any questions or use cases, drop them in the comments below!
If you found this article useful, I’d appreciate a thumbs up.
Great article! The next time I’m in the situation of building an inbound integration, I will give this approach a try.
Just one hint: Maybe you should link "Add a rate limit rule" with the respective documentation page, as I'm sure that not everyone is familiar with that important feature.

Good shout - added that info in 🙂
Fantastic article! I love how you've addressed security, performance, data validation and tech debt at the front of your design process instead of an afterthought. That is the mark of a true professional.

Thank you Matthew! Means a lot coming from Mr Performance himself 🙂
Great article, but I have some remarks regarding "Multiple Record Import":
1) In your example you seem to use the column name. Unfortunately, by default it won’t work with the column name, only the column label. You need to explicitly create a configuration to tell ServiceNow to use the column name for multi insert! This is a common pitfall, as who would expect it to suddenly map based on the column label? Import Set single insert uses the name, and that is what I would expect as good practice, since labels can always change. You can read here how to tell the API to use the column name:
- Create record in "sys_rest_insert_multiple", select your Staging Table as Source Table. Submit
- In the related list "Column Mapping", click "New". In the field "Column mapping", select "Column name"
2) Your proposal "A better option is to use the GET Import Set API operation to get the result for that row using the import_set_id field as the query parameter." does not seem to work. This is because the GET Import Set API expects the sys_id of the import set row, while insertMultiple returns the sys_id of the import set itself. Is there another out-of-box API that can quickly get the status of the import set? Instead of needing to poll, integration-wise I’m thinking a webhook might be better here, meaning that when the import set has completed, a REST message is sent to the requester with all the updates?
3) When I change from asynchronous to synchronous, the REST API response does not seem to contain messages regarding created records etc. Is there a way to enable that, so the response is similar to the single-insert Import Set API?
Hi Kieran,
I have read your interesting post and I am wondering if you can help me with a question that I just posted on the ServiceNow forum?
URL to post on forum
Hope you have some time to give me some insights how you would handle this use case?
Kind regards,
Kenneth

Something that was not obvious to me, is after you have defined your table which derives from "sys_import_set_row", AND you have defined a Transform Map which links that table to your target table, AND you have done "auto map matching fields" or defined field maps manually... THEN when the API is utilized, the Transform Map kicks in automatically and stuffs a record into your final target table. I was thinking I would need to run the transform manually or configure or write something extra to hook that up, but it happens automatically. Thanks.
Additionally, when I tried to follow your example, and insert multiple records (ROME version), it wasn't working, and I had a heck of a time figuring out why. In reviewing the documentation here I noticed this:
"You can modify mapping settings by adding an entry in the Rest Insert Multiples [sys_rest_insert_multiple] table and changing the Column mapping from Label to Column name."
Very strange IMO that this is necessary, but I was able to get that working as follows:
Hi Kieran,
I am using the Import Set API Insert Multiple to insert 1 lakh (100,000) records into the staging table from a third-party tool, but out of 1 lakh only 50k import set rows are getting processed; the others are stuck at pending, and the import set is also stuck in the pending state.
Can you please help me why this is happening and how can I fix this?
Thanks
Sandeep

Hi Sandeep,
as part of your POST calls, are you specifying the multi_import_set_id on subsequent calls?
Hi Kieran,
No, I am not specifying multi_import_set_id. I checked, and it created 40 multi import sets for 1 lakh records. Can you please help with how I can specify it in my API call?
Thanks
Sandeep

Hi Sandeep,
Import Set API | ServiceNow Developers - it's relatively easy and just requires adding a query parameter, "multi_import_set_id". When you make your first /insertMultiple call, you'll receive a payload as below. On any following calls, simply modify the URI to include the query parameter with that sys_id:
{
    "import_set_id": "<import_set_sys_id>",
    "multi_import_set_id": "<multi_import_set_sys_id>"
}
/api/now/import/{tableName}/insertMultiple?multi_import_set_id={sys_id_from_payload}
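In other words, the first call omits the parameter and every later batch reuses the returned sys_id. A small Python sketch of that URI construction (the instance and table names are placeholders):

```python
def build_insert_multiple_url(instance, staging_table, multi_import_set_id=None):
    """Build the insertMultiple URI; pass the multi_import_set_id
    returned by the first call on all follow-up batches so they
    are grouped under the same multi import set."""
    url = f"{instance}/api/now/import/{staging_table}/insertMultiple"
    if multi_import_set_id:
        url += f"?multi_import_set_id={multi_import_set_id}"
    return url
```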
Hi Kieran,
Thanks I will try and let you know if it works.
Regards
Sandeep
Something I find useful is if you use a Data policy at the Import set table level, you can make fields mandatory on the API without needing to do any scripting. The system will automatically generate an error response too.
Good article. I am using the Import Set REST API to load 597,704 records for all our AD groups. I do it in 10,000-record chunks in order to avoid data size limits and timeouts.
The other trick is to use ETCL (rather than ETL). The C stands for 'Compare'. If you compare the records, then you need only load the deltas (IUDs): Inserts, Updates and Deletes. I do all of this using a PowerShell script. This externalizes all the processing, takes the performance hit away from ServiceNow, and allows one to load only a few hundred records rather than half a million. The comparison is all done in memory using hash tables, for fast lookups.
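The 'Compare' step described above can be sketched in a few lines. This version is Python rather than PowerShell, purely to illustrate the hash-table (dict) lookup idea, with records keyed by whatever natural key the data has:

```python
def compute_deltas(source, target):
    """Compare source and target records (dicts keyed by a natural key)
    and return the inserts, updates and deletes to send to ServiceNow."""
    inserts = {k: v for k, v in source.items() if k not in target}
    updates = {k: v for k, v in source.items() if k in target and v != target[k]}
    deletes = {k: v for k, v in target.items() if k not in source}
    return inserts, updates, deletes
```

Only the three delta sets then need to be pushed through the Import Set API, instead of the full extract.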
I can now load these half a million records in 30 minutes, with 95% of the processing done externally.
Hello Experts.
Using the San Diego release, I tried to use the Multiple Record Import and it did not work for me.
The problem is not so much that it did not work, but that the API replied that it did.
Actually, the import set was created and the transform map triggered, but as you showed in the description, all I got was a 201 Created return code, with the two lines relating to the import sets:
{ "import_set_id": "<import_set_sys_id>", "multi_import_set_id": "<multi_import_set_sys_id>" }
Now, that is obviously not good.
I also tried your trick of using an onComplete script in the Transform map to return some kind of status message, but that did not work either.
So what should we understand here?
Because using this Multiple Record Import is not so good if we cannot get formal confirmation that all records were processed as expected. Right ?
Thanks for any feedback you can provide.

@Kieran Anson I have the same questions as @Niclas and @ytrottier. This looks promising but as you said the reason for using SRAPI was the ability to customize responses.
Creating an onComplete does not seem to work when you use insertMultiple, and having no way of informing my sender whether the records were successfully inserted/updated or whatever is not good. Have you found a way to do this for insertMultiple? Do you also know where the response body for it is located? Sending the sys_id is useless for them, so I would like to modify it.

@ytrottier the response received is the correct one, the reason being that the insertMultiple endpoint is asynchronous, and therefore the results for each row have not yet been actioned when the response is generated.
With the response, you can use the sys_id to query the information from the import set table. This isn't perfect, and if you have an enterprise account I'd ask you to upvote the idea to add an additional endpoint to poll and receive all the results for insertMultiple as an array.

@Kieran Anson Correct, this is also what I ended up doing: the Table API, querying the fields I wanted to give them feedback on.
Hi Kieran.
I took a look at the idea you referenced, and I can see that for now (until a better solution is available):
Currently, to get the results of an /insertMultiple async request, you have to use the Table API to get the import set row IDs, and then use the existing GET.
However, looking at the data structure for import sets, it seems to be a bit convoluted to actually get the expected useful results from a multi record import set.
Can you, or anyone else who has done it, share the actual logic to retrieve useful status information following a multi record import?
At a minimum, we need to know if it was completely successful (or not).
And then, if not, it would be very nice to be able to retrieve a summary of all the different error messages.
Does anybody have that?

@yt ccq You have to manually build a query for it (bear in mind that I set it to synchronous instead of async for my table). What I did was something along these lines:
/api/now/table/sys_import_set_row?sysparm_query=sys_import_set={sys_id of import set from payload}%5Esys_transform_map%3D{sys_id of transform map you want to filter by}&sysparm_display_value=true&sysparm_exclude_reference_link=true&sysparm_fields=sys_import_state_comment%2Csys_import_set%2Csys_import_state using both rest api explorer and manually typing some stuff. Giving me this when I query the import_set_id from the payload, I also added a status_message in an onAfter transform script so it gives them some info in sys_import_state_comment:
Thank you for the details.
I would submit however that having to manage this feedback manually, after the fact, is really not ideal.
The ideal scenario would be as follows:
Given the import is defined as synchronous (see below), we would need to define within the transform map a post-processing script that would provide 2 feedback objects with the following summary information:
- The count of all Transform History (sys_import_set_run) import State (Total, Inserts, Updates, Processed, Ignored, Skipped, Errors)
- The count of all error and warning Import Set Rows (sys_import_set_row) State and Comments
Normally, the 1st summary would show that all is good, and the 2nd one would be empty.
Otherwise, the combination of both of those summaries would provide a quite clear picture of what went wrong.
For synchronous imports (from the Import set API doc):
Transformation is asynchronous by default. To set synchronous transformation, create a new record in the Rest Insert Multiples [sys_rest_insert_multiple] table, select the source table, and set the transformation to synchronous.
It is not really clear, however, what the two options "Validate request" and "Use data source format" really mean.
Hopefully, all I mentioned here is actually achievable in a simple and efficient manner.
And in my mind, that should be the default behavior for multi record imports.
Otherwise, one would be just as well to go back to doing a Scripted REST API and fully manage the entire import process.
I have opened a case with ServiceNow, and at this point the support agent seems to say that it is impossible to use an onComplete transform script to inject status and feedback into a synchronous Import Set insertMultiple API call.
That does not seem right.
I have asked him to double check.

Actually, onAfter is a per-record script execution.
If you have thousands of records to insert, that is certainly too much processing and too much detail.
onComplete is executed at the end of the import process, where you can provide global feedback.
That is much more effective.
In any case, both should work.
- Mark as Read
- Mark as New
- Bookmark
- Permalink
- Report Inappropriate Content
We are finally at the end of our case about this subject, and here is the case summary:
What I understand from your replies is that you have verified with the development team and there is no way OOTB of managing the insertMultiple Import set API response to include any meaningful feedback about the import itself.
The 2 documented script variables that are expected to write to the API call SOAP response (status_message and error_message - from https://docs.servicenow.com/bundle/sandiego-platform-administration/page/script/server-scripting/ref...) cannot be used in combination with the onComplete script to get the meaningful feedback we are looking for.
That is obviously a significant gap in the usability of that API.
It forces everyone to make multiple API calls to first do the import and then go get the details of the execution.
That said, here are the detailed steps of the workaround that is required for the insertMultiple Import Set API, since meaningful feedback is not available OOTB (and again, certainly, it should be):
1- The insertMultiple Import set API is asynchronous by default and does not provide any meaningful feedback.
To get the expected results at the end of the import, set a synchronous transformation by creating a new record in the Rest Insert Multiples [sys_rest_insert_multiple] table, selecting the source table (import set table), and setting the transformation to synchronous.
Also create, within the Rest Insert Multiples record, a related Column mapping where the Column mapping is Column name (the default is Label).
2- Use the insertMultiple Import set API (POST /now/import/{stagingTableName}/insertMultiple) to insert records into the target table.
Using the JSON data array provided within the insertMultiple Import set API call (see the doc for details : https://docs.servicenow.com/bundle/sandiego-application-development/page/integrate/inbound-rest/conc...), SN will write to the import set table, and run the related transformation to save the records into the target table (2 reads and 2 writes per record).
3- With the import defined as synchronous, it will only complete at the end of the import, even if 1000s of records are to be loaded from the JSON data array.
On completion, the actual result of the import is available in the Transform History table (sys_import_set_run), using the Table API, where the Set field = the returned import_set_id from the insertMultiple Import set API call.
(GET https://xxxxxx.service-now.com/api/now/table/sys_import_set_run?set=[import_set_id from the insertMultiple Import set API call])
If you want to try it out using the REST API Explorer Table API, you will need to update the glide.ui.permitted_tables property by adding ",sys_import_set_run,sys_import_set_row_error" to its table list, since by default system tables (tables beginning with "sys_") are not reportable or accessible via the REST API Explorer.
4- From the Transform History table response, looking at the [total, inserts, updates, processed, ignored, skipped, and errors] fields, logic will decide if some errors occurred. And if there are, we can get the errors details from the Import Set Row Errors (sys_import_set_row_error) table, using the Table API, where the Run history field = the returned sys_id from the previous Transform History table GET call.
(GET https://xxxxxx.service-now.com/api/now/table/sys_import_set_row_error?sysparm_display_value=true&sys... from the previous Transform History table GET call])
Another option is to always do the Import Set Row Errors (sys_import_set_row_error) Table API call, and if the response X-Total-Count (result row count) is 0, then, there is no error.
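Steps 3 and 4 above boil down to two Table API GETs. A Python sketch of how a client might build them (the instance URL and exact query encoding are illustrative; the table and field names come from the steps above):

```python
def build_status_queries(instance, import_set_id, run_sys_id):
    """Build the Table API GET URLs from the workaround: the transform
    summary from sys_import_set_run (filtered by the import_set_id
    returned by insertMultiple) and the row errors from
    sys_import_set_row_error (filtered by the Run history sys_id
    obtained from the first GET)."""
    run_url = (f"{instance}/api/now/table/sys_import_set_run"
               f"?sysparm_query=set%3D{import_set_id}")
    error_url = (f"{instance}/api/now/table/sys_import_set_row_error"
                 f"?sysparm_query=run_history%3D{run_sys_id}")
    return run_url, error_url
```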
Wow. That is quite an involved process.
Clearly that should all be OOTB.
There is an Enhancement Request in the Idea Portal about it : https://support.servicenow.com/ideas?id=view_idea&sysparm_idea_id=6aa1a464db225d5039445ac2ca961940&s...
Feel free to go there and vote for it.

Hey Kieran, great article!
There's one thing that confuses me and maybe someone here can clarify to me. I understand all of the benefits of using the import set API but the issue I ALWAYS have is, the source sending the data to ServiceNow NEVER includes the "u_" prefix on the JSON properties in the payload. Which makes sense, why would it? With this said, the import will not be able to map the incoming JSON to the import table columns because they all start with "u_" in the table.
What do you do to work around this? I understand in your example you are in a custom scope and so your import table columns do not start with "u_". This is great but what about in the global scope? Is the recommended best practice to create import tables in custom scopes?
I am having issues with this. I have it set up exactly as described here, with just a single-record JSON payload, and it will not load any data into the import set. The transform therefore fails, as it cannot find the coalesce field value. All syntax looks good, and it does create the set and give a response, just no data. I saw one other person with a similar issue in the community, but they never got an answer.
@ytrottier Thanks for the great explanation. Any updates from SN regarding the Import Set GET API and all the other points that you have noted?
Great article!
You wrote "It also works with robust transform maps which makes the ordeal of dealing with CI data via APIs a little bit sweeter.".
Unfortunately, I could not find any further information on this, and I am unable to use a created robust transform map after using the import set api. Do you have any further information on this?
Thanks for detailing all that. I have a question: how does one restrict access to the insertMultiple ability and only allow one record at a time? It would seem that adding a segment to the URL opens up a security concern if the integration is only meant to insert/update one record per API request call.

@scottl if there isn't a record in sys_rest_insert_multiple, then the insertMultiple endpoint won't process the data being sent. Although only allowing one record update at a time would put stress on the instance's resources.
@Kieran Anson et al,
Great article! I have been using the Table API for integration with external systems since 2019 for various integrations, the last one being able to create incidents. I chose not to open up the Incident Table API; instead I created an import set table to receive data from any source and validate it to ensure data completeness, accuracy and routing (it can be debated, but that was my approach).
However, I have now been challenged to take this incident creation via import set to another level, where attachments can be sent and processed and eventually attached to the incident for teams to look at, so that any logs, screenshots etc. from calling systems can be made available during incident creation.
Any thoughts/suggestions on how to approach this without the calling system having access to the sys_attachment API?

Great article @Kieran Anson! I am wary of writing SRAPIs and I don't want to take a risk with the Table API; the Import Set API provides the level of control I want to have. However, my only problem with this API is that it always returns a 201 response code. Consumers will always need to rely on a response body attribute where I set the success/fail code. (I know 201 also makes sense, since the record was indeed created in the import set table.)
Did anyone figure out if we could set / change response http code?
Hi @Kieran Anson
this is my response
{
    "import_set": "ISET91517511",
    "staging_table": "u_ibm_rts",
    "result": [
        {
            "transform_map": "IBM RTs",
            "table": "sn_vul_vulnerability",
            "display_name": "number",
            "display_value": "VUL0013067",
            "record_link": "",
            "status": "ignored",
            "sys_id": "8ac9c430873cde10126fb91acebb35ec",
            "status_message": "Row transform ignored by onBefore script",
            "comment": "8ac9c430873cde10126fb91acebb35ecUpdate Successful. Status is Ignored due to SNOW behavior"
        }
    ]
}
and I want to remove the fields below from the response. How can I do that?
"display_name": "",
"display_value": "",
"record_link": ""

Hi @Ambati Shiva Ch , that content is part of the standard output for the Import Set API, I doubt you can remove it.
I need to customize the response for Insert Multiple records to give the incident number in the response.
How can I achieve this? Please help!

@anuannap123 insertMultiple has a fixed response of the import_set_id and multi_import_set_id - an external system would need to "poll" and query the import set table for the state.
If you're needing custom responses, you're likely going to need a SRAPI