How to manually create a Staging Table for the Import Set API with a non-custom Target Table?

Dave Oshinsky
Tera Expert

I have a JSON file that was exported from a non-custom table, like "incident" (which is non-extendable).

 

A staging table for the Import Set API must extend "Import Set Row". When I manually try to create a Staging Table, the Web UI forces every field name to start with "u_". Therefore, I'm unable to create a Staging Table with the same field names as the Target Table "incident".

 

If the JSON were instead from an extendable table like "task", that still would not help here. The Staging Table must extend "Import Set Row", which means it cannot also extend "task" (which would have avoided manually creating fields, since the inherited field names would not start with "u_").

 

"Load Data" can automatically create a Staging Table. However, "Load Data" cannot import from a JSON file (it says "Only files that can be uploaded are file [sic] in CSV or Excel file in .xls or .xlsx formats").  So, "Load Data" cannot be used to load the data (automatically creating a Staging Table suitable for the non-custom Target Table).

 

I'm assuming that the Staging Table must have all of the fields of the Target Table, so that the JSON (produced from a table with the same fields as the Target Table) can be imported into the Staging Table first, before the rows are transformed into rows for the Target Table.

 

What is the best way to create a Staging Table for import to a non-custom table like "incident" or "task"? "Load Data" does not seem to do the job when the data to be loaded is in a JSON file. Creating a Data Source does not offer the option of creating the Staging Table, nor does it ask for key information (like the name of the Target Table) that could be used to determine the fields needed in the Staging Table.

 

I'm really looking for a way to run the Import Set API from a JSON file (insertMultiple), including setting up prerequisites like the Staging Table. There appears to be no REST API to set up these prerequisites, so I'm trying to understand how to do it manually. Even manually, I don't see how to do it for a non-custom Target Table (since ServiceNow forces all field names to start with "u_", which does not match JSON exported from a non-custom table like "incident"). I have other posts asking about this (with no satisfactory answer), so I'm backing off here to asking about the manual process using the ServiceNow Web UI.
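
For reference, the documented call I'm ultimately trying to reach looks like the sketch below. The staging table name u_incident_staging is hypothetical, and insertMultiple availability depends on your release:

POST /api/now/import/u_incident_staging/insertMultiple
Content-Type: application/json

{
  "records": [
    { "u_sys_id": "46d44a5dc0a8010e0000b1e786d8c3ab", "u_severity": "2" },
    { "u_sys_id": "9c573169c611228700193229fff72400", "u_severity": "1" }
  ]
}

The endpoint is documented; it's everything before this step (creating the staging table with those u_ fields) that I can't find an API for.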

 

Am I missing something fundamental here?


Tony Chatfield1
Kilo Patron

Hi, the u_ prefix for user-created fields is expected behavior when creating any field in an instance, unless the field is created in a scoped app. In my experience, though, scoped-app imports are easily confused: where a field was not created at table creation, an import will ignore the field if you did not manually prefix it with u_, and will instead create another field with the u_ prefix the first time data is loaded (at least, that is my experience from manual file uploads).


Given your scenario, I would recommend using a Scripted REST API: you post your message to the Scripted REST API endpoint, and in the script you map your payload field names to your staging table's u_ field names, as in the sketch below.
Scripted REST APIs (servicenow.com)
Your process then being: payload <> scripted REST endpoint <> temp import table <> transform map <> target table.
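
A minimal sketch of that resource script, assuming a POST resource and a hypothetical staging table named u_incident_staging (adjust both to your instance):

// Scripted REST API resource script (POST). Maps incoming field names
// to their u_-prefixed staging table equivalents and inserts one
// import set row per record.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var payload = request.body.data;   // deserialized JSON body
    var records = Array.isArray(payload) ? payload : (payload.records || [payload]);
    var inserted = [];

    for (var i = 0; i < records.length; i++) {
        var rec = records[i];
        var row = new GlideRecord('u_incident_staging'); // hypothetical staging table
        row.initialize();
        for (var field in rec) {
            var staged = 'u_' + field;                   // e.g. sys_id -> u_sys_id
            if (row.isValidField(staged)) {
                row.setValue(staged, rec[field]);
            }
        }
        inserted.push(row.insert());
    }

    response.setStatus(201);
    response.setBody({ inserted: inserted.length });
})(request, response);

One caveat worth hedging: rows inserted via GlideRecord simply sit in the import set table until a transform map is run against them, whereas posting through the OOB Import Set API endpoint inserts and transforms in one call.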

The other benefit of this process is that you load data into the temp import table with a minimum of interaction/data manipulation, end the integration there, and then process the source to the target asynchronously, which delivers a more reliable integration - especially at times of high message volume.

 

Let's take "incident" as an example Target Table, and focus on two fields: "sys_id" and "severity". The Scripted REST API would map "sys_id" and "severity" in the input JSON on the fly to "u_sys_id" and "u_severity" for insertion into a Staging Table that has "u_sys_id" and "u_severity" as fields. Now the data originally from "incident" is present in the Staging Table, with all field names having "u_" prepended.

 

I don't see an Import Set API variant that can now be used to move the data from the Staging Table to the Target Table ("incident" in this case). How would this be done?

 

A more direct approach would be to write a Scripted REST API that takes the original "incident" JSON block by block and inserts it directly into the Target Table "incident". It's not clear to me how this would use server-side APIs to write directly into "incident". Could such a "direct insert" Scripted REST API be implemented, and what server-side primitives are available for such a "direct insert" to the Target Table?

Hi, I agree a more direct approach would be to push from the Scripted REST API directly to the target record, but as a best-practice approach I avoid direct updates of target records via API: you have limited control over the data, and running complex scripts against inbound API messages can degrade the integration, as each message's processing time is increased (as opposed to a simple insert-and-done action). This does result in double handling/double processing, but it should deliver a more robust and scalable solution, i.e. greatly reduce the chances of undelivered payloads as a result of the target platform being tied up processing payloads. As a by-product, it also provides an excellent audit trail of received messages versus target table results.

Regarding your staging table process question: this would be delivered via the OOB import/transform map process, with any data integrity and mapping handled by a before transform script; all the Scripted REST API is effectively doing is aligning the field names between the inbound message and the u_ field names of the temp data import table. This may not be exactly what Scripted REST APIs were intended for, but in my experience it is a better result/solution.
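
As a sketch of that before transform script (hedged: this assumes the declarative field maps already pair u_severity -> severity and u_sys_id -> sys_id, reusing the field names from the earlier example):

// onBefore transform script on the staging table's transform map.
// Field maps handle the straight renames; this script only enforces
// data integrity before each row is written to the target.
(function runTransformScript(source, map, log, target /*undefined onStart*/) {

    // Skip rows with no usable severity rather than writing bad data.
    if (source.u_severity.nil()) {
        ignore = true;  // built-in transform variable: skip this source row
        log.info('Skipping row with empty u_severity: ' + source.u_sys_id);
    }

})(source, map, log, target);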

 

If you want to map directly from your Scripted REST API to your target table, then you have all of the global (or scoped) methods/functionality available to you that you would have in any other server-side code.
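
Concretely, the "direct insert" primitive is just GlideRecord from inside the resource script. A minimal sketch, assuming the same {"records": [...]} payload shape as above (the field choices are illustrative, not a definitive implementation):

// Scripted REST API resource script (POST) writing straight to the
// target table, with no staging table or transform map involved.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var payload = request.body.data;
    var records = Array.isArray(payload) ? payload : (payload.records || [payload]);
    var sysIds = [];

    for (var i = 0; i < records.length; i++) {
        var rec = records[i];
        var gr = new GlideRecord('incident');   // non-custom target table
        gr.initialize();
        gr.setValue('severity', rec.severity);
        gr.setValue('short_description', rec.short_description);
        // map any further payload fields here
        sysIds.push(gr.insert());               // insert() returns the new sys_id
    }

    response.setStatus(201);
    response.setBody({ inserted: sysIds });
})(request, response);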

There's a clear tradeoff between having an audit trail (left behind in a Staging Table) and having higher performance (without a Staging Table, with import results returned only in the HTTP POST response from the REST API that imported a block of data). In the RDBMS or NoSQL world I come from, the priority is usually on higher performance: there's rarely an audit trail when inserting rows into a database table (without going through some higher-level application). In a PaaS world like ServiceNow, the priorities are completely different (audit trail over performance). For RDBMS or NoSQL, there are well-documented APIs, so customers can write scripts or programs to import (insert or update) data. For ServiceNow, it's a very manual procedure ("Load Data" etc.), with only a subset of the underlying APIs documented well enough for the import process to be automated and run by a script or program (with no human sitting at a web browser).

 

I've just barely started learning about ServiceNow server-side scripting. I see that a basic (to me) API for creating a table is available, but not documented at all:

https://www.servicenow.com/community/developer-forum/create-table-using-a-script-in-a-scoped-applica...

https://www.servicenow.com/community/developer-forum/where-can-i-get-more-information-on-the-glideta...

 

A direct-to-target-table Scripted REST API to import data wouldn't need to create the Staging Table (or the Target Table, which must already exist), so perhaps developing that server-side script will involve less guesswork than using undocumented APIs.

 

All of this would be so, so much simpler if ServiceNow simply had documented REST APIs to set up the prerequisites for the Import Set API (like "create staging table").