Create Import Set from an already existing extended table
08-02-2017 04:37 PM
Greetings,
I have created a scheduled job that populates an extended table (u_temp_server, which extends the "server" table) in my instance with data received as a JSON response from an external source. Initially I tried to populate an import set table (u_dc_server_import) with the script that parses the JSON response, but many of the fields failed with nullPointer errors even though the column definitions (which I created manually) were identical. Inserting this data into u_temp_server works great, however, likely because the data I'm ingesting comes from another ServiceNow instance's server table. So now I'm trying to use a transform to update fields in a CMDB table from u_temp_server, but I cannot use that table in the transform because it is not an import set table.
What is the best way to create/update an import set table from another table (u_temp_server)? Or would it be smarter to skip the import set table/transform and update the CMDB table directly via GlideRecord?
Any advice or pointers are appreciated.
Thanks!
Ryan
08-02-2017 05:48 PM
Hi Ryan,
I hope you are getting the JSON data from the external source through a Table / Scripted API. Please correct me if I am wrong.
An import set table needs to extend the 'Import Set Row' table, which you can't do here because u_temp_server already extends another table. I would suggest leaving the table as it is, creating business rules on u_temp_server to migrate data from u_temp_server to the CMDB table, and keeping the status of the migration on the u_temp_server table itself.
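For example, a before insert/update business rule on u_temp_server could look roughly like this (the coalesce field, the target table, and the u_migration_status field are just placeholders, not something from your setup):

(function executeRule(current, previous /*null when async*/) {
    // Look up an existing CI using whatever field you treat as the coalesce key
    // (serial_number and cmdb_ci_linux_server are placeholders)
    var ci = new GlideRecord('cmdb_ci_linux_server');
    ci.addQuery('serial_number', current.getValue('serial_number'));
    ci.query();
    var exists = ci.next();
    if (!exists) {
        ci.initialize(); // no match: create a new CI, leaving existing sys_ids untouched
    }

    // Copy the fields of interest (illustrative subset)
    ci.asset_tag = current.getValue('asset_tag');
    ci.ip_address = current.getValue('ip_address');
    // ... remaining field mappings ...

    if (exists) {
        ci.update();
    } else {
        ci.insert();
    }

    // Track the migration status on the staging row itself (hypothetical field)
    current.u_migration_status = 'processed';
})(current, previous);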
Hope this helps. Mark the answer as correct/helpful based on impact.
Thanks
Antin
08-02-2017 09:04 PM
Thanks Antin. Yes, the data comes via a scripted API request that returns a JSON response, which is stored in a temporary object before being parsed and written to u_temp_server. Here is a cleansed snippet of the current script (the table has >100 columns, so I won't paste the whole thing) that requests the JSON data and parses it into the u_temp_server table (this works great, but has issues writing to the import set table):
requestServers : function() {
    // Build the REST call to the source instance's Table API
    var request = new sn_ws.RESTMessageV2();
    request.setEndpoint('https://url.service-now.com/api/now/table/cmdb_ci_linux_server?sysparm_limit=250&sysparm_query=u_rac...
    request.setHttpMethod('GET');
    var user = 'api.user';
    var password = 'password';
    request.setBasicAuth(user, password);
    request.setRequestHeader("Accept", "application/json");

    var response = request.execute();
    if (response.getStatusCode() == "200") {
        // Parse the JSON body and write one u_temp_server row per result
        var responseBody = response.getBody();
        var responseObject = new global.JSON().decode(responseBody);
        for (var i = 0; i < responseObject.result.length; i++) {
            var gr = new GlideRecord('u_temp_server');
            gr.initialize();
            gr.asset_tag = responseObject.result[i].asset_tag;
            gr.assigned = responseObject.result[i].assigned;
            // ... remaining field assignments ...
            gr.insert();
        }
    }
},
Appreciate the suggestion. That was one option I was considering, but I need to figure out the logic for a transform based on a coalesced field. I don't want to change sys_ids etc. on the existing data in the CMDB table, only insert new records and update existing ones (for example if an IP address changes or the server is decommissioned). I think a conditional statement that checks for a match on my field of choice should be enough to figure out the right thing to do (and I assume this is roughly what the built-in transform tool does anyway). In that case, I'm kind of thinking I should just add logic to the above script to insert/update the CMDB table directly and not worry about storing things temporarily for a transform (but this makes my spidey sense tingle and feels dangerous to me).
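For illustration, the insert-or-update check I have in mind would slot into the loop above roughly like this (serial_number as the match field and cmdb_ci_linux_server as the target are placeholders until I settle on the real coalesce field):

// Rough sketch of the coalesce-style check inside the loop above
var server = responseObject.result[i];
var ci = new GlideRecord('cmdb_ci_linux_server');
ci.addQuery('serial_number', server.serial_number); // match on the chosen field
ci.query();
var exists = ci.next();
if (!exists) {
    ci.initialize(); // no match found: brand new CI
}
ci.asset_tag = server.asset_tag;
ci.ip_address = server.ip_address;
// ... remaining field assignments ...
if (exists) {
    ci.update(); // existing record: update in place, sys_id unchanged
} else {
    ci.insert();
}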
The other option I was considering was to write a script include or business rule (I'm calling this from a scheduled job, so I figured a script include) to copy the data from u_temp_server to my import set table (u_dc_server_import) and, assuming that works, then perform the transform into the CMDB table. It seems like more work up front, but maybe not if writing a script that does the right thing turns out to be a pain. This would also give me multiple points of entry to easily manage the transform down the road if needed.
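The copy step itself seems simple enough, something along these lines (the column names on u_dc_server_import and the u_migration_status field are just examples; the transform map would still do the real mapping work):

// Rough sketch: copy staged rows into the import set table, then run the transform as usual
var src = new GlideRecord('u_temp_server');
src.addQuery('u_migration_status', 'pending'); // hypothetical status field
src.query();
while (src.next()) {
    var row = new GlideRecord('u_dc_server_import');
    row.initialize();
    row.u_asset_tag = src.getValue('asset_tag'); // example column mappings only
    row.u_assigned = src.getValue('assigned');
    // ... remaining column mappings ...
    row.insert();

    src.u_migration_status = 'staged';
    src.update();
}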
Thanks again for your suggestion. You've definitely got me leaning in that direction. I will post back on this thread if/when I get it working. I never knew something so simple could end up being so complicated!
-Ryan
08-02-2017 10:47 PM
Hi Ryan,
Thanks for all the details. I can clearly see you are on the right track.
As you correctly pointed out, whatever logic a transform map applies to the coalesced field can be achieved with scripting too. But the main advantages of import sets and transform maps are the status tracking and logging, which you would miss if you skip the temporary table and insert directly into CMDB tables. Having the status tracked either on the u_temp_server table or on the import set table will help you a lot with maintenance and retry logic.
So, if you can't move the data to an import set table, I would suggest keeping it in the u_temp_server table with the required status fields and retry logic to move it to the CMDB table.
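As a rough illustration of the retry idea (the status and error-message fields are hypothetical):

// Inside whatever script moves a staging row to the CMDB table
try {
    // ... insert/update the CI from stagingRow ...
    stagingRow.u_migration_status = 'processed';
} catch (ex) {
    // Leave the row flagged so a later scheduled run can retry only the failures
    stagingRow.u_migration_status = 'error';
    stagingRow.u_error_message = '' + ex;
}
stagingRow.update();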
Thanks
Antin