Import sets on the same table

ninonino
Mega Contributor

Dear Community,

the scenario is the following: I have to import data into the same table repeatedly. Generally, I perform an import by uploading an Excel template (raw data) every time, but, as you know, a new staging table is created each time, so I end up with lots of tables among the modules in the navigation bar.

I suppose I could do another import into the same table starting from an existing import set table, and then I would need to set up another transform map. Could you please confirm this or give me some suggestions?

Thanks in advance

1 ACCEPTED SOLUTION

View solution in original post

7 REPLIES 7

ARG645
Tera Guru
ninonino,
I don't think it's possible in the first place. The system will create a new import set table every time you load data from the Data Source. Even if we somehow managed to use the same import set for loading the data every time, what about the data that was previously loaded into the import set table? Every time the transform map runs, it processes all the rows in the import set table, which would reprocess previously loaded rows redundantly. So technically, this is not a feasible approach in my opinion.
 
To tackle this problem, I suggest you create a before-insert business rule on sys_app_module (Module) to abort the database action for a particular Application Menu. This way you don't have to worry about lots of import set table modules cluttering the navigator every time you load the data.
 
The modules in the application navigator are rendered from the data in the sys_app_module (Module) table.
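 
For illustration, a minimal sketch of such a before-insert business rule on sys_app_module is below; the menu title check is a hypothetical condition that you would adapt to your own Application Menu:

// Before-insert business rule on sys_app_module (Module)
// Hypothetical condition: only suppress modules created under a specific Application Menu
(function executeRule(current, previous /*null when async*/) {
    if (current.application.title == 'My Import Menu') { // 'My Import Menu' is a placeholder title
        // Abort the insert so the auto-generated import set table module never shows up
        current.setAbortAction(true);
    }
})(current, previous);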
 
Please mark the answer as Helpful/Correct if applicable, so that it may help others in the future. If this answer did not help you enough, post screenshots and an explanation of what exactly is required.
 
- Aman Gurram

Surjeet Singh
Tera Contributor

Hi, 

Maybe the solution can be achieved using the Data Source module. The system will create the import set table the first time you load data. After that, you can reuse the same data source with the existing staging table to import raw data into the same target table.

The article below might help you achieve the desired solution:

https://docs.servicenow.com/bundle/geneva-servicenow-platform/page/administer/import_sets/concept/c_DataSources.html

And if this activity is performed on a regular basis, perhaps we can create a Record Producer to let users bulk-upload data onto the target tables, provided the data needs no manual intervention from a ServiceNow developer or support.

However, restricting the module for a particular Application Menu in the navigation bar will not stop a new staging table from being inserted every time you upload raw data using the Excel template. With a data source, on the other hand, you can reuse the same staging table with multiple transform maps according to your set of requirements.
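 
As a small illustration of that reuse, the background-script sketch below lists the transform maps already attached to one staging table; the table name u_imp_my_template is only a placeholder for your own import set table:

// List the active transform maps that share a single staging (import set) table
var stagingTable = 'u_imp_my_template'; // placeholder: your existing import set table
var tm = new GlideRecord('sys_transform_map');
tm.addQuery('source_table', stagingTable);
tm.addActiveQuery();
tm.query();
while (tm.next()) {
    gs.info('Transform map "' + tm.getValue('name') + '" -> target table ' + tm.getValue('target_table'));
}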

Hope this may help!

Regards,

Surjeet Singh

Please mark the answer as Helpful/Correct if applicable, so that it may help others in future.

ninonino
Mega Contributor

Hello all,

thanks for your quick replies.

@Surjeet Singh, what you have described was roughly my idea as well, that is, to start from an existing staging table and also reuse a transform map.

Anyway, could you please clarify what you mean by "we can create a Record Producer to let users bulk-upload data onto the target tables, provided the data needs no manual intervention from a ServiceNow developer or support"?

Thanks a lot for your help,

Marco

Hello, 

If you want the client / end user to be able to upload data themselves, without any help from a ServiceNow admin, then creating a Service Catalog item / Record Producer is an alternative to the out-of-the-box Load Data feature.

Data Import Using Service Catalog / Record Producer

1) Service Catalog - Create an after-insert business rule on the RITM table (sc_req_item) that runs only for that particular catalog item.

Record Producer - Create an after-insert business rule on the record producer's target table.

Validate the file type on the record producer / catalog item itself before submission, if needed.

2) Create a data source, transform map, and field maps as usual; the coalesce settings, etc. will be defined there.

3) Create a scheduled import with this data source, set to run as administrator, since it will be triggered from a script. In the left nav, type Scheduled Imports and create a new record.

4) In the business rule condition, check whether the RITM / the record producer's target record has an attachment so the rule only runs in that case, and use the following script.

Business rule condition:

current.hasAttachments();


//An extra check you can add: whether the file is CSV or XLS, based on the data source's attachment file type
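 
A hedged sketch of that extra check is below; it assumes the uploaded file is attached to the current record and that your data source expects a .csv file (adjust the extension to match). It could live in a script include or at the top of the business rule script:

// Returns true when the given record carries at least one CSV attachment
function hasCsvAttachment(rec) {
    var att = new GlideRecord('sys_attachment');
    att.addQuery('table_name', rec.getTableName());
    att.addQuery('table_sys_id', rec.getUniqueValue());
    att.addQuery('file_name', 'ENDSWITH', '.csv');
    att.setLimit(1);
    att.query();
    return att.hasNext();
}

// Example condition: current.hasAttachments() && hasCsvAttachment(current)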

 Business rule script:

//Delete all existing attachments before attaching the new CSV file to the data source
var dataSourceSysId = ''; // sys_id of the data source you created in step 2

var gr = new GlideRecord('sys_data_source');
if (gr.get(dataSourceSysId)) {
   var attach = new GlideSysAttachment();
   attach.deleteAll(gr);
}

//Attach the new CSV after deleting the old ones: copy from the current record to the data source
GlideSysAttachment.copy(current.getTableName(), current.getUniqueValue(), 'sys_data_source', dataSourceSysId);

var schImp_GR = new GlideRecord('scheduled_import_set');
schImp_GR.addQuery('data_source', dataSourceSysId);
schImp_GR.query();
if (schImp_GR.next()) {
   gs.executeNow(schImp_GR); // use this if it is in a scoped app
   //SncTriggerSynchronizer.executeNow(schImp_GR); // use this if in a non-scoped app, i.e. global
   //Either of the above lines will start the import process automatically.
   //Make sure your scoped application has access to delete records from the attachment table.
}

 

Copying / Moving an Attachment
Copying all of the attachments associated with a given record is fairly straightforward. You simply call the copy() method of the GlideSysAttachment class and pass in four strings:

1) The table you want to copy the attachment from (incident, change_request, etc.).
2) The sys_id of the record you want to copy the attachment from.
3) The table that you want to copy the attachment to.
4) The sys_id of the record you want to copy the attachment to.


This might look something like this: 

GlideSysAttachment.copy(table_name, table_record_sys_id, 'sys_data_source', dataSource_sys_id);

Regards,

Surjeet Singh

Please mark the answer as Helpful/Correct if applicable, so that it may help others in future.