3 weeks ago
I tried this in Global scope, and the following script from https://github.com/goranlundqvist/YouTube/blob/master/Episode%2016%20-%20Import%20data%20through%20R... works fine:
var transformMapSysIDs = '3a4a3c714fb7a300d99a121f9310c70f'; // If you have more than one, separate them with a comma, e.g. 3a4a3c714fb7a300d99a121f9310c70f,3a4a3c714fb7a300d99a121f9310c703
current.name = gs.getUserName() + " UserImport at: " + new GlideDateTime();
current.import_set_table_name = 'u_import_record_producer';//Name of your import table
current.file_retrieval_method = "Attachment";
current.type = "File";
current.format = "Excel";
current.header_row = 1;
current.sheet_number = 1;
current.insert();//Need this since we want to load and transform directly
// Now it's time to load the Excel file into the import set table
var loader = new GlideImportSetLoader();
var importSetRec = loader.getImportSetGr(current);
var ranload = loader.loadImportSetTable(importSetRec, current);
importSetRec.state = "loaded";
importSetRec.update();
// Time to run the transform with the transform map
var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapSysIDs);
transformWorker.setBackground(true);
transformWorker.start();
// To avoid creating another data source, we abort the record producer insert.
current.setAbortAction(true);
My question: in a scoped application, the GlideImportSetLoader and GlideImportSetTransformerWorker classes are not available. What is the alternative?
Solved! Go to Solution.
3 weeks ago
Check my blog and enhance the logic as needed:
Data load and transform via Catalog Item
💡 If my response helped, please mark it as correct ✅ and close the thread 🔒— this helps future readers find the solution faster! 🙏
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
2 weeks ago
Hi @Ankur Bawiskar, not exactly the solution I needed, but it did give me the idea to use a scheduled import for my use case. Thank you for your reply.
For everyone else who may need this functionality, here is the solution for the scenario. It runs in both Global and scoped applications, and it also maintains an audit trail of what ran, when, and where:
1. Create a Record Producer with the table set to Data Source [sys_data_source].
2. In the Portal Settings, make the attachment mandatory.
3. In the script section, add this script:
current.name = "Whatever you like, could be anything";
current.import_set_table_name = 'u_your_table_name_could_be_anything'; // name of your import set table
current.file_retrieval_method = "Attachment";
current.type = "File";
current.format = "CSV";
current.csv_delimiter=',';
current.header_row = 1;
current.sheet_number = 1;
var dataSourceSysId = current.insert();
var gr = new GlideRecord('scheduled_import_set');
gr.addQuery('sys_name', 'Name of your scheduled import');
gr.query();
if(gr.next()){
gr.data_source = dataSourceSysId;
gr.update();
gs.executeNow(gr);
}
gs.addInfoMessage("CSV processed, creating incidents.");
current.setAbortAction(true);
4. After this, run the record producer once via the Try It button so that it creates the import set table and the data source entry needed for the next steps.
5. On the import set table, create the Transform Map that maps your fields to the respective fields in your target table. For me, the target was the Incident table and the source was my import set table.
6. Create your Scheduled Import, select the data source table entry as the Data Source, and give it the same Name used in the record producer script. The rest of the fields can be whatever you like. Make sure Active is unchecked (false) so it does not run on its own schedule.
Final checklist:
1. Record producer in place.
2. Transform Map in place.
3. Scheduled Import in place.
4. Import set table exists.
Now you can run the record producer. It creates a data source record and updates your scheduled import to link to the new data source entry. The data source loads the new data into the import set table temporarily, and the transform map runs automatically on the import set table as soon as new data arrives, transforming your data and pushing it to your destination table.
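For completeness, on recent releases there is also a scoped alternative to GlideImportSetTransformerWorker: the sn_impex scoped namespace includes a GlideImportSetTransformer class. A minimal sketch, assuming the import set has already been loaded, that importSetSysId is a variable you supply, and that sn_impex.GlideImportSetTransformer with transformAllMaps()/isError() is available on your release (please verify against the API reference for your instance version; this only runs server-side inside ServiceNow):

```javascript
// Sketch only: transform an already-loaded import set from a scoped app.
// importSetSysId is a hypothetical variable holding the sys_import_set sys_id.
var importSetGr = new GlideRecord('sys_import_set');
if (importSetGr.get(importSetSysId)) {
    var transformer = new sn_impex.GlideImportSetTransformer();
    // Runs every transform map associated with the import set's table
    transformer.transformAllMaps(importSetGr);
    if (transformer.isError()) {
        gs.error('Import set transform failed for ' + importSetSysId);
    }
}
```

This avoids the scheduled-import indirection entirely, at the cost of depending on the scoped API being present on your release.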