09-29-2023 06:19 PM
I need to populate an import set through a script, mark up the rows with additional values looked up elsewhere, and then run a transform to push the result into the target table. (Of course I could script all of this, but that's what the map is for.)
The data source, the import set table and the transform map already exist, as these were created when I was planning to have the transform map do everything. I'm looking over the scripting options in Utah, and I see classes like GlideImportSetTable, and instructions like this:
- To execute an Import Set transform:
- Create an Import Set table using the GlideImportSetTable API.
- Create a Transform Map using the GlideImportSetTransformMap API.
- Create or use an existing Import Set using the GlideRecord API.
- Execute an Import Set Transform using the GlideImportSetTransformer API.
But I already have all that. I want to create an import set, populate my import set rows, associate them with the import set, and then hand the import set off to the transformer. But I can't find any code samples of anyone doing that, except through the methods shown above. Is there a reason I should not do all of that through GlideRecord, by creating a new row in sys_import_set, and so on? It seems strange to create a new import set table and a new map every time I run this import.
Solved! Go to Solution.
09-30-2023 09:34 AM
Hi,
Not sure I follow the first part about your Excel file, but here is a script example anyway. I use scripts like this when I fetch data using REST API calls. It loads data into an existing Import Set table and then transforms it using the connected Transform Map.
runImport();

function runImport() { // "import" is a reserved word, so avoid it as a function name
    try {
        // Create the Import Set
        var impSet = new GlideRecord('sys_import_set');
        impSet.initialize();
        impSet.mode = "asynchronous";
        impSet.state = "loading";
        impSet.table_name = "u_import_xxxx";
        impSet.short_description = "Importing Data from XXXX";
        var impSetId = impSet.insert();

        /*
        Populate the import set table with data rows - insert code here
        var importData = new GlideRecord(impSet.table_name);
        importData.initialize();
        importData.u_a = '1';
        importData.u_b = '2';
        importData.u_c = '3';
        importData.sys_import_set = impSetId; // sys_id of the newly created import set
        importData.insert();
        */

        // Finished loading the import set
        impSet.state = "loaded";
        impSet.load_completed = gs.nowDateTime();
        impSet.update();

        // Transform the data
        var transformer = new GlideImportSetTransformer();
        transformer.transformAllMaps(impSet); // transform the import set rows
        if (transformer.isError()) {
            gs.error('Error executing the transform');
        }
    } catch (ex) {
        gs.log('Error: ' + ex.message, 'Import XXXX');
    }
}
Is this something that will help you?
Regards,
Niklas
09-30-2023 01:52 AM
Hi,
Creating an import set table and connecting it to transform maps enables you to reuse it.
Adding a record to the import set table will create a new import set record with a single import set row and that will trigger the transform automatically.
By using scripts you can take control of this process: create a new Import Set record, load multiple Import Set Rows into that import set, and then, once it is fully loaded, start the transform. There is no need to create new Import Set tables every time; just create new Import Set records that use the same Import Set table.
This gives you a much better view of the execution of the Import Set run compared to just pushing a single record.
Regards,
Niklas
09-30-2023 08:34 AM
Let me make my question more explicit.
I have a situation where I created a Data Source with an Excel attachment. The plan was to mail in an Excel file, attach it to the data source and run a transform. However, a Reference needs to be populated in the target based on a String value in the source row. That lookup has to happen somewhere.
I tried having the Transform Map look up the reference in the transformRow event, but the transform map is stateless, so there's no place to persist the String/Reference mapping between rows, and repeating the lookup for every row makes this method too slow.
I tried using Data Matchers associated with the String column in the Import Set table, the idea being that the matcher would push the Reference into the import set row when it is laid down, and the transform would just copy it into the target. But Data Matchers seem to run asynchronously, so my transform ran before the matcher did anything. The result was that the target did not get the Reference. (The Reference got pushed into the import set row later.)
(I could have tried telling my script to sleep for a while, to let the matchers catch up, but that seemed unreliable, so I did not try that.)
I conclude from this that I have to detach the creation of the import set from the running of the transform map, so I can push these Reference values into the import set rows myself using some kind of lookup (Decision tables, lookup to a Matcher table, whatever) before the transform runs.
I am not trying to get rid of the transform map - I want it to do as much of the work as possible. It cannot do this String/Reference mapping, but I do need it to map all the values onto the target, and possibly do some other filtering. My plan is to set up an import set somehow, and then call GlideImportSetTransformer.transformAllMaps().
So I was looking around for examples of creating an import set by hand, so to speak, and to my surprise I found no recent examples at all. Is there a reason I should not create a row in sys_import_set using GlideRecord, then create rows in my Import Set Rows table, associating them with the new import set, and then hand it off to the transformer? The documentation on the transformer class suggests this is the way to go:
var importSet = new GlideRecord('sys_import_set');
importSet.short_description = 'Import set from scripted rest api';
importSet.table_name = importSetTableName;
var importSetID = importSet.insert();
var transformer = new GlideImportSetTransformer();
transformer.transformAllMaps(importSet);
That looks right to me. Is there a reason not to create my import set rows the same way, using the already-existing table, and, in effect, script the whole import up to the point of the transform?
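Concretely, the enrichment step I have in mind would look something like the sketch below. The table and field names (u_import_xxxx, u_name, u_target_ref) are placeholders, and the little GlideRecord stand-in at the top is only there so the sketch reads end-to-end outside an instance; on the platform the real class does all of this:

```javascript
// Stand-in for the platform GlideRecord class, included only so this
// sketch is self-contained. On an instance the real class is used.
var rows = [
    { u_name: 'Alpha Corp', u_target_ref: '', sys_import_set: 'set1' },
    { u_name: 'Beta Ltd',   u_target_ref: '', sys_import_set: 'set1' }
];
function GlideRecord(table) { this._rows = rows; this._i = -1; }
GlideRecord.prototype.addQuery = function (field, value) {
    this._rows = this._rows.filter(function (r) { return r[field] === value; });
};
GlideRecord.prototype.query = function () {};
GlideRecord.prototype.next = function () {
    if (++this._i >= this._rows.length) return false;
    var r = this._rows[this._i];
    for (var k in r) this[k] = r[k]; // expose row fields as properties
    this._cur = r;
    return true;
};
GlideRecord.prototype.setValue = function (field, value) { this._cur[field] = value; };
GlideRecord.prototype.update = function () {};

// Hypothetical String -> Reference lookup; in practice this could be a
// decision table, a matcher table, or any other query.
var lookup = { 'Alpha Corp': 'a1b2c3', 'Beta Ltd': 'd4e5f6' };

// The part that matters: after the rows are loaded but before the
// transform runs, resolve each String to its Reference sys_id and
// write it back onto the import set row.
function enrichRows(importSetSysId) {
    var row = new GlideRecord('u_import_xxxx');
    row.addQuery('sys_import_set', importSetSysId);
    row.query();
    while (row.next()) {
        var refSysId = lookup[row.u_name];
        if (refSysId) {
            row.setValue('u_target_ref', refSysId);
            row.update();
        }
    }
}

enrichRows('set1');
// Only after this would I hand the import set record over to
// new GlideImportSetTransformer().transformAllMaps(importSet).
```

The real lookup could be anything; the point is only that it runs synchronously, between loading the rows and starting the transform.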
10-01-2023 12:46 PM
Very good, thank you. I wasn't able to find much discussion of whether this approach is deprecated or not.