08-14-2019 05:39 PM
As per the docs for the onStart transform event: "When: The onStart event script is processed at the start of an import run, before any data rows are read."
My understanding was that it would execute only once, regardless of how many records I insert into the import set. But when I tried it, I saw it execute once for each record inserted into the import set.
Is my understanding correct that these events execute as many times as there are input records?
Thanks
Solved!
- Labels: Scripting and Coding
08-21-2019 07:54 PM
Great discussion so far.
As you might already know, web service import sets have two modes: 1. Synchronous, 2. Asynchronous.
An import set with a Mode of Synchronous will transform the data as soon as it is inserted (provided that the transform map already exists). This import set will also have a default State of Loading. By default, all Synchronous import sets are automatically marked Processed at midnight. As a result, when a new insert happens to the same table after the previous import set is marked Processed, a new Synchronous import set is created. See the docs: Web service import set mode
An import set with a Mode of Asynchronous will stay in a State of Loading until all the records are inserted. Once all the records are inserted [we don't know how to detect that in your case], the transform has to be run manually or via a script [a script is what you need]. To run it via a script, you can create an hourly (or more frequent) scheduled data import. You can find how to create a scheduled data import in the link: Schedule a data import
Since you have already developed logic to make your import set's mode Asynchronous: there is a scheduled job named "Asynchronous Import Set Transformer", which may be inactive by default. You can activate it or create a copy of it, change the repeat interval to every 10 minutes [or your desired interval], and change the script so that it only checks for your one particular data source.
Try the script below in the scheduled job:
transformAsyncIset();

function transformAsyncIset() {
    var igr = new GlideRecord("sys_import_set");
    igr.addQuery("mode", "asynchronous");
    igr.addQuery("state", "loading"); // Because in your case the state will never change to "loaded"
    igr.addQuery("table_name", "YOUR IMPORT SET TABLE NAME");
    igr.query();
    while (igr.next()) {
        sTransform(igr);
        igr.setValue('state', 'loaded');
        igr.update(); // Set the state to Loaded so that it won't be picked up by this job again
    }
}

function sTransform(igr) {
    var mapGR = getMap(igr.table_name);
    if (mapGR.next()) {
        var t = new GlideImportSetTransformerWorker(igr.sys_id, mapGR.sys_id);
        t.setProgressName("Transforming: " + igr.number + " using map " + mapGR.name);
        t.setBackground(true);
        t.start();
    }
}

function getMap(sTable) {
    var mapGR = new GlideRecord("sys_transform_map");
    mapGR.addQuery("source_table", sTable);
    mapGR.addActiveQuery();
    mapGR.query();
    return mapGR;
}
Please mark my eligible responses as helpful/correct if applicable.
Thank you,
Aman Reddy Gurram

08-14-2019 11:38 PM
Hi Aman,
If multiple import sets are being created, then you might not be able to access any common variable between those transformations.
Would you be able to share what you are trying to achieve? It will help people suggest alternatives.
From what I can tell, if the Ansible system is inserting multiple records one at a time, you could send a common identifier with those records from the source system, such as a set number or an ID for that set of rows. You can then use that identifier in the transform script to build your logic around the imported records that share it.
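As a plain-JavaScript sketch of that idea (outside the Glide API, so it can run anywhere): the field name u_batch_id below is a hypothetical stand-in for whatever common identifier the source system sends with each record.

```javascript
// Sketch: group imported rows by a shared batch identifier.
// "u_batch_id" is a hypothetical field name; substitute whatever
// identifier the source (e.g. Ansible) sends with each record.
function groupByBatch(rows) {
    var groups = {};
    rows.forEach(function (row) {
        var id = row.u_batch_id;
        if (!groups[id]) {
            groups[id] = [];
        }
        groups[id].push(row);
    });
    return groups;
}

// Example: three rows belonging to two import runs.
var rows = [
    { u_batch_id: 'RUN-001', name: 'host-a' },
    { u_batch_id: 'RUN-001', name: 'host-b' },
    { u_batch_id: 'RUN-002', name: 'host-c' }
];
var grouped = groupByBatch(rows);
// grouped['RUN-001'] now holds both rows from the first run.
```

In a transform script the same grouping would be done with a GlideRecord query on the identifier field rather than an in-memory array, but the logic is the same: everything with one identifier belongs to one logical import.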
Hope this helps!
Cheers,
Manish
08-21-2019 06:21 PM
Hi ARG,
When I insert records into the import set using the API, it inserts the records synchronously and triggers the transform map as soon as one record is inserted.
I have written an "onBefore" business rule to change the mode to asynchronous, and all my records now land in a single import set. After I run my script, the import set still shows as Loading, and it has no way of knowing whether all the records have been inserted.
How do I tell it that all records are inserted?
08-21-2019 08:45 PM
That works perfectly fine for me with some minor tweaks. The only thing that still bothers me is that I can't mark them as Loaded, so I have to set things up so that it doesn't cause problems. I have added some more logic to the scheduled job to make sure it doesn't misbehave, but being able to set the state to Loaded would have been the cleanest, worry-free solution for me.
Anyway, thanks a lot for your help! Really appreciated.
Thanks
Aman Soni

08-21-2019 06:42 PM
Can you try the script below and see if it runs for every record? (Note: the GlideRecord variable must not be named gs, or it shadows the global GlideSystem object and gs.log() breaks; the query also needs a next() loop before reading values.)
(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    var gr = new GlideRecord('u_item_option_new');
    gr.addQuery('u_cat_item', source.u_cat_item);
    gr.query();
    while (gr.next()) {
        gs.log('Record value is: ' + gr.getValue('u_record_name'));
    }
})(source, map, log, target);