02-16-2025 03:28 PM
I have an Excel import that needs a lot of value mapping before it goes to the target table. Certain String columns in the source file are mapped to references, and certain additional String values are associated with the selected reference. All this is done with Decision Tables, via script.
The entire import is done with script:
-- Reading the source Excel file
-- Parsing out an array of objects, one per row
-- Marking up the object with properties (sys_ids, Strings, etc) to be pushed into the target table, or marking it 'skipped'
-- Creating an import set
-- Inserting import set rows for objects not marked skip
-- Calling the transform
The import itself doesn't do much beyond deciding whether each row will be an insert or an update, based on the key column.
The reason for all this scripting is the Decision Tables. I don't want to instantiate them all over again for each row, so my script instantiates them once, reads out all the mappings into one object, and then marks up each data object, getting it ready for insertion into the import set. Doing that work in the transform would be inefficient.
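The "instantiate once, apply per row" pattern described above can be sketched generically like this. All of the column names, mapping values, and the two helper functions are made up for illustration; in the real script the lookup object would be built from the Decision Tables rather than hard-coded:

```javascript
// Build all mappings a single time, before the row loop.
// In the real script this would read each Decision Table once.
function buildMappings() {
    return {
        department: { 'HR': 'sys_id_hr', 'IT': 'sys_id_it' },
        priority:   { 'High': '1', 'Low': '3' }
    };
}

// Mark up one row object using the pre-built lookup, or flag it as skipped.
function markUpRow(row, mappings) {
    var dept = mappings.department[row.department];
    if (!dept) {
        row.skipped = true;           // no mapping found -> skip this row
        return row;
    }
    row.department_sys_id = dept;
    row.priority_value = mappings.priority[row.priority] || '3';
    return row;
}

var mappings = buildMappings();       // once, not per row
var rows = [{ department: 'IT', priority: 'High' }, { department: 'Sales' }];
var marked = rows.map(function (r) { return markUpRow(r, mappings); });
```

The point of the pattern is simply that the expensive lookup construction happens once, and the per-row work is a cheap object access.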
All that is very simple, but not especially flexible. If the business needed to add another Decision Table for additional mapping, I would have to update my import script.
To reduce technical debt I want to move the value mappings into a Flow and use the Make A Decision action to fetch the values for the target table. (I recognize that this poses a problem for efficiency, as I mentioned. I will have to look closely at that when I get there.)
So I see the future state as roughly this:
--New file arrives
--I fire an event that tells a Scheduled Data Import to run now. <-- this is the part I don't understand
--The import creates an import set, one row per Excel row - and nothing else (no transform, not yet)
--I get a handle to the new import set - somehow
--The import set is passed to a Flow that marks up the various import set rows, or marks them as skipped, according to the DTs
-- The flow runs a transform to update the target table
I don't want to run the Scheduled Data Import on a schedule; I want it to run as soon as a file arrives. So I tried setting it to Run Once and updating the run_start programmatically:
var gr = new GlideRecord('scheduled_import_set');
gr.addQuery('name', 'MyImport');
gr.addQuery('sys_scope', '3263dedc47ff5610ded2d88b416d43fa');
gr.setLimit(1);
gr.query();
if (gr.next()) {
    // Push run_start just past "now", hoping the scheduler picks it up
    var gdt = new GlideDateTime();
    gdt.addSeconds(1);
    gr.run_start = gdt;
    gr.update();
}
But nothing happens and no import set is created.
Can anyone advise me how to kick off this Scheduled Data Import from a script, and get the resulting import set?
02-16-2025 07:19 PM
You can run it like this - add this line at the end: gs.executeNow(gr);
var gr = new GlideRecord('scheduled_import_set');
gr.addQuery('name', 'MyImport');
gr.addQuery('sys_scope', '3263dedc47ff5610ded2d88b416d43fa');
gr.setLimit(1);
gr.query();
gr.next();
var gdt = new GlideDateTime();
gdt.addSeconds(1);
gr.run_start = gdt;
gr.update();
// Trigger the scheduled job immediately
gs.executeNow(gr);
If my response helped please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
02-17-2025 07:26 AM
Thank you, I will try that.
The other half of my question was how to get a handle to the resulting import set. I will try adding a business rule that fires off an event when a sys_import_set whose data source is <my data source> changes state to 'loaded'. That will let me go in and mark up its import set rows according to my various rules, and finally run a (very simple) transform to push the data into the target. The map will do little/no fiddling with the data, as I want to keep all mapping rules out in the open.
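A minimal sketch of that business-rule idea, with just enough of the Glide API stubbed out so the logic runs standalone. The event name x_myapp.import_set.loaded is made up, and in a real instance current/previous/gs are supplied by the platform rather than defined like this:

```javascript
// Stub just enough of gs so this sketch runs outside ServiceNow.
var events = [];
var gs = {
    eventQueue: function (name, record, parm1) {
        events.push([name, parm1]);   // record the fired event for inspection
    }
};

// Logic of an "after update" business rule on sys_import_set:
// fire only on the transition into 'loaded', and pass the import set
// sys_id as the event parameter so the handler (or a flow subscribed
// to the event) gets a handle to the new import set.
function onImportSetUpdate(current, previous) {
    if (current.state !== 'loaded' || previous.state === 'loaded')
        return;
    gs.eventQueue('x_myapp.import_set.loaded', current, current.sys_id);
}

// Simulated update: state moves from 'loading' to 'loaded'
onImportSetUpdate({ state: 'loaded', sys_id: 'abc123' }, { state: 'loading' });
```

The business rule condition would additionally filter on the data source field; that check is omitted here for brevity.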