Identify Duplicate Records in a File Processed by a Transform Map

MPP22
Mega Guru

Hello All,

 

I would like some guidance from the experts on the requirement below. A transform map processes records row by row, but the customer's expectation is different, as described below. How can we achieve this?

We are using a single transform map to either create or update records from an Excel file (the coalesce/primary key is the "description" column). The customer now expects that if the source Excel file contains duplicate records (i.e., the same description appears in multiple rows of the file), the system should identify those duplicates, skip any further action on them, and give the user a message/error stating that the description is duplicated in the file.
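For context, one quick way to see which descriptions repeat inside a single import set is a GlideAggregate count on the staging table. This is only a sketch with hypothetical table and column names (u_my_staging_table, u_description); the accepted answer below takes a per-row approach during the transform instead.

// Sketch: list descriptions that appear more than once in one import set.
// The table name, column name, and import set sys_id are assumptions; adjust to your instance.
var importSetSysId = 'REPLACE_WITH_IMPORT_SET_SYS_ID';
var agg = new GlideAggregate('u_my_staging_table');
agg.addQuery('sys_import_set', importSetSysId);
agg.addAggregate('COUNT');
agg.groupBy('u_description');
agg.query();
while (agg.next()) {
    var occurrences = parseInt(agg.getAggregate('COUNT'), 10);
    if (occurrences > 1) {
        gs.info('Description "' + agg.u_description + '" appears ' + occurrences + ' times in this file.');
    }
}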

 

1 ACCEPTED SOLUTION

jaheerhattiwale
Mega Sage

@MPP22 Tried and tested solution.

 

Create an onBefore transform script and add the code below to it.

 

(function runTransformScript(source, map, log, target /*undefined onStart*/) {

    // Check whether the same description has already been processed in the current import set (sheet)
    var sourceTable = new GlideRecord(source.getTableName());
    sourceTable.addQuery("u_description", source.u_description.toString());
    sourceTable.addQuery("sys_import_set", source.sys_import_set.toString());
    sourceTable.addQuery("sys_import_state", "inserted");
    sourceTable.query();

    // If a duplicate is present, log it with the import set number and skip the row
    if (sourceTable.next()) {
        gs.info("Duplicate record (" + source.u_description.toString() + ") in the " + source.sys_import_set.getDisplayValue() + " import set run.");
        ignore = true;
    }
    // If no duplicate is present, let the row insert and map the description
    else {
        target.description = source.u_description.toString();
    }

})(source, map, log, target);
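
The original requirement asks for a message the user can see, and gs.info() only writes to the system log. A small variation of the duplicate branch (a sketch only, not part of the accepted answer) is to log through the transform script's log object instead, so the warning appears in the import log attached to that transform run:

    // Hypothetical tweak to the duplicate branch above: surface the message in the
    // import log (visible from the import set / transform run) rather than only the system log.
    if (sourceTable.next()) {
        log.warn("Duplicate record (" + source.u_description.toString() + ") in the " + source.sys_import_set.getDisplayValue() + " import set run.");
        ignore = true; // the skipped row's import state is marked as ignored
    }

Since ignore = true leaves the staging row marked as ignored, the duplicates can also be found afterwards by filtering the import set rows on that state.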
 
Please mark as correct answer if this solves your issue.
Please mark the answer as correct or helpful based on impact
ServiceNow Community Rising Star, Class of 2023


3 REPLIES


MPP22
Mega Guru

Thanks, @jaheerhattiwale, for the quick solution. It's working, but I noticed the behavior below (especially the first point), since the transform map I built is shared between inserting and updating records.

1. If the record does not exist: the first occurrence inserts it, the second occurrence updates it, and from the third occurrence onwards it starts throwing duplicate record errors. Can we do something here to stop the update of that record, since it was already inserted by the first occurrence?

2. If the record already exists: the first occurrence updates it, and from the second occurrence onwards it throws duplicate record errors. This works as expected.

jaheerhattiwale
Mega Sage

@MPP22 Actually, the code should insert only once and show the log message from the second occurrence onwards, because there is no code written to update the record.

I have added ignore = true to skip the update as well.

 

Please mark as correct answer if this solves your issue.

Please mark the answer as correct or helpful based on impact
ServiceNow Community Rising Star, Class of 2023
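
Regarding the first point above (the second occurrence updating the target before duplicates start being flagged), one possible adjustment, offered only as a sketch and assuming the engine stamps each staging row's sys_import_state before the next row is transformed, is to treat rows in either the inserted or updated state as already handled:

(function runTransformScript(source, map, log, target /*undefined onStart*/) {

    // Hypothetical variant: treat any already-processed row in this import set with the
    // same description as a duplicate, whether it inserted or updated the target record.
    var dup = new GlideRecord(source.getTableName());
    dup.addQuery("u_description", source.u_description.toString());
    dup.addQuery("sys_import_set", source.sys_import_set.toString());
    dup.addQuery("sys_import_state", "IN", "inserted,updated");
    dup.query();

    if (dup.next()) {
        log.warn("Duplicate description '" + source.u_description.toString() + "' in this file; row ignored.");
        ignore = true;
    }

})(source, map, log, target);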