‎06-14-2017 07:54 AM
In Prod I imported over 25,000 records from Excel. The import duplicated about 10,000 of them, so the table now holds roughly 10,000 records that duplicate many of the 25,000 already there. I need to delete these duplicate entries.
Solved!
- Labels: Enterprise Asset Management

‎06-14-2017 07:59 AM
Hi,
You can run a one-time fix script to delete multiple records from a table.
http://wiki.servicenow.com/index.php?title=GlideRecord#Delete_Methods
Regards,
Sachin

‎06-14-2017 08:00 AM
Hi Sylvia,
Try running a background script to delete the records, adding conditions to match your requirement:

var m = new GlideRecord('table');
m.addEncodedQuery('sys_created_onONToday@javascript:gs.daysAgoStart(0)@javascript:gs.daysAgoEnd(0)'); // created today
m.query();
while (m.next()) {
    m.deleteRecord();
}
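Since the goal in this thread is to remove duplicates rather than everything created today, it may be safer to first isolate the duplicated rows with a GlideAggregate query and delete all but one record per duplicated value. This is an untested sketch using placeholder names - 'u_my_table' and 'u_key_field' are hypothetical; substitute your own table and the field that identifies a duplicate, and try it in a sub-production instance first:

```javascript
// Find values of the key field that occur more than once.
var dupValues = [];
var ga = new GlideAggregate('u_my_table');   // hypothetical table name
ga.addAggregate('COUNT');
ga.groupBy('u_key_field');                   // hypothetical field that identifies a duplicate
ga.addHaving('COUNT', '>', '1');
ga.query();
while (ga.next()) {
    dupValues.push(ga.getValue('u_key_field'));
}

// For each duplicated value, keep the oldest record and delete the rest.
for (var i = 0; i < dupValues.length; i++) {
    var gr = new GlideRecord('u_my_table');
    gr.addQuery('u_key_field', dupValues[i]);
    gr.orderBy('sys_created_on');
    gr.query();
    if (gr.next()) {          // keep the first (oldest) record
        while (gr.next()) {
            gr.deleteRecord(); // delete the newer duplicates
        }
    }
}
```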
‎06-14-2017 09:41 AM
Hi Sylvia,
Here's a script I've used in the past as a generic multiple-delete script. You specify the table you want to delete against and the field and value used to restrict the deletion. Before you run it, create a property as follows:
- Type sys_properties.list in the left navigator and press enter, if necessary
- Click the "New" button
- Give the property the name "delete.my.table.key"
- Leave the value blank
- Set type to String
- Set ignore cache to "true" (i.e. check the box)
- Click "Submit"
If you ever want to rerun the deletion, blank out the "Value" field of the property.
Some of the advantages of doing it this way:
- If you run the script inside a recurring scheduled job, you don't have to worry about your transaction timing out
- You can configure it to delete without tracking audit info, business rules, or "sys_" field updates, which speeds up the deletion
- It handles cascade deletions for records that are referenced in other tables
- It uses the sys_id field in the query that does the actual deletions, which avoids potential performance degradation due to gap locks or row locks in MySQL.
- If the table is not a candidate for cascade deletes, it bypasses the iterative delete behavior and does one big deletion (e.g. DELETE FROM x WHERE sys_id IN (...))
//Delete multiple records from large tables in small chunks and batches.
//The idea is to create a scheduled job that runs on an interval; each run deletes up to runSize records, one batch at a time.
//runSize and batchSize determine how much work each run does: runSize 10,000 with batchSize 1,000 means 10 batch deletions per run.
//The deletions are done against the sys_id field because in MySQL deleting on non-primary-key fields risks blocking operations.

//Get or set the key value. Tracks progress so a later run can resume where the previous one stopped.
function getSetKey(newKey) {
    if (newKey) {
        gs.setProperty("delete.my.table.key", String(newKey));
    } else {
        newKey = gs.getProperty("delete.my.table.key");
        if (!newKey) {
            newKey = 0;
            gs.setProperty("delete.my.table.key", newKey);
        }
    }
    return newKey;
}
var keyField = "sys_created_on"; //must be a purely chronological field, populated in that order - otherwise an entry could sneak in behind the cleaner. Its lowest possible value must be greater than 0, since 0 is the default key value.
var tableName = "syslog_transaction";
var key = getSetKey();
var runSize = 10000; //max records to delete per run of the job
var batchSize = 1000; //number of records to be deleted at a time
var batchPause = 200; //number of milliseconds to wait between batches
var ids = []; //array that holds the sys_id's of records to be deleted in a single batch
var deleteCount = 0;
var field = "sys_created_by"; //name of the field to match against for the mass delete
var value = "matthew.watkins@snc"; //value the field must match for a record to be deleted
//The main loop that gleans the sys_id's of the records we will be deleting.
var toDelete = new GlideRecord(tableName);
//Add any additional restrictions on the set of records to be deleted
toDelete.addQuery(keyField, ">", key + "");
toDelete.addQuery(field, value);
toDelete.setLimit(runSize);
toDelete.orderBy(keyField); //defaults to ascending, so we start with the oldest records
toDelete.query();
gs.log("Multiple Delete Job: starting at " + keyField + ": " + key);
while (toDelete.next()) {
    key = toDelete.getValue(keyField);
    //put some logic here if you want to further vet the records being deleted against other related data
    if (true) {
        ids.push(toDelete.getValue("sys_id") + ""); //collect an array (i.e. a batch) of sys_id's
    }
    //Now do the multiple delete against a single batch of sys_id's,
    //but only once we have batchSize ids or we've reached the end of the run.
    if (ids.length > 0 && ((ids.length >= batchSize) || (!toDelete.hasNext()))) {
        //gs.print("Number of sys_id's fed into the delete operation: " + ids.length);
        //*** critical section ***
        var target = new GlideRecord(tableName);
        target.addQuery("sys_id", ids); //deletes should be done by sys_id to avoid index gap locks
        target.setWorkflow(false); //bypass business rules, workflows and auditing
        target.setSysFields(false); //bypass updating the "sys" fields (sys_updated_on, sys_mod_count, etc.)
        target.deleteMultiple();
        //target.query();
        //gs.print("records that would have been deleted: " + target.getRowCount());
        //*** critical section ***
        getSetKey(key);
        gs.sleep(batchPause); //pause for batchPause milliseconds between batches
        deleteCount += ids.length;
        ids = []; //blank out the array
    }
}
getSetKey(key);
gs.log("Multiple Delete Job: finished at " + keyField + " = " + key + ". Deleted " + deleteCount + " records.");
/*
The GlideRecord.deleteMultiple() method is [almost] always an iterative delete: it issues an individual database statement to delete each record. There is another way that can be faster but will bypass all business rules, engines, and cascade delete rules (the rules about what to do with records that reference the deleted records). The following method of deleting multiple records is not supported - you won't find it in any ServiceNow documentation - so use it at your own risk. You could put this code into the "*** critical section ***" area above.
var target = new GlideMultipleDelete(tableName);
target.addQuery("sys_id", "IN", ids);
target.setValue(field, value);
target.execute();
*/
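The chunk-and-batch arithmetic in the script above is easy to check outside the platform. The following is a minimal plain-JavaScript sketch (no Glide APIs) of the same pattern: a list of ids is split into batches of at most batchSize, just as the loop above accumulates sys_id's and flushes a deleteMultiple() call whenever the array reaches batchSize or the run ends:

```javascript
// Split an array of ids into batches of at most batchSize,
// mirroring how the scheduled job accumulates sys_id's and
// performs one delete operation per batch.
function chunk(ids, batchSize) {
    var batches = [];
    for (var i = 0; i < ids.length; i += batchSize) {
        batches.push(ids.slice(i, i + batchSize));
    }
    return batches;
}

// With a run of 10 ids and batchSize 3, a run performs 4 deletions:
// three full batches and one final partial batch.
var batches = chunk([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 3);
console.log(batches.length); // 4
```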
‎06-19-2017 09:42 AM
Sylvia,
We are glad you took advantage of the ServiceNow Community to learn more and get your questions answered. The Customer Experience Team is working hard to ensure the Community experience is optimal for our customers.
If you feel that your question was answered, we would greatly appreciate if you could mark the appropriate thread as "Correct Answer". This allows other customers to learn from your thread and improves the ServiceNow Community experience.
If you are viewing this from the Community inbox you will not see the correct answer button. If so, please review How to Mark Answers Correct From Inbox View.
Thanks,
Shivani Patel