How do I transfer changes that don't get captured in update sets?

SandeepKSingh
Kilo Sage

Can anyone help me with a script?

1 ACCEPTED SOLUTION

Ravi Gaurav
Giga Sage

Hi @SandeepKSingh 
Here are two ways:
1. Transfer the record via XML export: right-click the record's form header and choose Export > XML, then import the file on the target instance from the list header's Import XML option.
2. Use the GlideUpdateManager2 API, as shown below, to capture any record in the current update set:

var um = new GlideUpdateManager2();
um.saveRecord(current); // 'current' is a GlideRecord; pass any other GlideRecord object if required

Note: this script needs to run as a background script (System Definition > Scripts - Background).
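
For example, here is a minimal sketch of capturing one specific record from a background script. The table name 'u_my_table' and the sys_id are placeholders for illustration, not values from the original post:

// Capture one record into the current update set.
// Assumption: 'u_my_table' and the sys_id are placeholders -- substitute your own.
var rec = new GlideRecord('u_my_table');
if (rec.get('replace_with_target_sys_id')) {
    var um = new GlideUpdateManager2();
    um.saveRecord(rec); // writes a sys_update_xml entry into the current update set
    gs.info('Captured ' + rec.getDisplayValue() + ' in the current update set');
}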

--------------------------------------------------------------------------------------------------------------------------


If you found my response helpful, I would greatly appreciate it if you could mark it as "Accepted Solution" and "Helpful."
Your support not only benefits the community but also encourages me to continue assisting. Thank you so much!

Thanks and Regards
Ravi Gaurav | ServiceNow MVP 2025, 2024 | ServiceNow Practice Lead | Solution Architect
CGI
M.Tech in Data Science & AI

YouTube: https://www.youtube.com/@learnservicenowwithravi
LinkedIn: https://www.linkedin.com/in/ravi-gaurav-a67542aa/


7 REPLIES

Any possibility of a script? I was looking for a script instead of a write-up.

Use this script to collect the data:

 

var tableName = 'your_table'; // Replace with the table name
var chunkSize = 100; // Number of records per chunk
var gr = new GlideRecord(tableName);
gr.query();

var dataChunk = [];
var recordCount = 0;
var chunkCount = 1;

while (gr.next()) {
    var record = {
        sys_id: gr.getValue('sys_id'), // Add fields you need
        name: gr.getValue('name'), // Replace 'name' with your field
        description: gr.getValue('description') // Replace with your fields
    };
    dataChunk.push(record);
    recordCount++;

    // If chunk size is reached or it's the last record, log the chunk
    if (recordCount % chunkSize === 0 || !gr.hasNext()) {
        gs.info('Chunk ' + chunkCount + ': ' + JSON.stringify(dataChunk));
        dataChunk = []; // Clear the chunk
        chunkCount++;
    }
}

 

Steps:

  1. Run this script in Background Scripts on the source instance.
  2. Copy each chunk of JSON data from the logs.
  3. Save them as multiple JSON files (e.g., data_chunk1.json, data_chunk2.json).
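
If exporting an entire table is more than you need, narrow the query before chunking. A minimal sketch; the encoded query below is an assumption, so substitute your own filter:

// Assumption: the filter is illustrative -- replace it with your own encoded query.
var gr = new GlideRecord(tableName);
gr.addEncodedQuery('sys_updated_on>javascript:gs.daysAgoStart(30)'); // records touched in the last 30 days
gr.query();
// ...then reuse the chunking loop above unchanged.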

 

Use this script to transfer the data:

 

// Example JSON data chunk
var dataChunk = [
    {
        "sys_id": "abcd1234efgh5678ijkl9012",
        "name": "Record 1",
        "description": "This is record 1."
    },
    {
        "sys_id": "mnop1234qrst5678uvwx9012",
        "name": "Record 2",
        "description": "This is record 2."
    }
];

dataChunk.forEach(function(record) {
    var gr = new GlideRecord('your_table'); // Replace with the target table name
    if (gr.get('sys_id', record.sys_id)) {
        // Update the record if it exists
        gr.setValue('name', record.name);
        gr.setValue('description', record.description);
        gr.update();
    } else {
        // Insert a new record if it doesn't exist.
        // setNewGuidValue() makes the insert keep the source sys_id;
        // setting the sys_id field directly is not reliable on insert.
        gr.initialize();
        gr.setNewGuidValue(record.sys_id);
        gr.setValue('name', record.name);
        gr.setValue('description', record.description);
        gr.insert();
    }
});

gs.info('Data chunk imported successfully!');

 

This script imports one JSON chunk at a time. Run it in Background Scripts on the target instance, pasting each chunk from your saved JSON files into the dataChunk variable before each run.
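
As a sanity check after the final run, compare row counts between the source and target instances. A minimal sketch, reusing the placeholder table name:

// Count rows on the target to verify the import.
var ga = new GlideAggregate('your_table'); // replace with the target table name
ga.addAggregate('COUNT');
ga.query();
if (ga.next()) {
    gs.info('your_table now holds ' + ga.getAggregate('COUNT') + ' records');
}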

Hi @yuvarajkate
How are you writing such fast and vague answers? The answer above doesn't address the question I asked.
It's just a bad answer, written without reading the content.