Clone user and user group tables from PROD to a lower instance using a scheduled job that runs weekly

JMD FAROOQ
Tera Contributor

How can we clone the sys_user and sys_user_group tables from PROD to a lower instance using scheduled jobs, so that the clone runs weekly?


This could be one approach (the scripts below are illustrative examples only):

 

Step 1: Set Up a Data Export in PROD

  1. Create Export Sets:

    • Go to System Export Sets > Create Export Set.
    • Define export sets for sys_user and sys_user_group tables.
  2. Schedule Export Sets:

    • Create scheduled jobs to export data on a weekly basis.
    • Go to System Definition > Scheduled Jobs.
    • Create a new Scheduled Job and use a script to export data to XML or CSV.
    • Use the following sample script for exporting data:
       

 

// Note: GlideExportXML is an illustrative placeholder, not a documented ServiceNow API.
var exportUser = new GlideExportXML('sys_user');
exportUser.setRecordLimit(10000); // Adjust the limit as needed
exportUser.setTargetFileName('sys_user_export.xml');
exportUser.setAttachment(false);
exportUser.export();

var exportGroup = new GlideExportXML('sys_user_group');
exportGroup.setRecordLimit(10000); // Adjust the limit as needed
exportGroup.setTargetFileName('sys_user_group_export.xml');
exportGroup.setAttachment(false);
exportGroup.export();
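For reference, table exports produce an unload-style XML document. The sketch below (plain JavaScript, runnable outside ServiceNow; the function names and sample field values are hypothetical) shows the general shape of such a file being built from record objects:

```javascript
// Illustrative sketch: build an unload-style XML document from record
// objects, similar in shape to what a table export produces.
function escapeXml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

function toUnloadXml(tableName, records) {
  var rows = records.map(function (rec) {
    var fields = Object.keys(rec).map(function (key) {
      return '    <' + key + '>' + escapeXml(rec[key]) + '</' + key + '>';
    }).join('\n');
    return '  <' + tableName + '>\n' + fields + '\n  </' + tableName + '>';
  }).join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n<unload>\n' + rows + '\n</unload>';
}

var xml = toUnloadXml('sys_user', [
  { user_name: 'abel.tuter', first_name: 'Abel', last_name: 'Tuter' }
]);
console.log(xml);
```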

 

Step 2: Transfer the Exported Files

  1. Transfer Files via SCP or SFTP:
    • Use a MID Server script to transfer the exported files from PROD to DEV.
    • The script can be run as a scheduled job.
    • Sample script for transferring files:
       

 

// Note: SCPClient is an illustrative placeholder; a real transfer would go through
// your MID Server / SFTP tooling.
var scp = new SCPClient('mid.server.name');
scp.scpToLocal('path/to/prod/sys_user_export.xml', 'path/to/dev/sys_user_export.xml');
scp.scpToLocal('path/to/prod/sys_user_group_export.xml', 'path/to/dev/sys_user_group_export.xml');

 

Step 3: Import Data in DEV

  1. Create Import Sets:

    • Go to System Import Sets > Load Data.
    • Load the XML or CSV files into import sets.
  2. Create Transform Maps:

    • Create transform maps for sys_user and sys_user_group.
    • Map fields accordingly.
  3. Schedule Data Import:

    • Create a scheduled job to import data into DEV.
    • Use the following sample script to import data:
       

 

// Note: GlideImportSetTable is an illustrative placeholder, not a documented ServiceNow API.
var importUser = new GlideImportSetTable('sys_user_import');
importUser.setSource('path/to/dev/sys_user_export.xml');
importUser.load();

var importGroup = new GlideImportSetTable('sys_user_group_import');
importGroup.setSource('path/to/dev/sys_user_group_export.xml');
importGroup.load();
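To make the transform-map step concrete, the plain-JavaScript sketch below models what a transform map with coalescing does: source rows are matched against existing target rows on a coalesce field (`user_name` here, a common choice for sys_user); matches are updated in place and the rest are inserted. The function and sample data are illustrative, not platform code:

```javascript
// Sketch of transform-map behavior with coalescing: update matched
// records, insert unmatched ones.
function transform(targetRows, sourceRows, coalesceField) {
  var byKey = {};
  targetRows.forEach(function (row) { byKey[row[coalesceField]] = row; });
  sourceRows.forEach(function (src) {
    var existing = byKey[src[coalesceField]];
    if (existing) {
      Object.assign(existing, src);   // coalesce match: update the record
    } else {
      byKey[src[coalesceField]] = src; // no match: insert a new record
      targetRows.push(src);
    }
  });
  return targetRows;
}

var target = [{ user_name: 'abel.tuter', email: 'old@example.com' }];
var source = [
  { user_name: 'abel.tuter', email: 'abel.tuter@example.com' },
  { user_name: 'beth.anglin', email: 'beth.anglin@example.com' }
];
var result = transform(target, source, 'user_name');
console.log(JSON.stringify(result));
```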

 

Step 4: Automate the Entire Process

  1. Combine All Scripts:

    • Combine the export, transfer, and import scripts into one scheduled job or workflow.
    • Use IntegrationHub if available to streamline the process.
  2. Create Scheduled Flow/Job:

    • Create a scheduled flow or job in ServiceNow to run the combined script weekly.

Full Sample Script

Here’s a combined script to automate the process:

 

(function() {
    // Illustrative end-to-end sketch; the Glide*/SCPClient classes used below
    // are placeholders, not documented ServiceNow APIs.
    // Export data from PROD
    var exportUser = new GlideExportXML('sys_user');
    exportUser.setRecordLimit(10000);
    exportUser.setTargetFileName('sys_user_export.xml');
    exportUser.setAttachment(false);
    exportUser.export();

    var exportGroup = new GlideExportXML('sys_user_group');
    exportGroup.setRecordLimit(10000);
    exportGroup.setTargetFileName('sys_user_group_export.xml');
    exportGroup.setAttachment(false);
    exportGroup.export();

    // Transfer files to DEV (Assuming MID Server is used for transfer)
    var scp = new SCPClient('mid.server.name');
    scp.scpToLocal('path/to/prod/sys_user_export.xml', 'path/to/dev/sys_user_export.xml');
    scp.scpToLocal('path/to/prod/sys_user_group_export.xml', 'path/to/dev/sys_user_group_export.xml');

    // Import data in DEV
    var importUser = new GlideImportSetTable('sys_user_import');
    importUser.setSource('path/to/dev/sys_user_export.xml');
    importUser.load();

    var importGroup = new GlideImportSetTable('sys_user_group_import');
    importGroup.setSource('path/to/dev/sys_user_group_export.xml');
    importGroup.load();
})();

 

Schedule the Job

  1. Create a Scheduled Script Execution:
    • Go to System Definition > Scheduled Jobs.
    • Create a new job with the script above.
    • Schedule it to run weekly.
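The weekly cadence itself is handled by the platform scheduler, but as a purely illustrative sketch, the snippet below computes the next run time as exactly seven days after the last one (plain JavaScript, runnable anywhere; the function name is made up):

```javascript
// Illustrative only: the scheduler computes this for a real weekly job.
function nextWeeklyRun(lastRun) {
  var next = new Date(lastRun.getTime());
  next.setUTCDate(next.getUTCDate() + 7); // advance exactly one week
  return next;
}

var last = new Date(Date.UTC(2024, 0, 1, 2, 0, 0)); // Mon 2024-01-01 02:00 UTC
var next = nextWeeklyRun(last);
console.log(next.toISOString()); // 2024-01-08T02:00:00.000Z
```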

Please mark any helpful or correct solutions as such. That helps others find their solutions.
Mark