01-10-2017 11:26 AM
Hi,
I was wondering how to integrate multiple SCCM instances with ServiceNow. There is a direct setup option in the module, but it holds the information of only one SCCM database, and I need to integrate another SCCM database from a different server with the same ServiceNow instance. Can anyone please advise?
Thank you

01-10-2017 11:45 AM
Hi Saketh,
This can be done by duplicating the Data Sources.
You would need to do this for all CI classes in scope.
To avoid the Setup configuration overwriting the duplicated Data Sources, you would need to deactivate the Business Rule on Setup named "Update Data Sources".
You can keep one SCCM configured in the Setup module, and configure the second SCCM in the Data Sources module.
While testing the connection you may receive an error, but separately these two configurations would work fine.
The Scheduled Jobs can be modified accordingly to meet the requirement of multiple SCCM instances.
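For reference, a minimal sketch of deactivating that Business Rule from a background script, assuming it lives in the sys_script table under the exact name "Update Data Sources" (check the name in your instance before running):

var br = new GlideRecord('sys_script');
br.addQuery('name', 'Update Data Sources');
br.query();
while (br.next()) {
    br.active = false; // stop Setup from overwriting the duplicated Data Sources
    br.update();
}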
Regards,
Aditya Shrivastava
PS: Please mark as correct/helpful/like if you find this response useful in any way. Thanks!
02-04-2018 12:51 PM
Saketh,
I would like to add something else you need to watch out for. Yes, you can create new Data Sources for the other SCCM DB, but you also want to give that DB its own source value. The reason is that if the client wants to know which devices are coming from which SCCM DB, you can run a query against the source. Second, you need to update the "sys_object_source" table with that SCCM value. Say you have SCCM 1 and SCCM 2: you could update the transform map to save the source value on the sys_object_source table. This is key because each SCCM has duplicate resource IDs, and you don't want SCCM 1 updating a computer from SCCM 2 that happens to have the same resource ID.
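A minimal sketch of the kind of onAfter transform script this describes, assuming the OOTB sys_object_source field names (name, id, target_table, target_sys_id, last_scan) and using a hypothetical source value of 'SCCM 2' for the second database:

// onAfter transform script (sketch): stamp a per-database source name so
// records from SCCM 1 and SCCM 2 with the same resource ID stay distinct.
var os = new GlideRecord('sys_object_source');
os.addQuery('target_sys_id', target.sys_id);
os.addQuery('name', 'SCCM 2'); // hypothetical source value for the 2nd SCCM DB
os.query();
var found = os.next();
if (!found) {
    os.initialize();
    os.name = 'SCCM 2';
    os.target_table = target.getTableName();
    os.target_sys_id = target.sys_id;
}
os.id = source.u_resourceid;        // resource ID from this SCCM database
os.last_scan = new GlideDateTime(); // record when it was last seen
if (found) {
    os.update();
} else {
    os.insert();
}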
07-08-2019 09:58 AM
Patrick is correct. I ran into this and had to change the Identity transform map in two places: the coalesce field map script and the onAfter transform script. Change the IMPORT_NAME value to something else, and then replace the 'SCCM' value in the onAfter script with that same new value where it has: new ObjectSource('SCCM',
07-31-2019 11:33 PM
I found a much easier way to handle this, which involves appending the database name to the resource ID during the transform and dynamically updating the Data Sources based on a table that contains the DB name, IP address, active status and order. I needed to integrate 4 SCCM DBs. Several of the transforms have a script-based transform for the cmdb_ci_computer sys_id that you can simply change to answer = SCCMHelper.findComputer(source.u_resourceid+'-'+source.sys_import_set.data_source.database_name); in order to append the current DB name to the resource ID and make it unique. The computer identification transform has a similar script transform for the target correlation ID that uses the same format. There are several other transforms and scripts that should be changed, so just look for any that reference the resource ID and append the DB name.
For rotating through the DBs, I just made it a periodic call and then added an Execute pre-import script to each scheduled data source. I put the logic in the SCCMHelper script include so all scheduled data sources use the same call, which is SCCMHelper.ChangeDataSource(data_source); (see the sketch below).
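For reference, the two hook points described above might look like this (a sketch; the field and helper names are as given in this post):

// 1) Script-based field transform for the cmdb_ci_computer sys_id (and the
//    correlation ID): append the originating DB name to make the key unique.
answer = SCCMHelper.findComputer(
    source.u_resourceid + '-' + source.sys_import_set.data_source.database_name);

// 2) "Execute pre-import script" on each scheduled data source: rotate the
//    shared Data Source to the next active SCCM DB before this load runs.
SCCMHelper.ChangeDataSource(data_source);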
Below is the script include; not very elegant but I was in a hurry. You only need to set the db name and IP address in the data source each time (and MID server if more than one is used and/or credentials). I used the MID server name in the table and did a lookup so it would work between instances without mods. We set the credentials to match in all DBs so there was no need to update the creds.
SCCMHelper.ChangeDataSource = function(data_source) {
    // Find the DB currently configured on the data source in the custom rotation table
    var db_gr = new GlideRecord('u_sccm_db');
    var cur_order = 0;
    var found_rec = false;
    if (db_gr.get('u_db_name', data_source.database_name) == true)
        cur_order = db_gr.u_order.toString();

    // Walk the active DBs in order and stop on the one after the current DB
    db_gr.initialize();
    db_gr.orderBy('u_order');
    db_gr.addEncodedQuery('u_active=true');
    db_gr.query();
    while (db_gr.next()) {
        if (db_gr.u_order.toString() == cur_order) {
            if (db_gr.next()) { // This is next in order
                found_rec = true;
                break;
            }
        }
    }
    if (found_rec == false) { // Current DB was the last record, so wrap around to the first
        db_gr.initialize();
        db_gr.orderBy('u_order');
        db_gr.addEncodedQuery('u_active=true');
        db_gr.query();
        db_gr.next();
    }

    // Point the shared data source at the next DB
    data_source.database_name = db_gr.u_db_name;
    data_source.jdbc_server = db_gr.u_ip_address;
    // data_source.connection_url = connection;

    // Resolve the MID Server sys_id from its name so this works across instances
    var m_svrgr = new GlideRecord('ecc_agent');
    if (m_svrgr.get('name', db_gr.u_mid_server_name) == true)
        data_source.mid_server = m_svrgr.sys_id;
    data_source.update();
};
I was transitioning from one DB to 4, so just to be safe I cleared out the old SCCM-related correlation IDs (actually the ID field) from the sys_object_source table. Using this method, the correlation IDs end up the same in both the sys_object_source and cmdb_ci_computer tables, so everything works as normal. Only the SCCM integration knows what these types of correlation IDs are, so it does not impact Discovery.
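If you need to do the same cleanup, a hedged background-script sketch (assuming the OOTB source name is 'SCCM'; test on a sub-production instance first):

// Remove the old-format SCCM correlation rows from sys_object_source so the
// new "resourceID-DBname" keys start clean.
var os = new GlideRecord('sys_object_source');
os.addQuery('name', 'SCCM');
os.query();
while (os.next()) {
    os.deleteRecord();
}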
04-28-2021 05:22 PM
Yes. The problem with duplicating Scheduled Imports, Data Sources, Transform Maps and Field Transforms is that you end up with hundreds of duplicate records.
A smarter way is to use the Conditional Script on the Scheduled Import (script attached below). Please read the comments in the script.
/*
Last Modified: 2021-04-29: Doug Connell
This script does the following:
A) It checks the SN instance name to see if data should be loaded from this Data Source. The SN PROD instance should only get data from the NZ PROD SCCM instance. It also won't run on weekends. Apparently there are other discovery jobs on the weekend that could conflict (not sure).
B) This script then "borrows" or uses the out-of-the-box Data Sources and Scheduled Imports: more specifically, it updates the config for all Data Source records that start with "SCCM 2016..." and where category = "SCCM 2016 Integration".
C) This script then executes the parent Scheduled Job for these Data Sources, which is called "SCCM System 2016 Import". The child records are then executed automatically.

MORE INFORMATION
================
This script copies the Data Source information from this record into all the out-of-the-box (OOTB) data_source records that are provided with the "Integration - Microsoft SCCM 2016" plugin. All of these records have a category value of "SCCM 2016 Integration" and a Name starting with "SCCM 2016 ...".
This configuration (this script) allows the plugin to be used for more than one SCCM database.
All that is required is a pair of records for each SCCM database, instead of the 100+ records for all the Data Sources, Transform Maps, and Field Transformations that we would otherwise need to duplicate. It is a much cleaner and easier implementation. The only record pair required for each SCCM database is:
1) a "Scheduled Import" record and its associated
2) "Data Source" record
*/

// ################################################################
/* FOR TESTING - RUN THIS SCRIPT AS A BACKGROUND SCRIPT - AND UNCOMMENT:
1) The two lines below starting with: var current = ...
2) The line below starting with: gslog.setLevel("debug");
The sys_id below is the sys_id of this Scheduled Import.
*/
// var current = new GlideRecord("scheduled_import_set");
// current.get("066fb0771b27e090e95855ba274bcbd6");
// ################################################################

// Set logging
var gslog = new GSLog("ANZ.SCCM_2016.NZNP_Data_Import.debug", "SCCM_2016");
gslog.setLevel("debug");

// A) CHECK IF WE SHOULD RUN
var result = ifScript();
if (result) {
    // B) UPDATE ALL THE OOTB DATA SOURCES
    updateDataSources();
    // C) EXECUTE NOW
    executeNow();
}

// D) MAKE SURE THIS SCHEDULE DOES NOT RUN. THE OOTB SCHEDULED JOBS WILL EXECUTE INSTEAD AS PROXIES FOR THIS SCRIPT.
answer = false;

// ################################################################
// A) CHECK IF WE SHOULD EXECUTE OR NOT
function ifScript() {
    var result = false;
    var instanceName = gs.getProperty("instance_name");
    if (instanceName == "anztech") {
        // Do NOT discover the PROD SCCM instance if we are on NON-PROD ServiceNow platforms.
        result = false;
    } else {
        // Only execute this script on weekdays
        var now = new GlideDateTime();
        result = (now.getDayOfWeek() < 6);
    }
    gslog.logDebug("ifScript result=" + result);
    return result;
}

// ################################################################
// B) UPDATE THE OOTB DATA SOURCES
function updateDataSources() {
    gslog.logDebug("current.data_source.mid_server.name: " + current.data_source.mid_server.name);
    gslog.logDebug("current.data_source.mid_server.sys_id: " + current.data_source.mid_server.sys_id);
    var gr = new GlideRecord("sys_data_source");
    gr.addEncodedQuery("category=SCCM 2016 Integration^nameSTARTSWITHSCCM");
    gr.query();
    while (gr.next()) {
        gr.setValue("mid_server", current.data_source.mid_server.sys_id);
        gr.setValue("format", current.data_source.format);
        gr.setValue("instance_name", current.data_source.instance_name);
        gr.setValue("database_name", current.data_source.database_name);
        gr.setValue("database_port", current.data_source.database_port);
        gr.setValue("use_integrated_authentication", current.data_source.use_integrated_authentication);
        gr.setValue("jdbc_server", current.data_source.jdbc_server);
        gr.setValue("query", current.data_source.query);
        gslog.logDebug("Updating Record: " + gr.name);
        gr.update();
    }
}

// ################################################################
// C) EXECUTE NOW - THE OOTB PARENT SCHEDULED DATA LOAD
function executeNow() {
    // Execute the parent Scheduled Job; its child imports run automatically
    var job = new GlideRecord("scheduled_import_set");
    job.addQuery("name", "SCCM System 2016 Import");
    job.query();
    if (job.next()) {
        // Trigger the Scheduled Job to execute
        SncTriggerSynchronizer.executeNow(job);
        // action.setRedirectURL(job);
        gslog.logDebug("Triggering the Scheduled Job: " + job.name);
    }
}