How do I configure the GCP Service Graph Connector?
GCP Environment Setup
Version: 1.10
Organization: Deepdiscoveryvms.com
No of Projects in Organization: 1 -> Deepdiscovery
Sample Linux VM to Monitor: linux-e2-vm
Compute Engine API already Enabled (Yes/No): Yes
Cloud Asset API already Enabled (Yes/No): No - Required for Basic Discovery
Cloud Resource Manager API already Enabled (Yes/No): No - Required for Basic Discovery
VM Manager already Enabled (Yes/No): No - Required for Basic Discovery
OS Config API already Enabled (Yes/No): No - Required for Deep Discovery
Note: Automatically enabled when VM Manager is Enabled
Storage API already Enabled (Yes/No): No - Required for Deep Discovery
Established Security Policy allows One Single Service Account to have Read Access to all the Projects in the Organization (Yes/No): Yes
The ServiceNow Service Account will be created in our Deepdiscovery project and will have Read Access to all the Projects in our deepdiscoveryvms Organization. If your established Security Policy does not allow a single Service Account to have Read Access to all the Projects in an Organization, then multiple Service Accounts will need to be created, each with access to an assigned group of Projects (please refer to the 1.2 Multiple Service Accounts for the organization section of the Service Graph Connector for GCP - Setup Instructions Knowledge Base Article for more details on this Configuration).
Disabled Service Account Key Creation Organization Policy is Inactive (Yes/No): Yes
Note: This Disabled Service Account Key Creation Organization Policy is Active by Default. It needs to be marked as Inactive to allow for Service Account Key Creation.
Service Account associated with VMs has Cloud OS Config Service Agent Role (Yes/No): No - Required for Deep Discovery
A <Project ID>-compute@developer.gserviceaccount.com Service Account, with no Roles associated with it, gets created when the Compute Engine API is enabled. This is the Service Account that gets associated with newly created VMs by default.
To override this default setting, a Service Account that has the Cloud OS Config Service Agent Role, also referred to as an OS Config Service Account, needs to be explicitly associated with new VMs at creation time (please refer to the Google Cloud Platform Create a VM that uses a user-managed service account Documentation page for more details).
For VMs that already exist, their Service Account will need to be changed from the <Project ID>-compute@developer.gserviceaccount.com Service Account to the OS Config Service Account (please refer to the Google Cloud Platform Change the attached service account Documentation page for more details).
Service Account associated with VMs has Full Cloud API Access Scope (Yes/No): No - Required for Deep Discovery
The Service Account associated with the VMs has Cloud API access scopes set to Allow default access by default, and the default scope grants only Read Only access to the Storage API. The Cloud API access scopes setting needs to be changed from Allow default access to Allow full access to all Cloud APIs so that the Service Account has Full Storage API access and can write VM output data to the Storage Bucket.
GCP Service Graph Connector Features
Enabling Deep Discovery Data Collection (Yes/No): Yes
ServiceNow Environment Setup
Software Asset Management Enabled (Yes/No): Yes
The following topics are covered in this How do I configure the GCP Service Graph Connector? Article:
A. Installing GCP Service Graph Connector on your ServiceNow Instance
B. Configuring GCP for monitoring your GCP VMs
C. Analyze your VMs in GCP
D. Configuring GCP Service Graph Connector on your ServiceNow Instance
E. Run GCP Service Graph Connector Scheduled Data Import Jobs on your ServiceNow Instance
F. Analyze the CMDB Records created/updated by the GCP Service Graph Connector for your Linux VM in your ServiceNow Instance
G. When to use GCP Service Graph Connector vs Cloud Discovery
A. Installing GCP Service Graph Connector on your ServiceNow Instance
(i) Login to your ServiceNow Instance
(ii) Install the following Application from the ServiceNow Store:
Service Graph Connector for GCP: sn_gcp_integ
The following Applications are automatically installed/activated when you install this application:
- Discovery and Service Mapping Patterns: sn_itom_pattern
- Integration Commons for CMDB: sn_cmdb_int_util
- CMDB CI Class Model: sn_cmdb_ci_class
The following Plugins are automatically installed/activated when you install this application:
- Discovery Core: com.snc.discovery.core
- Discovery - IP Based: com.snc.discovery.ip_based
- ITOM Discovery License: com.snc.itom.discovery.license (Included with full Discovery Product)
- ITOM Licensing: com.snc.itom.license
- Pattern Designer (NG version): com.snc.ng.pattern.designer
- ServiceNow IntegrationHub Action Template: com.glide.hub.action_type.datastream
(iii) Navigate to Setup under GCP in the Filter Menu and select it to bring you to Guided Setup.
(iv) Navigate to the Setup OS Config Patch Jobs for extended discovery section.
(v) Click Configure to the right of the Download script files step in this section to Download a GCP Commands Zip file to your Desktop.
(vi) Unzip this GCP Commands Zip file to a folder on your Desktop to extract the following Shell Scripts.
The one highlighted in bold, gcpShell.sh, is going to be used in the D. Configuring GCP Service Graph Connector on your ServiceNow Instance Section:
- gcpPowershell.sh - Deep Discovery on Windows VMs
- gcpShell.sh - Deep Discovery on Linux VMs
B. Configuring GCP for monitoring your GCP VMs
The sub sections listed below describe all the Setup that needs to be done in GCP before Configuring and Running the GCP Service Graph Connector:
- ServiceNow Service Account Setup
- Enabling Basic Discovery
- Enabling Deep Discovery
ServiceNow Service Account Setup
*1. Create SnowOrgRORole Role at the Organization Level
*This step can only be performed by your GCP Administrator.
(i) Log into the GCP Portal using your GCP Account
(ii) Navigate to your Organization, Deepdiscoveryvms.com in our case.
(iii) Navigate to the IAM & Admin\Roles Menu option in the Left Hand Navigator Menu to bring up the Roles for "<Organization Name>" organization Screen, in our case the Roles for "deepdiscoveryvms.com" organization Screen.
(iv) Click on the +Create Role Push button at the top of the screen to bring up the below Create Role Screen:
Title Field: Populate with SnowOrgRORole
ID Field: Populate with SnowOrgRORole
(v) Click on the +Add permissions Pushbutton on this screen to bring up the below Add Permissions screen with the 1st 10 of 12378 OOTB permissions displayed:
(vi) Select the following Permissions from this List:
- resourcemanager.organizations.get
- resourcemanager.folders.list
- resourcemanager.folders.get
- cloudasset.assets.listOSInventories
- cloudasset.assets.exportOSInventories
- cloudasset.assets.listResource
- cloudasset.assets.exportResource
- cloudasset.assets.searchAllResources
- compute.disks.get
- compute.images.list
- compute.machineImages.get
- compute.machineImages.list
- compute.machineTypes.get
- compute.zones.get
(vii) Click on the Add pushbutton to add these Permissions to the SnowOrgRORole Role before being brought back to the Create Role Screen
--> You should see these newly added Permissions displayed in the assigned permissions List on the Create Role Screen
(viii) Click on the Create Pushbutton on the Create Role Screen to create the new SnowOrgRORole Role in your Organization
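Steps (i) to (viii) above can also be scripted with the gcloud CLI. A minimal sketch, assuming the organization ID 123456789012 is a placeholder for your own numeric organization ID (shown by `gcloud organizations list`):

```shell
# Sketch: create the SnowOrgRORole custom role at the organization level,
# with the same permission list selected in the console steps above.
gcloud iam roles create SnowOrgRORole \
  --organization=123456789012 \
  --title="SnowOrgRORole" \
  --permissions=resourcemanager.organizations.get,resourcemanager.folders.list,resourcemanager.folders.get,cloudasset.assets.listOSInventories,cloudasset.assets.exportOSInventories,cloudasset.assets.listResource,cloudasset.assets.exportResource,cloudasset.assets.searchAllResources,compute.disks.get,compute.images.list,compute.machineImages.get,compute.machineImages.list,compute.machineTypes.get,compute.zones.get
```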
*2. Create ServiceNow snow-sgc-sa Service Account in a Designated Project
*This step can only be performed by your GCP Administrator
In this step you will be creating the ServiceNow snow-sgc-sa Service Account in the project that you decide should house it.
(i) Navigate to the Project that you decide will contain your ServiceNow snow-sgc-sa Service Account, the DeepDiscovery project in our case.
(ii) Navigate to the IAM & Admin\Service Accounts Menu option in the Left Hand Navigator Menu to bring up the Service Accounts for project "<Project Name>" Screen, in our case the Service Accounts for project "DeepDiscovery" Screen.
(iii) Click on the +Create Service Account Pushbutton at the top of this screen to bring up the below Create Service Account Screen:
Service account name Field: Populate with snow-sgc-sa
Service account ID Field: Will be automatically populated with snow-sgc-sa
Email address: Will be automatically populated with snow-sgc-sa@<projectname>.iam.gserviceaccount.com, in our case snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com
(iv) Make a note of the snow-sgc-sa Service Account Email address. You will be providing it in the Configure the Connection and Credentials - Edit Default Connection Guided Setup sub section outlined in the D. Configuring GCP Service Graph Connector on your ServiceNow Instance Section lower down.
(v) Click on the Done pushbutton on the Create Service Account Screen to create the new snow-sgc-sa Service Account
--> You will be brought back to the Service Accounts for project "<Project Name>" Screen where you should see your new snow-sgc-sa Service Account included in the Service Account List on this Screen.
(vi) Confirm that you see the new snow-sgc-sa Service Account included in the Service Account List on this Screen.
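The same Service Account can be created from the command line. A minimal sketch, assuming the deepdiscovery project ID used in this article:

```shell
# Sketch: create the snow-sgc-sa service account in the designated project.
gcloud iam service-accounts create snow-sgc-sa \
  --project=deepdiscovery \
  --display-name="snow-sgc-sa"

# The resulting email address, which you will need later in Guided Setup:
# snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com
```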
*3. Create Key for the ServiceNow snow-sgc-sa Service Account
*This step can only be performed by your GCP Administrator
In this step you will be creating a new Private Key for the snow-sgc-sa Service Account and saving it to a Java Key Store .P12 file. You will be providing this Java Key Store file along with its associated password in the Configure the Connection and Credentials - Create X.509 certificate Guided Setup sub section outlined in the D. Configuring GCP Service Graph Connector on your ServiceNow Instance Section lower down.
(i) Click on the snow-sgc-sa Service Account displayed in the Service Account List on the Service Accounts for project "<Project Name>" Screen to bring up the snow-sgc-sa Service account details Screen shown below:
(ii) Navigate to the Keys Tab on this Screen and select Create new Key from the Add Key Menu pulldown (displayed on the Keys Tab) to bring up the below Create private key for "snow-sgc-sa" screen:
(iii) Select the P12 radio button on this screen and click on Create
- A <projectname>-xxxxxxx.p12 file will be downloaded to your Hard Disk. This is the Java Key Store .P12 file that you will be uploading to your ServiceNow Instance in the Configure the Connection and Credentials - Create X.509 certificate Guided Setup sub section outlined in the D. Configuring GCP Service Graph Connector on your ServiceNow Instance Section lower down.
- The Private key saved to your computer screen will be displayed
(iv) Make a note of the Private key password displayed on this screen. You will be providing it in the following Guided Setup sub sections outlined in the D. Configuring GCP Service Graph Connector on your ServiceNow Instance Section lower down:
- Configure the Connection and Credentials - Create X.509 certificate
- Configure the Connection and Credentials - Edit Default Connection
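The key can also be created from the command line. A minimal sketch; note that, per Google's documentation, P12 keys downloaded this way are protected with the default password notasecret:

```shell
# Sketch: create a P12-format key for the snow-sgc-sa service account
# and save it to a local file for upload to the ServiceNow instance.
gcloud iam service-accounts keys create snow-sgc-sa-key.p12 \
  --iam-account=snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com \
  --key-file-type=p12
```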
*4. Grant ServiceNow snow-sgc-sa Service Account access to your Organization
*This step can only be performed by your GCP Administrator
In this Step you will be granting the ServiceNow snow-sgc-sa Service Account access to your Organization via the SnowOrgRORole Role that you created in the first *1. Create SnowOrgRORole Role at the Organization Level step.
(i) Navigate back to your Organization, in our case Deepdiscoveryvms.com
(ii) Navigate to the IAM & Admin\IAM Menu option in the Left Hand Navigator Menu to bring up the IAM\Allow Permissions for Organization "<Organization Name>" Screen, in our case the IAM\Allow Permissions for Organization "Deepdiscoveryvms.com" Screen.
(iii) Click on the Grant Access Pushbutton on this Screen to bring up the Grant access to "<Organization Name>" Screen. The screen shot below shows the Grant access to "deepdiscoveryvms.com" Screen for our Deepdiscoveryvms.com organization:
New principals Field: Populate with the snow-sgc-sa Service Account that you created in the previous *2. Create ServiceNow snow-sgc-sa Service Account in a Designated Project step.
Select a role Field: Populate with the SnowOrgRORole Role that you created in the first *1. Create SnowOrgRORole Role at the Organization Level step.
(iv) Click on the Save pushbutton to grant the snow-sgc-sa Service Account access to your Organization via the SnowOrgRORole Role.
--> You will be brought back to the IAM\Allow Permissions for Organization "<Organization Name>" Screen where you should see your snow-sgc-sa Service Account included in the View by principals List on this Screen with the SnowOrgRORole Role association.
The screen shot below shows the IAM\Allow Permissions for Organization "deepdiscoveryvms.com" Screen for our Deepdiscoveryvms.com organization after the snow-sgc-sa Service Account has been granted access to our Deepdiscoveryvms.com organization via the SnowOrgRORole Role.
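The grant can also be applied from the command line. A minimal sketch, again using 123456789012 as a placeholder organization ID; a custom organization-level role is referenced by its full resource name:

```shell
# Sketch: bind the custom SnowOrgRORole role to the snow-sgc-sa
# service account at the organization level.
gcloud organizations add-iam-policy-binding 123456789012 \
  --member="serviceAccount:snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com" \
  --role="organizations/123456789012/roles/SnowOrgRORole"
```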
Enabling Basic Discovery
The requirements for enabling the collection of Basic Discovery data from your GCP VM's are listed below:
- Cloud Asset API enabled on all projects containing Resources you want discovered
- Cloud Resource Manager API enabled on all projects containing Resources you want discovered
- VM Manager enabled on all projects
- OS Config Agent installed on all VMs you want discovered
5. Enabling Cloud Asset API at the Project Level
In this step you will be enabling the Cloud Asset API(cloudasset.googleapis.com) on all the projects that contain Resources that you want to be discovered by the Service Graph Connector for GCP. The Google Cloud Asset API Service manages the Inventory and history of Google Cloud Resources (Please refer to this Google Cloud Platform Cloud Asset API Documentation page for more details).
The *7. Enable VM Manager step further down explains how this Cloud Asset API is used by the Service Graph Connector for GCP for obtaining Operating System data from VM's in your projects.
(i) Navigate to 1 of the projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the API & Services\Library Menu Option in the Left Hand Navigator menu to bring up the Welcome to API Library Dashboard.
(iii) Enter Cloud Asset API in the Search for APIs & Services Search Field
(iv) Click on the Cloud Asset API Tile that is displayed to bring up the Cloud Asset API Product details screen.
(v) Click on the Enable pushbutton on this Cloud Asset API Product details screen to Enable the Cloud Asset API for the current project.
(vi) Repeat (i) to (v) for all the other projects in your Organization that contain Resources that you want discovered by the Service Graph Connector for GCP.
6. Enabling Cloud Resource Manager API at the Project Level
In this step you will be enabling the Cloud Resource Manager API(cloudresourcemanager.googleapis.com) on all the projects that contain Resources that you want to be discovered by the Service Graph Connector for GCP. The Google Cloud Resource Manager Service Creates, reads, and updates metadata for Google Cloud Platform resource containers (Please refer to this Google Cloud Platform Cloud Resource Manager API Documentation page for more details)
(i) Navigate to 1 of the Projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the API & Services\Library Menu Option in the Left Hand Navigator menu to bring up the Welcome to API Library Dashboard.
(iii) Enter Cloud Resource Manager API in the Search for APIs & Services Search Field
(iv) Click on the Cloud Resource Manager API Tile that is displayed to bring up the Cloud Resource Manager API Product details screen.
(v) Click on the Enable pushbutton on this Cloud Resource Manager API Product details screen to Enable the Cloud Resource Manager API for the current project.
(vi) Repeat (i) to (v) for all the other projects in your Organization that contain Resources that you want discovered by the Service Graph Connector for GCP.
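Steps 5 and 6 can be combined and scripted per project. A minimal sketch; extend the project list to cover every project that contains Resources you want discovered:

```shell
# Sketch: enable the two Basic Discovery APIs on each target project.
for project in deepdiscovery; do
  gcloud services enable \
    cloudasset.googleapis.com \
    cloudresourcemanager.googleapis.com \
    --project="$project"
done
```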
*7. Enabling VM Manager at the Project Level
*This step can only be performed by your GCP Administrator
In this step you will be enabling VM Manager for all the projects that contain Resources that you want to be discovered by the Service Graph Connector for GCP. VM Manager is a Suite of Tools in Google Cloud Platform that GCP Administrators use for managing VM's in your Organization (Please refer to this Google Cloud Platform VM Manager for more details). One of the tools that is included in this Suite is OS Inventory Management which is used for collecting Operating System Data on VM's in a GCP Project via the VM's OS Config Agents.
The Service Graph Connector for GCP accesses this Operating System data for a VM by calling the Cloud Asset API (enabled in the 5. Enable Cloud Asset API step) and then creates the appropriate Operating System Server Record in the CMDB, a Linux Server record in our case for our linux-e2-vm VM.
Enabling VM Manager on a Project triggers the below actions:
- Enables the OS Config API on the Project - Required for Deep Discovery
- Activates OS Config Agents that are installed on the VM's in the Project
(i) Navigate to 1 of the Projects in your Organization that contains VM's that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the Compute Engine\VM Manager\Patch Menu Option in the Left Hand Navigator menu to bring up the Patch Screen shown below:
(iii) Click on the Enable Full VM Manager Functionality pushbutton to Enable VM Manager for your project. This triggers the below actions for your project:
- OS Config API is enabled on your Project
- The OS Config Agents installed on the VM's in your Project are activated
(iv) Repeat (i) to (iii) for the other Projects in your Organization that contain VM's you want discovered.
8. Verify that the OS Config Agent is installed on the VMs in your Projects
In this step you will be verifying that the OS Config Agent is installed on the VM's that you want discovered in your project. The OS Inventory Management VM Manager Suite Tool collects Operating System Data from the VM's in your project via the OS Config Agents installed and running on these VM's as mentioned in the previous 7. Enable VM Manager step.
(i) Navigate to 1 of the Projects in your Organization that contains VM's that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the Compute Engine\VM Instances to bring up the list of VM Instances in your Project
(iii) Click on 1 of the VM Instances in this list to bring up the VM Information Screen for that VM.
(iv) Navigate to the OS Info Tab on this VM Information Screen to display the Operating System data associated with the VM.
(v) Validate that the OS Config agent version field on this Screen is populated. The screen shot below shows how this field is populated for the linux-e2-vm Linux VM in our Deep Discovery Project:
(vi) Repeat steps (iii) to (v) for all the VM's in your Project. If you have a lot of VM's it should be enough to check this for a Sample Set.
(vii) Repeat steps (i) to (vi) for all the Projects in your Organization that contain VMs that you want discovered by the Service Graph Connector for GCP
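If you prefer to check agents in bulk rather than per VM, recent gcloud versions expose OS inventory data on the command line. A minimal sketch, assuming the us-central1-a zone as a placeholder; an instance appearing in this list confirms its OS Config Agent is installed and reporting:

```shell
# Sketch: list OS inventory data reported by the OS Config agents
# for all VMs in a zone of the project.
gcloud compute os-config inventories list \
  --zones=us-central1-a \
  --project=deepdiscovery
```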
Enabling Deep Discovery
The collection of Deep Discovery data from your GCP VM's is carried out by a GCP OS Config Patch Job that runs Shell Scripts against your GCP VM's to obtain Deep Discovery data which it then writes to a GCP Storage Bucket. The Deep Discovery data that the GCP OS Config Patch Job collects is listed below:
- Serial Number
- Manufacturer
- Model
- RAM
- CPU Name
- CPU Manufacturer
- CPU Type
- CPU Speed
- CPU Count
- CPU Core Count
- CPU Core Thread
- Running Processes
- TCP Connections
The requirements for enabling the collection of Deep Discovery data from your GCP VM's are listed below:
- OS Config Agent installed and running on the VMs - the OS Config and Cloud Storage API's mentioned lower down are called via the OS Config Agent
Note: The above 8. Verify OS Config Agent installed on the VM's in your Project step will have already verified that OS Config Agents are installed and running on your VMs.
- OS Config API enabled on the GCP Projects containing your GCP VM's - will be called by the OS Config Patch Jobs executed against these VM's
Note: This will have already been enabled as outlined in the above *7. Enable VM Manager step.
- Google Cloud SDK installed on the VMs - required for executing the Shell Script commands on the VM's
- OS Config Service Account (has Cloud OS Config Service Agent Role) - required for allowing your VM's OS Config Agents to authenticate with the OS Config API and Cloud Storage API Endpoints
- OS Config Service Account associated with your VM's with Cloud API access scopes set to Allow full access to all Cloud APIs - required for granting the OS Config Service Account permission for calling the Cloud Storage API Endpoints
- Storage Bucket - required for storing the Deep Discovery output data returned by the Shell Scripts executed on your VM's
- SnowOrgStorageReadWrite Role - required for allowing the OS Config Service Account access to the Storage Bucket
- Cloud Storage API enabled on the project that contains the Storage Bucket - will be called by the OS Config Patch Jobs executed against these VM's for writing to and reading from the Bucket
- Patch Job Executor Role added to the list of Permissions that your ServiceNow snow-sgc-sa Service Account has been granted for accessing your Organization - required for allowing the ServiceNow snow-sgc-sa Service Account to execute the GCP OS Config Patch Jobs against the VM's in your Organization
9. Verify that OS Config API is enabled on your project
In this step you will be verifying that the OS Config API(osconfig.googleapis.com) is enabled on the projects containing Resources that you want discovered. The Google OS Config API Service encapsulates OS management tools that are used for patch management, patch compliance, and configuration management on VM instances (Please refer to this Google Cloud Platform OS Config API Documentation page for more details).
(i) Navigate to 1 of the projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the API & Services\Enabled APIs & Services Menu Option in the Left Hand Navigator menu to bring up the API & Services Dashboard for your Project.
(iii) Search for OS Config API in the APIs & Services List displayed on this Dashboard
- The OS Config API should be displayed in the List showing Request Metadata associated with recent calls to the API
The screen shot below shows the OS Config API included in our DeepDiscovery project's APIs & Services List:
(iv) Repeat (i) to (iii) for all the other projects in your Organization that contain Resources that you want discovered by the Service Graph Connector for GCP.
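The same check can be done from the command line. A minimal sketch; an empty result means the API still needs to be enabled:

```shell
# Sketch: confirm the OS Config API is enabled on a project.
gcloud services list --enabled \
  --filter="name:osconfig.googleapis.com" \
  --project=deepdiscovery
```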
10. Verify that the Google Cloud SDK is installed on the VMs in your Project
In this step you will be verifying that the Google Cloud SDK is installed on the VM's that you want discovered in your project.
(i) Navigate to 1 of the Projects in your Organization that contains VM's that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the Compute Engine\VM Instances Left Hand Navigator Menu Option to bring up the list of VM Instances in your Project
Linux VMs
(iii) Navigate to 1 of the Linux VM Instances in this list and select the Open in Browser Window menu option from the VM's SSH Menu pulldown(displayed to the right of the VM)
(iv) Click on the Authorize pushbutton shown on the Authorize Dialog box that is displayed to bring up a SSH Shell Window for that VM
Note: If you can't access the SSH Shell Window through the Browser you may need to access the VM through SSH or some other means to run Shell Commands.
(v) Enter the gcloud --version command in the SSH Shell Window.
(vi) Validate that the Version information returned from this command includes a Google Cloud SDK version. The screen shot below shows the output of this command for our linux-e2-vm Linux VM:
Windows VMs
(vii) Navigate to 1 of the Windows VM Instances in this list and select the Download the RDP File option from the VM's RDP Menu pulldown(displayed to the right of the VM) to download this RDP File to your Desktop.
(viii) Initiate a Remote Desktop Connection to the VM by double clicking on the downloaded RDP File
(ix) Provide the necessary credentials for connecting to this VM. You may need to set a Windows Password for the VM in Google Cloud Platform (By selecting the Set Windows Password option from the VM's RDP Menu pulldown)
(x) Navigate to the Google Cloud Shell Icon shown on the Windows VM Desktop and open it
(xi) Enter the gcloud --version command in the Google Cloud Shell Shell Window
(xii) Validate that the Version information returned from this command includes a Google Cloud SDK version. The screen shot below shows the output of this command for 1 of our Windows VMs:
(xiii) Repeat steps (iii) to (xii) for all the VM's in your Project. If you have a lot of VM's it should be enough to check this for a Sample Set.
(xiv) Repeat steps (i) to (xiii) for all the Projects in your Organization that contain VMs that you want discovered by the Service Graph Connector for GCP.
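For Linux VMs, the per-VM check can be run without opening a browser SSH window. A minimal sketch, assuming the linux-e2-vm instance and a placeholder us-central1-a zone:

```shell
# Sketch: run the version check over SSH from your workstation.
# The output should include a "Google Cloud SDK x.y.z" line.
gcloud compute ssh linux-e2-vm \
  --zone=us-central1-a \
  --project=deepdiscovery \
  --command="gcloud --version"
```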
*11. Create an OS Config Service Account in your Project
*This step can only be performed by your GCP Administrator
In this step you will be creating an OS Config Service Account in your project and associating the Cloud OS Service Agent Role with it. The OS Config Service Account will be created in the DeepDiscovery project in our case and will be associated with our linux-e2-vm VM in a later step.
(i) Navigate to 1 of the projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the IAM & Admin\Service Accounts Menu option in the Left Hand Navigator Menu to bring up the Service Accounts for project "<Project Name>" Screen, in our case the Service Accounts for project "DeepDiscovery" Screen.
(iii) Click on the +Create Service Account Pushbutton at the top of this screen to bring up the below Create Service Account Screen:
Service account name Field: Populate with e.g. service-osconfig
Service account ID Field: Will be automatically populated with service-osconfig
Email address: Will be automatically populated with service-osconfig@<projectname>.iam.gserviceaccount.com, in our case service-osconfig@deepdiscovery.iam.gserviceaccount.com
(iv) Click on the Create and continue pushbutton to be brought to the below Permissions (optional) section of the Create Service Account screen:
(v) Click on the Select a role Field Pulldown in this section to bring up the below Filter by Role or permission Screen:
(vi) Search for the Cloud OS Config Service Agent Role in this Screen and select it to populate the Select a role field on the Create Service Account Screen (you are brought back to the Create Service Account Screen after the Role is selected).
(vii) Click on the Done pushbutton on the Create Service Account Screen to create the new service-osconfig Service Account
--> You will be brought back to the Service Accounts for project "<Project Name>" Screen where you should see your new service-osconfig Service Account included in the Service Account List on this Screen.
The screen shot below shows the list of Service Accounts in our DeepDiscovery project that include the snow-sgc-sa Service Account(created in step 2) and our newly created service-osconfig Service Account.
(viii) Repeat (i) to (vii) for all the other projects in your Organization that contain Resources that you want discovered by the Service Graph Connector for GCP.
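Steps (i) to (vii) can also be scripted. A minimal sketch, assuming roles/osconfig.serviceAgent is the role ID behind the Cloud OS Config Service Agent role shown in the console:

```shell
# Sketch: create the OS Config service account, then grant it the
# Cloud OS Config Service Agent role on the project.
gcloud iam service-accounts create service-osconfig \
  --project=deepdiscovery

gcloud projects add-iam-policy-binding deepdiscovery \
  --member="serviceAccount:service-osconfig@deepdiscovery.iam.gserviceaccount.com" \
  --role="roles/osconfig.serviceAgent"
```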
12. Changing Service Account and Cloud API access scopes setting for your VM's
In this step you will be
- Changing the Service Account associated with your already existing VM's from the default <Project ID>-compute@developer.gserviceaccount.com Service Account to the OS Config Service Account that you created in the previous *11. Create an OS Config Service Account in your Project step.
- Changing the Cloud API access scopes setting for this OS Config Service Account from the default Allow default access setting to the Allow full access to all Cloud APIs setting.
- This is required for granting the OS Config Service Account permission for calling the Cloud Storage API Endpoints which it needs to call in order to write VM output data to the Storage Bucket.
(i) Navigate to 1 of the projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the Compute Engine\VM Instances to bring up the list of VM Instances in your Project
(iii) Select 1 of the VM Instances in this list and click on the Stop Action Button at the top of the Screen to stop the VM (the VM needs to be stopped before you can change any of its settings)
(iv) After the VM is successfully stopped, click into it to bring up the VM Information Screen for that VM
(v) Click on the Edit Action Button shown at the top of this VM Information Screen to make the settings on this screen Editable
(vi) Navigate down to the Security and access\Identity and API access Section of the screen
(vii) Change the Active Selection in the Service Account Menu Pulldown from Compute Engine default service account to your OS Config Service Account
(viii) Change the Active Selection in the Access Scopes set of Radio Buttons from Allow default access to Allow full access to all Cloud APIs
(ix) Click on the Save Pushbutton on this Screen to save these Setting Changes.
(x) Click on the Start Action Button at the top of the Screen to Restart the VM with the new API and Identity Management Settings
The screen shot below shows these changed API and Identity Management Settings for our linux-e2-vm VM in our DeepDiscovery Project
(xi) Repeat steps (iii) to (x) for all the VM's in your Project
(xii) Repeat steps (i) to (xi) for all the Projects in your Organization that contain VMs that you want discovered by the Service Graph Connector for GCP
Note: When deploying new VMs, most deployment tools such as Terraform allow you to explicitly specify the Service Account and the Cloud API access scopes setting to use, i.e. the OS Config Service Account with Allow full access to all Cloud APIs.
Note: Changing the Service Account and Cloud API access scopes setting for your already existing VMs can also be achieved via a Bash Script containing the necessary Google Cloud commands, which could be executed via a Google Cloud Build Trigger (please refer to the Google Cloud Platform Create and manage build triggers Documentation page for more information on Cloud Build Triggers).
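The stop/change/start sequence from steps (iii) to (x) maps onto three gcloud commands. A minimal sketch for one VM, assuming the linux-e2-vm instance and a placeholder us-central1-a zone; the cloud-platform scope corresponds to the console's Allow full access to all Cloud APIs setting:

```shell
# Sketch: switch an existing VM to the OS Config service account
# with full Cloud API access scope. The VM must be stopped first.
gcloud compute instances stop linux-e2-vm \
  --zone=us-central1-a --project=deepdiscovery

gcloud compute instances set-service-account linux-e2-vm \
  --zone=us-central1-a --project=deepdiscovery \
  --service-account=service-osconfig@deepdiscovery.iam.gserviceaccount.com \
  --scopes=https://www.googleapis.com/auth/cloud-platform

gcloud compute instances start linux-e2-vm \
  --zone=us-central1-a --project=deepdiscovery
```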
13. Create a Storage Bucket in a Designated Project in your Organization
In this step you will be creating a Storage Bucket in a Designated Project in your Organization. The Bucket will contain a scripts Folder for the gcpPowershell.sh and gcpShell.sh Shell Scripts referenced in the A. Installing GCP Service Graph Connector on your ServiceNow Instance Section above, and a vm-outputs Folder for the output of running these Shell Scripts against your VMs.
(i) Navigate to the Project that you decide will contain your Storage Bucket, the DeepDiscovery project in our case.
(ii) Navigate to the Cloud Storage\Buckets Menu Option in the Left Hand Navigator menu to bring up the list of Storage Buckets in your project
(iii) Click on the +Create pushbutton displayed at the top of this list to bring up the Create a Bucket Screen
(iv) Enter a Unique Bucket Name in the field displayed in the Get Started section of this screen, e.g. deepdiscovery for our Storage Bucket
(v) Click on the Create pushbutton displayed at the bottom of this screen
(vi) Click Confirm on the Public access will be prevented Dialog that is displayed to create the new Storage Bucket
-> You will be brought to the Objects Tab in the Bucket Details screen of the newly created Bucket
(vii) Navigate to the Right hand Folder pane of this Objects Tab and click on the Create Folder Action displayed at the top of this pane
(viii) Enter a Scripts Folder name like e.g. scripts in the Create Folder Dialog box that is displayed and click on the Create Pushbutton on this Dialog box to create the new Scripts Folder
(ix) You should see the new Scripts Folder being displayed in the Left Hand Folder Browser of the Objects Tab
(x) Repeat steps (vii) to (ix) in order to create a VM Outputs Folder for containing the output of running the shell scripts against your VM's, vm-outputs in our case.
The screen shot below shows our deepdiscovery bucket with our scripts and vm-outputs folders being displayed in the Folder Browser of its Objects Tab
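The bucket creation above can also be sketched with gsutil (bucket and project names are this article's examples; note that Cloud Storage "folders" are just object-name prefixes, so they appear once an object is uploaded under them):

```shell
# Create the bucket in the designated project.
gsutil mb -p deepdiscovery gs://deepdiscovery

# Folders materialize when objects are written under their prefix, e.g. the
# scripts/ folder is created implicitly by the upload in the next step:
#   gsutil cp gcpShell.sh gs://deepdiscovery/scripts/gcpShell.sh
```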
14. Upload Scripts to Storage Bucket
In this step you will be modifying the BUCKET_NAME and FOLDER_PATH Parameters in the Script files that you downloaded in the previous A. Installing GCP Service Graph Connector on your ServiceNow Instance section as below:
- BUCKET_NAME - Name of Storage Bucket that you created in the above 13. Create a Storage Bucket in a Designated Project in your Organization step
- FOLDER_PATH - Name of VM Outputs Folder that you created in the same above step
You will then be uploading these modified scripts to your Storage Bucket, e.g. deepdiscovery in our case, and making note of the Generation Number associated with each of these uploaded script files (you will provide these as part of Guided Setup further down)
(i) Update the BUCKET_NAME and FOLDER_PATH Parameters in the gcpPowershell and gcpShell script files you downloaded earlier to contain your Storage Bucket Name and VM Outputs Folder Name respectively. The screen shot below shows the BUCKET_NAME and FOLDER_PATH Parameters being updated for our deepdiscovery Storage Bucket.
(ii) Navigate to the Scripts Folder in the Folder Browser of your Storage Bucket e.g. the scripts folder in our case
(iii) Select the Upload Files Menu option from the Upload Action Menu displayed at the top of the Right hand Folder Pane
(iv) Select the gcpPowershell.sh and gcpShell.sh Shell Script files that you updated in (i)
--> You should see these 2 files being displayed in the Right hand Folder pane.
The screen shot below shows these 2 files being displayed in our scripts Folder pane
(v) Open a Google Cloud Console Window
(vi) Run the following Google Cloud Command in order to obtain the Generation Number associated with your gcpShell.sh Script File:
gsutil stat gs://<Bucket Name>/scripts/gcpShell.sh
The below screen shot shows the output of running the gsutil stat gs://deepdiscovery/scripts/gcpShell.sh command for the gcpShell Script file in our deepdiscovery Bucket
(vii) Make note of the gcpShell.sh Generation Number produced in the Command output for use in Guided Setup further down.
(viii) Run the following Google Cloud Command in order to obtain the Generation Number associated with your gcpPowershell.sh Script File:
gsutil stat gs://<Bucket Name>/scripts/gcpPowershell.sh
(ix) Make note of the gcpPowershell.sh Generation Number produced in the Command output for use in Guided Setup further down.
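If you prefer to capture the Generation Numbers from steps (vi) to (ix) programmatically instead of copying them from the console output, a small helper can parse them out of the gsutil stat output (a sketch; it assumes the value appears on its own "Generation:" line, as shown in gsutil's output):

```shell
#!/bin/sh
# Print the Generation value from `gsutil stat` output, e.g.
#   gsutil stat gs://deepdiscovery/scripts/gcpShell.sh | extract_generation
extract_generation() {
  # Match the "Generation:" line (but not "Metageneration:") and print its value.
  awk '/^[[:space:]]*Generation:/ {print $2}'
}
```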
*15. Create SnowOrgStorageReadWrite Role at the Organization Level
*This step can only be performed by your GCP Administrator.
In this step you will be creating a SnowOrgStorageReadWrite Role that will be used for granting your ServiceNow and OS Config Service Accounts access to your Storage Bucket
(i) Navigate to your Organization, Deepdiscoveryvms.com in our case.
(ii) Navigate to the IAM & Admin\Roles Menu option in the Left Hand Navigator Menu to bring up the Roles for "<Organization Name>" organization Screen, in our case the Roles for "deepdiscoveryvms.com" organization Screen.
(iii) Click on the +Create Role Pushbutton at the top of the screen to bring up the below Create Role Screen:
(iv) Populate the fields on this screen as below:
Title Field: Populate with SnowOrgStorageReadWrite
ID Field: Populate with SnowOrgStorageReadWrite
(v) Click on the +Add permissions Pushbutton on this screen to bring up the below Add Permissions screen with the 1st 10 of 12378 OOTB permissions displayed:
(vi) Select the following Permissions from this List:
- storage.objects.list
- storage.objects.get
- storage.objects.create
- storage.objects.delete
(vii) Click on the Add pushbutton to add these Permissions to the SnowOrgStorageReadWrite Role before being brought back to the Create Role Screen
--> You should see these newly added Permissions displayed in the assigned permissions List on the Create Role Screen
(viii) Click on the Create Pushbutton on the Create Role Screen to create the new SnowOrgStorageReadWrite Role in your Organization
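The equivalent gcloud command for this step would look like the following sketch (the organization ID 123456789012 is a placeholder for your own):

```shell
# Create the custom read/write storage role at the organization level.
gcloud iam roles create SnowOrgStorageReadWrite \
  --organization=123456789012 \
  --title="SnowOrgStorageReadWrite" \
  --permissions=storage.objects.list,storage.objects.get,storage.objects.create,storage.objects.delete
```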
16. Grant the OS Config Service Account access to the Storage Bucket via the SnowOrgStorageReadWrite Role
In this step you will be granting the OS Config Service Account access to the Storage Bucket via the SnowOrgStorageReadWrite Role. In our case the service-osconfig Service Account will be granted access to our deepdiscovery Storage Bucket via the SnowOrgStorageReadWrite Role.
(i) Navigate to the Project that contains your Storage Bucket, the DeepDiscovery project in our case.
(ii) Navigate to the Cloud Storage\Buckets Menu Option in the Left Hand Navigator menu to bring up the list of Storage Buckets in your project
(iii) Click on the Storage Bucket that you created in the above 13. Create a Storage Bucket in a Designated Project in your Organization step to bring up it's Bucket Details Screen
(iv) Navigate to the Permissions Tab of this Bucket Details Screen
(v) Scroll down to the View by Principals List on this Permissions screen and click on the +Grant access Action at the top of this list to bring up the Grant access to "<Bucket Name>" screen, the Grant access to "deepdiscovery" screen in our case. The screen shot below shows the Grant access to "deepdiscovery" screen for our deepdiscovery bucket:
New principals Field: Populate with the service-osconfig Service Account that you created in the previous *11. Create an OS Config Service Account in your Project step.
Select a role Field: Populate with the SnowOrgStorageReadWrite Role that you created in the above *15. Create SnowOrgStorageReadWrite Role at the Organization Level step.
(vi) Click on the Save pushbutton to grant the service-osconfig Service Account access to your Bucket via the SnowOrgStorageReadWrite Role.
--> You will be brought back to the Permissions Tab on the <Bucket Name> Bucket Details Screen where you should see your service-osconfig Service Account included in the View by Principals List on this Permissions Tab with the SnowOrgStorageReadWrite Role association.
The screen shot below shows the View by Principals List in the Permissions Tab of our deepdiscovery Bucket after the service-osconfig Service Account has been granted access to the Bucket via the SnowOrgStorageReadWrite Role.
Note: If the Project that you created your Storage Bucket in is different from the Project that you created your ServiceNow snow-sgc-sa Service Account in, you will also need to grant the ServiceNow snow-sgc-sa Service Account access to your Storage Bucket via the SnowOrgStorageReadWrite Role. In other words, repeat (iii) to (vi) for your ServiceNow snow-sgc-sa Service Account.
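As a CLI sketch of this grant (the service account email and organization ID are placeholders based on this article's example; custom organization-level roles are referenced by their full organizations/<ORG_ID>/roles/<ROLE> path):

```shell
# Grant the OS Config Service Account the custom role on the bucket.
gsutil iam ch \
  "serviceAccount:service-osconfig@deepdiscovery.iam.gserviceaccount.com:organizations/123456789012/roles/SnowOrgStorageReadWrite" \
  gs://deepdiscovery
```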
17. Enabling Cloud Storage API at the Project Level
In this step you will be enabling the Cloud Storage API (storage.googleapis.com) on all the projects that contain Resources that you want to be discovered by the Service Graph Connector for GCP. The Google Cloud Storage API is a JSON-backed interface for accessing and manipulating Cloud Storage projects (Please refer to the Google Cloud Platform Cloud Storage JSON API overview Documentation page for more details)
(i) Navigate to 1 of the Projects in your Organization that contains Resources that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(ii) Navigate to the API & Services\Library Menu Option in the Left Hand Navigator menu to bring up the Welcome to API Library Dashboard.
(iii) Enter Cloud Storage API in the Search for APIs & Services Search Field
(iv) Click on the Cloud Storage API Tile that is displayed to bring up the Cloud Storage API Product details screen.
(v) Click on the Enable pushbutton on this Cloud Storage API Product details screen to Enable the Cloud Storage API for the current project.
(vi) Repeat (i) to (v) for all the other projects in your Organization that contain Resources that you want discovered by the Service Graph Connector for GCP.
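Rather than repeating steps (i) to (v) in the console for each project, the API can be enabled across projects with a short gcloud loop (a sketch; filter the project list down to the projects you actually want discovered):

```shell
# Enable the Cloud Storage API in every project visible to your account.
for project in $(gcloud projects list --format="value(projectId)"); do
  gcloud services enable storage.googleapis.com --project="$project"
done
```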
*18. Add Patch Job Executor Role to the list of Permissions the ServiceNow snow-sgc-sa Service Account has been granted for accessing your Organization
*This step can only be performed by your GCP Administrator.
In this step you will be adding the Patch Job Executor Role to the already existing list of Permissions (the SnowOrgRORole Role) that your ServiceNow snow-sgc-sa Service Account was granted in the previous *4. Grant ServiceNow snow-sgc-sa Service Account access to your Organization step. The ServiceNow snow-sgc-sa Service Account requires Patch Job Executor Role access to your Organization for executing the GCP OS Config Patch Jobs against the VM's in your Organization.
(i) Navigate back to your Organization, in our case Deepdiscoveryvms.com
(ii) Navigate to the IAM & Admin\IAM Menu option in the Left Hand Navigator Menu to bring up the IAM \Allow Permissions for Organization "<Organization Name>" Screen, in our case the IAM \Allow Permissions for Organization "Deepdiscoveryvms.com" Screen.
(iii) Navigate down to the already existing snow-sgc-sa Service Account in the View by principals List on this Screen.
(iv) Click on the Edit Principal Icon to the right of the snow-sgc-sa Service Account to bring up the Edit access to "<Organization>" Screen. The screen shot below shows the Edit access to "deepdiscoveryvms.com" Screen for our Deepdiscoveryvms.com organization:
Note: Notice the SnowOrgRORole that was granted to the snow-sgc-sa Service Account in the previous *4. Grant ServiceNow snow-sgc-sa Service Account access to your Organization Step.
(v) Click on the + Add another role Action button on this screen to bring up the Select a role Menu Pulldown.
(vi) Search for the Patch Job Executor Role in this Select a role Menu Filter and select it
-> The Patch Job Executor Role is added to the Edit access to "<Organization>" Screen. The Screen shot below shows this role being added to our Edit access to "deepdiscoveryvms.com" Screen:
(vii) Click on the Save Pushbutton on this Screen to add the Patch Job Executor Role to the already existing list of Permissions that the snow-sgc-sa Service Account has for accessing your Organization.
The screen shot below shows the IAM\Allow Permissions for Organization "deepdiscoveryvms.com" Screen for our Deepdiscoveryvms.com organization after the Patch Job Executor Role has been added to the already existing list of Permissions (SnowOrgRORole Role) that the snow-sgc-sa Service Account has for accessing the Organization.
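As a CLI sketch of this grant (the organization ID and the snow-sgc-sa email are placeholders based on this article's example; the Patch Job Executor role's identifier is roles/osconfig.patchJobExecutor):

```shell
# Add the Patch Job Executor role to snow-sgc-sa's existing organization grants.
gcloud organizations add-iam-policy-binding 123456789012 \
  --member="serviceAccount:snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com" \
  --role="roles/osconfig.patchJobExecutor"
```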
C. Analyzing your VMs in Google Cloud Platform(GCP)
The data associated with your Linux VM is provided by the Google Cloud Platform Compute Engine Module within the Google Cloud Platform Portal.
(i) Log into the GCP Portal using your GCP Account
(ii) Navigate to your Organization, Deepdiscoveryvms.com in our case
(iii) Navigate to 1 of the Projects in your Organization that contains VM's that you want discovered by the Service Graph Connector for GCP, the DeepDiscovery project in our case.
(iv) Navigate to the Compute Engine\VM Instances Left Hand Navigator Menu Option to bring up the list of VM Instances in your Project. The below screen shot shows the list of VM's in our DeepDiscovery Project. The list shows our single linux-e2-vm VM.
(v) Click on any of the Virtual Machines in this list to bring up the Virtual Machine Screen associated with that Virtual Machine. This Screen has the Tabs listed below:
- Details - Contains Basic information, Network interface, Storage Sections
- OS Info - Contains Operating System and Installed Packages Sections
Details
The screenshot below shows the Virtual Machine Screen associated with our linux-e2-vm VM with the Details Tab being displayed by default. The Basic Information section is the first section shown on this Details Tab, and contains basic details associated with the VM like Name, Instance ID, Location, Source Image etc.
Note: It also contains any Labels or Tags associated with the VM. For example the above screenshot shows a goog ops...:v2x86-template Label associated with our linux-e2-vm VM.
Network interfaces
(vi) Scroll down to the Network interfaces Section of the Details Tab to see the list of Network Interface Cards associated with your VM. The below screenshot shows the nic0 Network Interface card for our linux-e2-vm Virtual Machine along with its associated Network, Private IP Address and Public IP Address.
Storage
(vii) Scroll down to the Storage Section of the Details Tab to see the Disks associated with your VM. The below screenshot shows that our linux-e2-vm Virtual Machine was provisioned with a Disk named linux-e2-vm (same name as the VM).
OS Info
The OS Info Tab of the VM Screen has the sections listed below:
- Basic info
- Installed Packages
Basic info
(viii) Navigate to the OS Info Tab to bring up the OS Info Screen associated with the VM. The screen shot below shows the OS Info Tab with the Basic info section being shown for our linux-e2-vm VM which shows details like Operating system, OS Version and OS Config agent version.
Installed Packages
(ix) Scroll down to the Installed Packages section of the OS Info Screen to see the list of Installed Packages on your VM. The screen shot below shows the list of Installed Packages on our linux-e2-vm VM, 364 Installed Packages in total.
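The same OS Info data (operating system details and installed packages) can also be read from the CLI via the OS Config inventory, assuming VM Manager and the OS Config agent are set up as in the earlier steps (the zone is a placeholder):

```shell
# Basic OS details for the VM.
gcloud compute os-config inventories describe linux-e2-vm --zone=us-central1-a

# Full view, including the installed-packages inventory.
gcloud compute os-config inventories describe linux-e2-vm \
  --zone=us-central1-a --view=full
```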
D. Configuring GCP Service Graph Connector on your ServiceNow Instance
(i) Login to your ServiceNow Instance
(ii) Navigate to Setup under GCP in the Filter Menu
(iii) Go through the remaining Guided Setup steps as per the ServiceNow Documentation: Configure Service Graph Connector for GCP using the guided setup (The Setup OS Config Patch Jobs for extended discovery - Download script files sub section was covered in step (v) of the A. Installing GCP Service Graph Connector Section above).
Configure the Connection and Credentials
Your ServiceNow Instance will be authenticating against your Google Cloud Platform(GCP) Account using an OAuth Token. You will be providing GCP OAuth Credential Details in the Edit Default Connection sub section of this Configure the Connection and Credentials Guided Setup Step.
Create X.509 certificate
In this Configure the Connection and Credentials sub section you will be creating a ServiceNow X.509 Certificate Record that will store the GCP Java Key Store .P12 file (contains the Private Key associated with the snow-sgc-sa Service Account) that you created in the 3. *Create Key for the ServiceNow Service Account step of the above B. Configuring GCP for monitoring your GCP VMs Section.
(i) Click on the Configure Pushbutton to the right of the Create X.509 certificate sub section to bring up the below X.509 Certificate New record Form:
(ii) Populate the fields on this Form with the below values:
Name: Unique name for your X.509 Certificate record. We provided SG-GCP-509Certificate-deepdiscovery as our X.509 Certificate record name
Type: Java Key Store Menu option (already prepopulated)
Key store password: GCP Java Key Store Password that you made note of in the 3. *Create Key for the ServiceNow Service Account step of the above B. Configuring GCP for monitoring your GCP VMs Section.
(iii) Click on the Attachment icon displayed at the top of the Form and select the GCP Java Key Store .P12 File that you downloaded to your Hard Disk in the 3. *Create Key for the ServiceNow Service Account step of the above B. Configuring GCP for monitoring your GCP VMs Section.
(iv) Click on the Validate Stores/Certificates Related Link to validate your Java Key Store File
--> You should see a Valid key_store Information message like the one below that we got when validating our Java Key Store File
(v) Click on the Update Pushbutton on this Form to save your X.509 Certificate record, SG-GCP-509Certificate-deepdiscovery in our case
--> You will be brought back to the Configure the Connection and Credentials Guided Setup Step
Edit Default Connection
(i) Click on the Configure Pushbutton to the right of the Edit Default Connection sub section to bring up the SG-GCP Default Connection Tile in Workflow Studio. The below screenshot shows the SG-GCP Default Connection Tile screen that you should expect to be brought to in Workflow Studio.
Note: If clicking on the Configure pushbutton brings you to the Workflow Studio Homepage instead of bringing you directly to the SG-GCP Default Connection then navigate to the Integrations Tab and click View Details on the SG-GCP Default Connection (Parent Connection & Credential Alias) Connection Tile.
(ii) Click on Edit on the SG-GCP Default Connection Connection to bring up the below Dialog Box:
(iii) Populate the fields on the Dialog Box as below:
Service Account Email: snow-sgc-sa@<projectname>.iam.gserviceaccount.com email address associated with your snow-sgc-sa ServiceNow Service Account. This is the email address that you made note of in the above *2. Create ServiceNow snow-sgc-sa Service Account in a Designated Project step. In our case we had the snow-sgc-sa@deepdiscovery.iam.gserviceaccount.com email address.
Keystore: The X.509 Certificate record that you created in the previous Create X.509 certificate sub section, in our case the SG-GCP-509Certificate-deepdiscovery X.509 Certificate record.
Keystore Password: The password associated with the GCP Java Key Store .P12 file that you created and made note of in the above 3. *Create Key for the ServiceNow snow-sgc-sa Service Account step
Organization Id: Organization ID associated with your Organization (You can get this by navigating to your Organization in Google Cloud Platform, selecting the IAM & Admin\Organizations option from the Left Hand Navigator Menu and making note of the Organization ID on the Organization details Screen displayed for your Organization)
Discovery Scope: organizations, since your snow-sgc-sa ServiceNow Service Account has Read Access to all the Projects in your Organization
Note: For scenarios where the snow-sgc-sa ServiceNow Service Account has Read Access to only a subset of the Projects in the Organization instead of having Read Access to all Projects, "projects" is specified for this Discovery Scope field instead of "organizations" (Please refer to the 1.2 Multiple Service Accounts for the organization section of the Service Graph connector for GCP - Setup Instructions Knowledgebase Article for more details on this Scenario).
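The Organization Id for the field above can also be read from the command line:

```shell
# Lists the organizations your account can access; the ID column is the
# Organization Id value to supply in the connection dialog.
gcloud organizations list
```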
(iv) Click on the Save and Get OAuth pushbutton to save the SG-GCP Default Connection Connection details, generate the OAuth Token and go back to the original SG-GCP Default Connection screen. The following happens:
- The already existing SG-GCP Default Connection.Credential Credentials Record (associated with the Parent SG-GCP Default Connection Connection & Credential Alias Record) is updated with the generated OAuth Token.
- A SG-GCP Default Connection Service Graph Connection Record is updated with the above Field values. This SG-GCP Default Connection Service Graph Connection Record will be shown in the next Test the Connection sub section.
(v) Click on the View Connection Alias pushbutton on the SG-GCP Default Connection Screen to view the Parent SG-GCP Default Connection Connection & Credential Alias Record. The screen shot below shows the Parent SG-GCP Default Connection Connection & Credential Alias Record. Notice the SG-GCP Default Connection.Credential Credentials Record listed in the Connections Tab of this Parent SG-GCP Default Connection Connection & Credential Alias Record.
Note: Any Child Connection & Credential Aliases that may be created when you click on Add Connection from the SG-GCP Default Connection Screen for connecting to a different Organization in your Google Cloud Platform(GCP) Account will be associated with this Parent Connection & Credential Alias and shown in the Child Aliases Tab of this Record.
Test the connection
All Service Graph Connectors have at least 1 Service Graph Connection Record associated with them that encapsulates the Properties (if any), Data Sources and Scheduled Data Import Jobs associated with the Service Graph Connector Connection that is connecting to the Back End System URL in question. The Connection Record has a Test Connection Link that is used for Testing the connection from the Service Graph Connector to the Back End System URL that it is connecting to. These Service Graph Connector Connection Records are stored in the Service Graph Connections[sn_cmdb_int_util_service_graph_connection] Table.
(i) Return to Guided Setup and Click on Configure to the right of Test the Connection to bring up the SG-GCP Default Connection Connection Record shown in the below screen shot:
Notice the Discovery Scope, Service Account and Organization Id properties that you provided for the connection in (iii) of the previous Edit Default Connection sub section. The JWT Provider property encapsulates the X.509 Certificate and associated Java Key Store password that you provided in this previous sub section.
(ii) Click on the Test Connection Related link to test that the Connection successfully connects to the Google Cloud Platform(GCP) Backend.
If the Connection is successful you will see a Success Information message at the top of the SG-GCP Default Connection Service Graph Connections Screen and the Status Field associated with the Connection will change from Pending to Success as shown in the below screen shot.
Configure the Scheduled Imports
GCP Service Graph Connector Scheduled Import Jobs will be run at the interval you specify to ingest data from your Google Cloud Platform(GCP) Account. The CMDB database on your ServiceNow Instance will be populated with this ingested data. The GCP Service Graph Connector comes with 36 Out of the Box Data Sources and Scheduled Data Imports.
(i) Return to Guided Setup and Click on the Configure button to the right of Configure the Scheduled Imports to bring up the GCP Scheduled Import Jobs shown below.
They are shown in the Order that they run in (You need to Personalize your List Columns to include the Order column). Please refer to the ServiceNow Service Graph Connector for GCP Documentation Page for details on these Scheduled Import Jobs.
(ii) Mark the Parent SG-GCP-Organization job as Active by changing the Active Field for this job from false to true
(iii) The other jobs are set to run After Parent Runs, after the Parent SG-GCP-Organization job
(iv) The SG-GCP-Organization job is set to run Daily by Default. Change this to a different Schedule like Weekly, Monthly or Periodically if you wish.
Setup OS Config Patch Jobs for extended discovery
In this Setup OS Config Patch Jobs for extended discovery Guided Setup step you will be updating the Deep Discovery related Connection Properties associated with your SG-GCP Default Connection Connection Record.
Configure connection properties
(i) Return to Guided Setup and Click on Configure to the right of Configure connection properties to bring up the below SG-GCP Configuration Properties screen:
(ii) Populate the fields on this screen as below:
Connection: Populate with the SG-GCP Default Connection Alias Record that you updated in the previous Edit Default Connection subsection.
Name of the Cloud Storage Bucket: Name of the Storage Bucket that you created in the 13. Create a Storage Bucket in a Designated Project in your Organization step of the above B. Configuring GCP for monitoring your GCP VMs Section, deepdiscovery in our case for our deepdiscovery Bucket.
Path to the .sh file uploaded to the Storage Bucket: Path to the gcpShell.sh Script File that you uploaded to the Scripts Folder of your Storage Bucket in (iv) of the 14. Upload Scripts to Storage Bucket step of the above B. Configuring GCP for monitoring your GCP VMs Section, scripts/gcpShell.sh in our case for the gcpShell.sh Script file that we uploaded to the scripts Scripts Folder in our deepdiscovery Bucket.
Generation number of the .sh file uploaded to Cloud Storage Bucket: The GCP Generation Number for the Uploaded gcpShell.sh Script File that you made note of in (vi) of the 14. Upload Scripts to Storage Bucket step of the above B. Configuring GCP for monitoring your GCP VMs Section.
Path to the .ps1 file uploaded to the Storage Bucket: Path to the gcpPowershell.sh Script File that you uploaded to the Scripts Folder of your Storage Bucket in (iv) of the 14. Upload Scripts to Storage Bucket step of the above B. Configuring GCP for monitoring your GCP VMs Section, scripts/gcpPowershell.sh in our case for the gcpPowershell.sh Script file that we uploaded to the scripts Scripts Folder in our deepdiscovery Bucket.
Generation number of the .ps1 file uploaded to Cloud Storage Bucket: The GCP Generation Number for the Uploaded gcpPowershell.sh Script File that you made note of in (ix) of the 14. Upload Scripts to Storage Bucket step of the above B. Configuring GCP for monitoring your GCP VMs Section.
Name of the Cloud Storage Bucket provided in .sh and .ps1 files where the outputs of the commands have to be stored: Name of the Storage Bucket that you created in the 13. Create a Storage Bucket in a Designated Project in your Organization step of the above B. Configuring GCP for monitoring your GCP VMs Section, deepdiscovery in our case for our deepdiscovery Bucket (We are using the same Storage Bucket for the Uploaded Script Files and the Outputs from those Scripts)
Folder path provided in .sh and .ps1 files where the outputs are uploaded in the Storage Bucket: Path to the Outputs Folder that you created in your Storage Bucket in (x) of the 14. Upload Scripts to Storage Bucket step of the above B. Configuring GCP for monitoring your GCP VMs Section, vm-outputs in our case for the vm-outputs Outputs Folder that we created in our deepdiscovery Bucket.
(iii) Click on the Save Pushbutton to save the values of these fields to their respective Connection Properties in the SG-GCP Default Connection Connection Record.
The screen shot below shows these Connection Properties being updated with their respective Field values for our SG-GCP Default Connection Connection Record.
Note: For any Connection specific Connection Records that you create in the next Add Multiple Instances section, you will be selecting that Child Connection Alias record in the above Connection Field and updating the above Fields on this Dialog Box for that Connection specific Connection Record.
Add Multiple Instances
There is an Add Multiple Instances step in Guided Setup that is not Mandatory but is recommended even if you are only using One GCP Organization. It allows you to create a set of Data Sources and Scheduled Imports that are specific to your Customer Specific GCP Organization. This is recommended for the following reasons:
- It is good futureproofing for cases where you may need to connect to a 2nd GCP Organization sometime in the future. For example a GCP Organization in a different company that is acquired through corporate M&A activity or for a case when you want to connect to GCP Gov Cloud vs GCP Commercial.
- It prepares you for future upgrades, where the Customer specific Data Source and Scheduled Data Import Records in the sys_data_source Table will not be marked as Skipped Records for Review by the Upgrade. It will allow you to focus on Skipped Records due to intentional Customization as opposed to Execution of the Out of the Box Scheduled Imports.
Note: It is also the mechanism by which you would be implementing the 1.2 Multiple Service Accounts for the organization Scenario referenced in the Service Graph connector for GCP - Setup Instructions Knowledgebase Article where each separate Service Account would have access to its own subset of Projects in the Organization. A new Instance Connection Record with its associated Customer specific Data Sources and Scheduled Imports will need to be created for each separate Service Account.
Go through all the steps in this Add Multiple Instances Guided Setup Step to specify Customer specific Data Sources and Scheduled Imports. Pay particular attention to the below sub sections in this step:
Create X.509 certificate
(i) Click on the Configure pushbutton to the right of Create X.509 certificate to bring up the X.509 Certificate New record Form
(ii) Populate the Name field with a unique X.509 Certificate Name specific to the Organization that you are connecting to e.g. USAGOV-SG-GCP-509Certificate
(iii) Attach the GCP Java Key Store File associated with the Service Account that you created for the Organization that you are connecting to
(iv) Populate the Key store password field with the Password associated with the GCP Java Key Store File
(v) Click on the Validate Stores/Certificates Related Link to validate your Java Key Store File
--> You should see a Valid key_store Information message
(vi) Click on the Update Pushbutton on this Form to save your X.509 Certificate record, e.g. USAGOV-SG-GCP-509Certificate
--> You will be brought back to the Add Multiple Instances Guided Setup Step
Add New Connection
(i) Click on the Configure button to the right of the Add New Connection sub section to bring up the SG-GCP Default Connection Tile in Workflow Studio.
Note: If clicking on the Configure pushbutton brings you to the Workflow Studio Homepage instead of bringing you directly to the SG-GCP Default Connection Tile then navigate to the Integrations Tab and click View Details on the SG-GCP Default Connection (Parent Connection & Credential Alias) Connection Tile.
(ii) Click on the Add Connection Pushbutton on the SG-GCP Default Connection (Parent Connection & Credential Alias) Connection Tile to bring up the below Create Connection Dialog box:
Connection name: Enter a Name that will allow you to easily identity the Organization or Subset of Projects that you are connecting to, e.g. USAGOV for connecting to the USA Government Organization. This Name will be used as part of the naming convention for the newly created GCP Connection Specific Data Sources & Scheduled Import Jobs as per below.
| GCP Connection Specific Data Sources | Connection Name - Data Source Name |
| GCP Connection Specific Scheduled Import Jobs | Connection Name - Import Job Name |
Service Account Email, keystore, keystore password: Specify the Service Account Email, keystore and keystore password associated with the Organization being connected to.
Organization id: Specify the Organization ID associated with the Organization being connected to
Discovery Scope: Specify organizations to indicate that you are reading data from All the projects in the Organization. Again this would be projects if you are instead connecting to only a subset of projects in the Organization
(iii) Click on the Create and Get OAuth Token Pushbutton
- A new Child Connection & Credential Alias Record is created with the generated OAuth Token being saved to it.
This Child Connection & Credential Alias Record is associated with the Parent SG-GCP-Default Connection Connection & Credential Alias Record as shown in the below screen shot. We specified USAGOV for our example so USAGOV is shown as the Child Connection & Credential Alias below.
- A new Connection Name Service Graph Connection Record is created with the Field values from (ii) above. The screen shot below shows our USAGOV Service Graph Connection Record.
- A new set of GCP Connection specific Data Sources and Scheduled Imports are created that contain the Connection Name specified in the Create Connection Dialog Box. An example of GCP Connection specific Data Sources and Scheduled Imports that get created is shown below, where USAGOV was used to identify your GCP USAGOV Connection specific Scheduled Imports and Data Sources:
Test New Connection
Test the new Connection Name Service Graph Connection Record that was created in the previous step, e.g. the USAGOV Service Graph Connection Record
(i) Return to the Add Multiple Instances Guided Setup Step and Click on Configure to the right of Test New Connection to bring up the list of GCP Service Graph Connection Records.
(ii) Select the Service Graph Connection, e.g. USAGOV, and click on the Test Connection Related Link
--> You should see a Success Information message displayed at the top of the USAGOV Service Graph Connection Screen and the Status Field associated with the Connection should change from Pending to Success
Configure the Scheduled Imports
(i) Return to the Add Multiple Instances Guided Setup Step and Click on Configure to the right of Configure the Scheduled Imports to bring up all the GCP Scheduled Import Jobs including the newly created GCP Connection Specific Scheduled Import Jobs
(ii) Navigate to the GCP Connection specific SG-GCP Organization Scheduled Import Job, e.g. USAGOV - SG-GCP Organization, and mark it Active
(iii) The GCP Connection specific SG-GCP Organization Scheduled Import Job (Parent Scheduled Import Job) is set to run Daily by default. If you want to change this, specify the schedule at which you want this Job to run.
E. Run GCP Service Graph Connector Scheduled Data Imports on your ServiceNow Instance
Before running these Scheduled Data Imports, I would recommend enabling CMDB 360 by setting the glide.identification_engine.multisource_enabled system property to True in System Properties.
Doing this allows the following for CI's that are Created\Updated by the Scheduled Data Import Jobs:
1. For CI's that have Reconciliation Rules, you can see the Proposed Values from Lower Priority Discovery Sources that were Rejected.
2. For CI's that allow more than 1 Discovery Source to update them (i.e. no Reconciliation Rules, or Reconciliation Rules with the same Priority), you can identify the Source of an Attribute and see the Proposed Values for that Attribute from the other Discovery Sources.
Refer to the ServiceNow CMDB 360/Multisource CMDB Documentation page for more details.
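As a hedged sketch (the instance URL below is a placeholder, not a real instance), the current value of the CMDB 360 property can be inspected through the Table API against the sys_properties table before you change it:

```python
from urllib.parse import urlencode

INSTANCE = "https://example.service-now.com"  # placeholder instance URL

# Build (but do not send) a Table API query that returns the CMDB 360
# multisource property so you can confirm whether its value is "true".
params = urlencode({
    "sysparm_query": "name=glide.identification_engine.multisource_enabled",
    "sysparm_fields": "name,value",
})
url = f"{INSTANCE}/api/now/table/sys_properties?{params}"
print(url)
```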
(i) Navigate to Import Schedules under GCP in the Filter Menu. 36 OOTB Scheduled Data Imports should be listed, with all of them being marked Active as shown below. The Order Column shows the Order that the Import Jobs will run in (You need to Personalize your List Columns to include the Order column). Please refer to the ServiceNow Service Graph Connector for GCP Documentation Page for details on these Scheduled Import Jobs.
Note: It is recommended that you create a GCP Connection Specific Version of these Scheduled Data Imports as discussed in the Add Multiple Instances Guided Setup step of the above D. Configuring GCP Service Graph Connector on your ServiceNow Instance section. There will be 36 GCP Connection specific Scheduled Import Jobs created per Connection Specific Setup.
Open your SG-GCP Organization Parent Scheduled Import job record and click on the Execute button
(ii) Navigate to Concurrent Import Sets in the Filter Menu.
- Wait for your Active GCP Scheduled Data Import jobs to finish.
F. Analyze the CMDB Records created\updated by the GCP Service Graph Connector for your Linux VM in your ServiceNow Instance
There are 6 types of Records created by the GCP Service Graph Connector in the CMDB:
- CMDB CI[cmdb_ci] Records
- Software Installation[cmdb_sam_sw_install] Records - If Software Asset Management(SAM) enabled
- Software Instance[cmdb_software_instance] + Software Package[cmdb_ci_spkg] Records - If Software Asset Management(SAM) not enabled
- Running Process[cmdb_running_process] Records
- TCP Connection[cmdb_tcp] Records
- Key Value[cmdb_key_value] Records
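The record types above map to CMDB tables as follows; a simple reference sketch built from the list above:

```python
# Tables populated by the GCP Service Graph Connector, per the list above.
GCP_SGC_TABLES = {
    "CMDB CI": "cmdb_ci",
    "Software Installation (SAM enabled)": "cmdb_sam_sw_install",
    "Software Instance (SAM not enabled)": "cmdb_software_instance",
    "Software Package (SAM not enabled)": "cmdb_ci_spkg",
    "Running Process": "cmdb_running_process",
    "TCP Connection": "cmdb_tcp",
    "Key Value": "cmdb_key_value",
}

for record_type, table in GCP_SGC_TABLES.items():
    print(f"{record_type}: {table}")
```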
CMDB CI Records
(i) Navigate to cmdb_ci.list in the Filter Menu
(ii) Group by Discovery Source
(iii) Navigate to the SG-GCP Discovery Source and double click on its Discovery source:SG-GCP(n) link where n represents the Number of CMDB records(entities) Created\Updated by the SG-GCP Service Graph Connector.
(iv) Group By Class
A List of CMDB CI Records Created\Updated by the SG-GCP Service Graph Connector will be displayed grouped by Class. The screen shot below shows the Class Records displayed in this Class List for the data that was ingested by the SG-GCP Service Graph Connector for our GCP Organization that includes CI's associated with our linux-e2-vm Virtual Machine.
Note: For ServiceNow Instances that do not have Software Asset Management(SAM) enabled, you would see an extra Software Class listed for representing all the Software Package[cmdb_ci_spkg] Records that would have been populated by the SG-GCP Software Inventory Scheduled Data Import Job (referenced in the above D. Configuring GCP Service Graph Connector on your ServiceNow Instance section).
- The linux-e2-vm Linux Virtual Machine is listed as a Linux Server CI along with its associated linux-e2-vm Virtual Machine CI.
- The Network Interface Card and Storage Mapping CI's associated with the linux-e2-vm Linux Server are shown. These were populated from the Network Interfaces and Storage Entities associated with the linux-e2-vm Virtual Machine Entity in Google Cloud Platform(GCP) described in the above C. Analyze your VMs in GCP section.
- The IP Address CI's (both Public and Private) and Availability Zone CI associated with the linux-e2-vm Linux Server are shown. These were populated from the linux-e2-vm Linux Virtual Machine Entity in Google Cloud Platform(GCP) that was shown in the VM Instances screen shot in the above C. Analyze your VMs in GCP section.
- The DeepDiscovery Cloud Service Account CI associated with the GCP DeepDiscovery Project, that our ServiceNow User snow-sgc-sa Service Account was provisioned in, is shown.
Note: The SG-GCP Default Connection.Credential Credentials Record (associated with the Parent SG-GCP Default Connection Connection & Credential Alias Record) is stored in this Cloud Service Account CI Record.
Notice that the Name given to the Cloud Service Account CI is the same as the name of the GCP DeepDiscovery Project that our ServiceNow User snow-sgc-sa Service Account was provisioned in.
- The deepdiscoveryvms.com Cloud Organization CI and its associated DeepDiscovery Google Organization Project CI are shown.
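The list produced by steps (i)-(iv) above can also be pulled through the Table API. A hedged sketch that only builds the request URL (the instance URL is a placeholder; the field and value follow the Discovery Source filter described in the steps above):

```python
from urllib.parse import urlencode

INSTANCE = "https://example.service-now.com"  # placeholder instance URL

# Equivalent of filtering cmdb_ci by Discovery Source = SG-GCP and
# returning the class for grouping, as in steps (i)-(iv) above.
params = urlencode({
    "sysparm_query": "discovery_source=SG-GCP",
    "sysparm_fields": "sys_class_name,name",
})
url = f"{INSTANCE}/api/now/table/cmdb_ci?{params}"
print(url)
```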
linux-e2-vm Linux Server
The screen shot below shows all the Linux Server Summary fields that were populated by the GCP Service Graph Connector for the linux-e2-vm Linux Server CI. The following fields are circled:
- The Serial Number, Model ID, Manufacturer, RAM and CPU fields that were populated by the Deep Discovery Feature.
- The Operating System and OS Version Fields that were populated from the OS Info Tab of the linux-e2-vm Virtual Machine in Google Cloud Platform(GCP) described in the above C. Analyze your VMs in GCP section.
Related Tabs
The screen shot below shows the Running Processes(86) and TCP Connections(17) that were populated by the Deep Discovery Feature. The Software Installed Records(365) were populated by the SG-GCP Software Inventory Scheduled Import Job.
Related Items
The screen shot below shows the Network Interface with associated Cloud Subnet, Storage Mapping and Availability Zone for the linux-e2-vm Linux Virtual Machine that came from the Network Interface, GCP Disk and Availability Zone associated with the linux-e2-vm Virtual Machine shown in the above C. Analyze your VMs in GCP Section.
Software Installation Records
Software Asset Management(SAM) enabled
For ServiceNow Instances that have Software Asset Management(SAM) enabled, the Software Install Records associated with Created\Updated Computer CI's will be ingested into the Software Installations[cmdb_sam_sw_install] Table.
Software Asset Management(SAM) not enabled
For ServiceNow Instances that do not have Software Asset Management(SAM) enabled, the Software Install Records associated with Created\Updated Computer CI's will be ingested into the Software Instances[cmdb_software_instance] Table along with associated Software Package Records being ingested into the Software Packages[cmdb_ci_spkg] Table.
Note: All that is needed to enable Software Asset Management is the free SAM Foundation plugin. Installing this plugin triggers the Software Install Records being populated into the Software Installations[cmdb_sam_sw_install] Table. Installing this free SAM Foundation plugin is a recommended Best Practice for customers that believe they may use Software Asset Management Professional (SAM Pro) in the future. These customers would then not have to migrate Software Records from the Software Instances[cmdb_software_instance] Table to the Software Installations[cmdb_sam_sw_install] Table at the point in time when they install Software Asset Management Professional (SAM Pro).
The Use Case outlined in this Article is for a ServiceNow Instance with Software Asset Management(SAM) enabled. To see the Software Install Records associated with Computer CI's that were Created\Updated by the SG-GCP Service Graph Connector, the steps below direct you to navigate to the Software Installations[cmdb_sam_sw_install] Table:
(i) Navigate to cmdb_sam_sw_install.list in the Filter Menu
(ii) Group by Discovery Source
(iii) Navigate to the SG-GCP Discovery Source and double click on its Discovery source:SG-GCP (n) link where n represents the Number of Software Install Records Created\Updated by the SG-GCP Service Graph Connector.
(iv) A List of Software Install Records Created\Updated by the SG-GCP Service Graph Connector will be displayed. The screen shot below shows the Software Install Records displayed in this List for the linux-e2-vm Linux Virtual Machine in our GCP Organization.
Notice that 365 Records are shown at the bottom of the screen. This matches the Software Installations (365) Count in the Software Installations Tab shown above for the linux-e2-vm Linux Server. It also matches the GCP Installed Packages (364) Count shown in the GCP Installed Packages screen shot for the linux-e2-vm Linux Virtual Machine (in the above C. Analyze your VMs in GCP Section) + the 1 Debian GNU/Linux 12 (Bookworm) Operating System Record that is Created by the GCP Service Graph Connector for representing the Operating System installed on the VM.
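The count reconciliation above is simple arithmetic, using the values from the screen shots referenced above:

```python
# 364 GCP Installed Packages + 1 Operating System record created by the
# GCP Service Graph Connector = 365 Software Install records.
gcp_installed_packages = 364
operating_system_records = 1
software_install_records = gcp_installed_packages + operating_system_records
print(software_install_records)
```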
Running Process Records
(i) Navigate to cmdb_running_process.list in the Filter Menu
(ii) Search for your Linux Server, e.g. linux-e2-vm, in the Computer column
(iii) A List of Running Process Records for the Server being searched, e.g. linux-e2-vm, will be displayed.
The screen shot below shows the Running Process Records displayed in this List for our linux-e2-vm Linux Server.
Notice that 86 Records are shown at the bottom of the screen. This matches the Running Processes (86) count in the Running Processes Tab shown above for the linux-e2-vm Linux Server.
TCP Connection Records
(i) Navigate to cmdb_tcp.list in the Filter Menu
(ii) Search for your Linux Server, e.g. linux-e2-vm, in the Computer column
(iii) A List of TCP Connection Records for the Server being searched, e.g. linux-e2-vm, will be displayed.
The screen shot below shows the TCP Connection Records displayed in this List for our linux-e2-vm Linux Server.
Notice that 17 Records are shown at the bottom of the screen. This matches the TCP Connections (17) count in the TCP Connections Tab shown above for the linux-e2-vm Linux Server.
Key Value Records
The GCP Service Graph Connector creates the following types of Key Value Records in the Key Values[cmdb_key_value] Table for every Target CI it creates in the CMDB:
- Connection Type Keys - allow you to determine what Service Graph Connector Connection a Target CI is associated with
- Tag Type Keys - For Labels or Tags associated with a Virtual Machine in Google Cloud Platform(GCP)
Connection Type Keys
The GCP Service Graph Connector creates the following Connection Type Key Value Records in the Key Values[cmdb_key_value] Table for every Target CI it creates in the CMDB:
- serviceAccount
- discoveryScope
- connectionId
- organizationId
These Key Value Records hold Connection Property values associated with each respective Connection Property in the relevant Service Graph Connector Connection Record e.g. The OOTB SG-GCP-Default Connection Connection Record.
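A minimal sketch of the four Connection Type keys listed above (the helper function is hypothetical, for illustration only):

```python
# The four Connection Type keys written for every Target CI, per the
# list above; their values mirror the corresponding Connection
# Properties on the Service Graph Connection record.
CONNECTION_TYPE_KEYS = {"serviceAccount", "discoveryScope", "connectionId", "organizationId"}

def is_connection_type_key(key: str) -> bool:
    """Hypothetical helper: True if a Key Value record's key is one of
    the Connection Type keys."""
    return key in CONNECTION_TYPE_KEYS

print(is_connection_type_key("organizationId"))         # a Connection Type key
print(is_connection_type_key("goog-ops-agent-policy"))  # not a Connection Type key
```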
Tag Type Keys
The GCP Service Graph Connector creates a Tag Type Key Value Record in the Key Value[cmdb_key_value] Table in the CMDB for every Label or Tag associated with a Virtual Machine in Google Cloud Platform(GCP). For example, the linux-e2-vm VM shown in the linux-e2-vm screenshot in the Details sub section of the above C. Analyze your VMs in GCP Section contains a goog-ops-agent-policy:v2-x86-template-1-4-0 Label (visible in the screenshot). A goog-ops-agent-policy:v2-x86-template-1-4-0 Key Value Record is therefore created for our linux-e2-vm Server CI.
(i) Navigate to cmdb_key_value.list in the Filter Menu
(ii) Search for your Linux Server, e.g. linux-e2-vm, in the Computer column
(iii) A List of Key Value Records for the Server being searched, e.g. linux-e2-vm, will be displayed.
The below screen shot shows all the Key Value Records associated with the linux-e2-vm Server CI that were created for the corresponding linux-e2-vm Virtual Machine in Google Cloud Platform(GCP).
Notice the single goog-ops-agent-policy:v2-x86-template-1-4-0 Tag Type Key Value Record and the 4 Connection Type Key Value Records.
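The Label-to-Key-Value mapping described above can be sketched as a split on the first colon of the GCP Label (a hedged illustration; the helper is hypothetical):

```python
def label_to_key_value(label: str) -> tuple[str, str]:
    """Hypothetical helper: split a GCP Label of the form key:value
    into the key and value stored on the Tag Type Key Value record."""
    key, _, value = label.partition(":")
    return key, value

print(label_to_key_value("goog-ops-agent-policy:v2-x86-template-1-4-0"))
```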
G. When to use GCP Service Graph Connector vs Cloud Discovery for monitoring your GCP VMs
ITOM Visibility (Horizontal Discovery + Cloud Discovery) is the recommended solution for populating the CMDB with Cloud based Resources like GCP Virtual Machines etc. ITOM Visibility (Horizontal Discovery + Cloud Discovery) requires a MID Server with connectivity to the Hosts (including Cloud based Resources) being targeted for discovery.
When to use the GCP Service Graph Connector for Discovering your GCP Resources
You should use the GCP Service Graph Connector for Discovering your GCP Resources for the below Use Cases:
- You don't want to have a MID Server as a requirement for your overall Solution Architecture
- You don't want to (or can't) use ITOM Horizontal Discovery in your overall Solution Architecture.
- You don't want to (or can't) use Agent Client Collector for Visibility in your overall Solution Architecture.
- You want the below data to be populated in the Target CI's that get created:
- Installed Software running on GCP Virtual Machines
- Running Process & TCP Connections on GCP Virtual Machines
Cloud Discovery provides the ability to get High Level GCP Virtual Machine Metadata only. For cases where Horizontal Discovery and Agent Client Collector for Visibility are not options, but you need to get Installed Software, Running Process or TCP Connection data from your GCP Virtual Machines, the GCP Service Graph Connector is recommended.
When to use ITOM Visibility (Horizontal Discovery + Cloud Discovery) for Discovering your GCP Resources
You should use ITOM Visibility (Horizontal Discovery + Cloud Discovery) for discovering your GCP Resources when you want the richest set of data in the CMDB and the most capabilities, and you are able to obtain the necessary credentials and network connectivity.
