Data warehouse pushing to sys_user and hr_profile in ServiceNow

pokai
Tera Contributor

I've been tasked with documenting as-is processes in ServiceNow and have covered most of the ITSM modules. Experience-wise, I am familiar with the MID Server and LDAP integration with sys_user, but have yet to touch hr_profile and the HR modules.

Our current client's ServiceNow instance is Express, and their data warehouse (DWH) pushes to both the sys_user and hr_profile tables.
I can see business rules against the hr_profile table that appear to process the incoming data, but the code is well beyond my skill level as an admin. I can also see a transform map that translates data to the target sys_user table. Unfortunately, whoever built this push to ServiceNow has left the company, and the DWH architect has no idea what details are being pushed to ServiceNow beyond where the staging table is.

I can see the credentials used by the DWH in the sys_user table, and the "Created by" value on the records in sys_user.
Is there anything I can suggest or look into that would help us identify this process?

1 ACCEPTED SOLUTION

HIROSHI SATOH
Mega Sage

Here are a few steps you can take to help identify and document the process:

  1. Review Business Rules and Scripts:

    • Even if the business rules against the hr_profile table are complex, start by documenting them. Identify any key scripts or functions that seem to be responsible for data processing.
    • Look for any specific fields or conditions that the business rules are triggering on. This could give you insights into how data is being transformed or processed.
  2. Analyze Transform Maps:

    • Since you mentioned there is a transform map in place, thoroughly review it to understand how data from the staging table (or any other source table) is being mapped to sys_user and hr_profile.
    • Pay attention to field mappings, scripts in the transform map, and any conditions that are applied.
  3. Investigate Import Set Tables:

    • Check the import set tables and the staging tables used in the transform process. This can give you clues about the source data and how it's being imported.
    • Look for any scheduled jobs or automated imports that are linked to the transform maps.
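The first three steps above can be started from a single background script. Here is a rough sketch, assuming admin access and the standard table names (sys_script for business rules, sys_transform_map for transform maps, scheduled_import_set for scheduled data imports) — note that Express instances restrict custom scripting, so if Scripts - Background is not available, the same queries can be run as list filters instead:

```javascript
// Sketch for Scripts - Background; verify table/field names on your instance.

// 1. Business rules defined on hr_profile
var br = new GlideRecord('sys_script');
br.addQuery('collection', 'hr_profile');
br.query();
while (br.next()) {
    gs.print('Business rule: ' + br.getValue('name') +
             ' | when: ' + br.getValue('when') +
             ' | active: ' + br.getValue('active'));
}

// 2. Transform maps targeting sys_user or hr_profile
var tm = new GlideRecord('sys_transform_map');
tm.addQuery('target_table', 'IN', 'sys_user,hr_profile');
tm.query();
while (tm.next()) {
    gs.print('Transform map: ' + tm.getValue('name') +
             ' | source: ' + tm.getValue('source_table') +
             ' | target: ' + tm.getValue('target_table'));
}

// 3. Scheduled imports that may feed the staging table
var si = new GlideRecord('scheduled_import_set');
si.query();
while (si.next()) {
    gs.print('Scheduled import: ' + si.getValue('name'));
}
```

The source_table values from step 2 should point you back at the staging (import set) tables the DWH is loading.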

 

Other suggestions:

 

Your predecessor may have configured the integration to output logs during processing, so check the system logs.
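For example, import activity often shows up in the system log and the import log. A couple of list URLs to try — the filter here is an assumption, so adjust it to whatever your log entries actually contain:

```
/syslog_list.do?sysparm_query=sourceLIKEimport
/import_log_list.do
```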

 

Identify the update time and updater information (sys_updated_on / sys_updated_by) on the tables you know are synchronized. Correlating those values with the logs should help you understand the processing.
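A sketch of that check, where 'dwh.integration' is a placeholder for whichever account you saw in the "Created by" field:

```javascript
// List the most recent sys_user records touched by the integration account.
// 'dwh.integration' is a placeholder - substitute the real account's user_name.
var gr = new GlideRecord('sys_user');
gr.addQuery('sys_updated_by', 'dwh.integration');
gr.orderByDesc('sys_updated_on');
gr.setLimit(10);
gr.query();
while (gr.next()) {
    gs.print(gr.getValue('user_name') + ' updated at ' + gr.getValue('sys_updated_on'));
}
```

The timestamps on these records give you the import schedule, which you can then match against scheduled jobs and log entries.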

 

If you can't follow the script being executed, it may be a good idea to have ChatGPT or a similar tool analyze and explain it.

 

I hope this helps.

