
##########
# 10/11/2021 #
##########
In this article, I would like to cover how I used the ODBC driver to synchronize ServiceNow data tables to SQL Server and delta load new/updated records to keep the SQL Server tables up to date on a 15 minute cycle.
This developed as needed over a five-year period, from the first basic standard data dumps to the bi-directional auto-recovery system it eventually became. Several major redevelopments took place, with a year or more between them. I thought it would be a waste to not share the techniques I found and used.
I was asked to use Qlikview to synchronize the ServiceNow data tables, as my whole team knew how to use and maintain dashboards written and scheduled with Qlikview. However, the basic principles can be used in other languages, for example an SSIS Script Component written in C#.
I will try to stay away from the Qlikview side as much as possible and focus more on the DataSource(ServiceNow ODBC) and DataDestination(SQL Server) sides. However, there is some MiddleWare processing to be done to the data before it is passed to the DataDestination(SQL Server) to make sure it is accepted correctly. This will be shown in Qlikview, but should be easily translated to other languages.
For example: quotes in data fields can cause problems when data is passed through to the DataDestination(SQL Server) without any MiddleWare(Qlikview) alteration, so the MiddleWare(Qlikview) needs to deal with those quotes before passing the data on to the DataDestination(SQL Server).
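One common workaround, shown here as a minimal sketch rather than my exact production code, is to double up any single quotes while the data is still in the MiddleWare(Qlikview), so each value can later be wrapped in quotes inside a T-SQL statement for the DataDestination(SQL Server) without breaking it. The sketch assumes the incident table from the extract steps below has already been loaded into memory; the incident_escaped table name and the escaped field name are just for illustration.
//Double up single quotes (chr(39)) so values survive being quoted in T-SQL later.
incident_escaped:
LOAD
    [sys_id],
    Replace([short_description], chr(39), chr(39) & chr(39)) AS [short_description_escaped]
RESIDENT incident;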
I will start with a basic, simple data extract system for people who just want something simple that works, then move on to the more complex bi-directional systems later for those interested in auto recovery and lower maintenance, ideal for multi-table or multi-instance configurations.
######################################
# 11/11/2021 # Basic simple data extract steps for a table. #
######################################
1) FromDate
//1) FromDate
//Configure the date you want to extract the data from.
//This should be in "YYYY-MM-DD HH:MM:SS" format and assigned to FromDate.
LET FromDate = '2020-02-28 09:00:00';
2) TableName
//2) TableName
//Set the table name into a string variable.
LET TableName = 'incident';
3) FieldNameList
//3) FieldNameList
//Create a comma delimited list of field names to collect.
//Make sure to wrap each field name in square brackets.
//Make sure to include the [sys_id] and [sys_created_on] fields.
LET FieldNameList = '[sys_id], [sys_created_on], [sys_updated_on], [number], [short_description]';
4) Create the ServiceNow SQL query for collecting data over the ServiceNow ODBC
//4) Create the ServiceNow SQL query for collecting data over the ServiceNow ODBC
//Create the SQL statement as a string, ready to be passed and executed on the ServiceNow system.
LET SnowSqlQuery = '
SELECT $(FieldNameList)
FROM [$(TableName)]
WHERE [sys_created_on] >= ''$(FromDate)''
';
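For clarity, once the dollar-sign variables above have been expanded, the statement sent over the ServiceNow ODBC connection should look roughly like this:
SELECT [sys_id], [sys_created_on], [sys_updated_on], [number], [short_description]
FROM [incident]
WHERE [sys_created_on] >= '2020-02-28 09:00:00'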
5) Connect to ServiceNow using ODBC connection.
//5) Connect to ServiceNow using ODBC connection.
//Connect to the ServiceNow_ODBC_Instance data source you have configured in your ODBC settings.
ODBC CONNECT32 TO ServiceNow_ODBC_Instance (XUserId is EXAMPLE_XUSERID, XPassword is EXAMPLE_XPASSWORD);
6) Execute SnowSqlQuery and collect data into memory
//6) Execute SnowSqlQuery and collect data into memory
QUALIFY *; //Prefix field names with the table name (e.g. incident.sys_id) to avoid clashes when loading several tables in one script.
incident:
LOAD *;
SQL $(SnowSqlQuery);
7) Export to QVD file
//7) Export to QVD file
//Export to local path QVD file using TableName.
STORE incident into [c:\$(TableName).QVD];
Please Note: This is from memory. If you find any problems, please let me know and I will do my best to correct it.
Notes: This is a very basic data extract from the incident table and it worked well for a short time. I extracted data this way from multiple tables in an overnight process for the first year or two, before finding the data was getting so big that a nightly extract like this wasn't good enough. So this is a quick, simple extraction, but it can quickly become very large and take a very long time to run. I decided to re-code it so the data would be delta loaded and refreshed more often, to give us near live data throughout the day.
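To give a flavour of what that re-code looked like, here is a minimal sketch of a standard Qlikview incremental load (a simplified illustration only, not the full production version covered later). It assumes the previous extract already exists as c:\incident.QVD from step 7, that the ODBC connection from step 5 is still open, and that field names were stored unqualified. LastUpdated is hard-coded here purely for illustration; in practice it would come from the previous run.
//Only collect records created or updated since the last run.
LET LastUpdated = '2021-11-10 09:00:00';
incident:
LOAD *;
SQL SELECT [sys_id], [sys_created_on], [sys_updated_on], [number], [short_description]
FROM [incident]
WHERE [sys_updated_on] >= '$(LastUpdated)';
//Add back the older records from the previous QVD that were not re-extracted.
CONCATENATE (incident)
LOAD * FROM [c:\incident.QVD] (qvd)
WHERE NOT Exists([sys_id]);
//Store the merged table so the next cycle starts from this point.
STORE incident into [c:\incident.QVD];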
Next: I will cover a much larger and more complex example of doing this using a delta load, SQL Server and bi-directional synchronization of data in smaller chunks, allowing tables to update quickly. This will be a very big section, so I may split it into steps, then visit each step in greater detail with code and notes.