As an administrator, you need to perform imports into a ServiceNow instance from time to time (most admins do it regularly). Data flows from an external data source (database, flat file, FTP) into a staging table in the ServiceNow database. The staging table should be configured with at least one transform map (a JavaScript mapping) to convert the data from the staging table into the target table (CMDB, incident, you name it). A few simple changes can help you get the most out of import sets and improve their run time.
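For context, a transform map ultimately runs a server-side script with the staging row (source) and the target record (target) in scope. Here is a minimal sketch, using the wrapper template ServiceNow generates for transform scripts; the staging column u_hostname is hypothetical:

```javascript
// Minimal transform map script sketch. 'source' is the staging row,
// 'target' is the record being written to the target table.
// u_hostname is a hypothetical staging column.
(function runTransformScript(source, map, log, target) {
    // Normalize the value from the staging row before it lands on the target
    target.name = String(source.u_hostname).toLowerCase();
})(source, map, log, target);
```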
If your imports are slower than expected, and you have already checked the transform scripts and disabled all the business rules, workflows, etc., here are 3 tips for what to do next:
Tip 1: Make sure a table cleaner is running on the import set table.
ServiceNow has an out-of-box scheduled system job named "Import Set Deleter" to maintain the import set table. If this job is disabled for any reason, the import set table will grow indefinitely. Sooner or later it will cause problems at the database level and impact the whole instance while imports are running.
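If you prefer to verify the job from a background script rather than the UI, something like this sketch works; it assumes the job lives in the sysauto_script table under the name used above:

```javascript
// Background-script sketch: confirm the "Import Set Deleter" scheduled job
// exists and is active. Script-type scheduled jobs live in sysauto_script.
var job = new GlideRecord('sysauto_script');
job.addQuery('name', 'Import Set Deleter');
job.query();
if (job.next()) {
    gs.info('Import Set Deleter - active: ' + job.getValue('active') +
            ', run type: ' + job.getValue('run_type'));
} else {
    gs.info('Import Set Deleter job not found - import sets will accumulate!');
}
```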
Keep in mind that the import set table only stores import set metadata. The raw data to be imported is stored in a staging table, which extends the import set row table. So when the cleanup job executes, it not only deletes import sets older than 7 days, it also deletes all the raw data and transformation history associated with those import sets, which can be millions of rows per import set.
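Because every staging table extends sys_import_set_row, you can gauge how much raw data each import set is holding by aggregating over that base table. A hedged sketch:

```javascript
// Sketch: count staging rows per import set on the base table
// sys_import_set_row (every staging table extends it, so this covers all).
var agg = new GlideAggregate('sys_import_set_row');
agg.addAggregate('COUNT');
agg.groupBy('sys_import_set');
agg.query();
while (agg.next()) {
    gs.info('Import set ' + agg.getValue('sys_import_set') + ': ' +
            agg.getAggregate('COUNT') + ' rows');
}
```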
Here is a snapshot of the "Import Set Deleter" job; make sure it is set to execute every day:
Tip 2: Further reduce the staging table retention.
If you have nightly imports, you may find that the import set row table stays at multiple millions of rows even after the daily cleanup. That is because, by default, the cleaner only deletes data older than 7 days. We can achieve better performance at the database level by simply reducing the amount of data kept in the staging table.
Here is an example configuration for a more aggressive cleanup: we keep only 2 days of historical data in the import set table, meaning any import set older than 2 days is purged along with its associated data. A background-script sketch of the same logic follows.
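The sketch below mirrors that 2-day retention in script form. It is an assumption about how the deleter behaves, not the out-of-box job's exact code; where possible, adjust the retention window on the job itself instead.

```javascript
// Sketch (assumed logic, not the out-of-box job's exact script): purge
// import sets older than 2 days. Per the article, deleting an import set
// also removes its associated raw data and transformation history.
var cutoff = gs.daysAgo(2);
var importSet = new GlideRecord('sys_import_set');
importSet.addQuery('sys_created_on', '<', cutoff);
importSet.query();
while (importSet.next()) {
    importSet.deleteRecord(); // assumes cascade delete of the staging rows
}
```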
Tip 3: Check for an inflated import set row table.
"I only imported 250k items last night, but my import set table row count increased 2.5 million, Why?" It is possible that you have too many transform maps defined for that import set.
Every imported item needs to be transformed, and each transform map inserts one row into the import set row table for that item. Even if it is not the right map for the item, the row is still created; it just shows as 'Ignored'.
Let's look at an example: we imported 26k database servers from one file, and we had 11 transform maps defined on that table, each meant for a different type of CI. We ended up inserting 288k rows into the import set row table, with 90% of them merely 'Ignored'. Run this a few times and the import set row table gets too big very quickly.
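You can measure how bad the inflation is by grouping the base table by import state; the column name sys_import_state is my assumption for where that state lives:

```javascript
// Sketch: count staging rows by import state (assumed column name
// sys_import_state) to see how many rows were merely 'Ignored'.
var agg = new GlideAggregate('sys_import_set_row');
agg.addAggregate('COUNT');
agg.groupBy('sys_import_state');
agg.query();
while (agg.next()) {
    gs.info(agg.getValue('sys_import_state') + ': ' + agg.getAggregate('COUNT'));
}
```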
To achieve optimal performance, we want to keep the import set row table under 5 million rows at all times. To change this behavior, use as few transform maps as possible for each staging table/import set; ideally, the import set and the transform map should have a 1:1 relationship. In this example, we can either split the different server types into separate staging (source) tables, or consolidate all the transform maps into one big script, as sketched below.
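One hedged way to do the consolidation is a single map whose script routes each row to the right CI class instead of relying on 11 separate maps. The staging column u_type and the routing values here are hypothetical:

```javascript
// Sketch of the "one big script" consolidation: one transform map
// (targeting a base CI table) sets the target class per row, so no
// 'Ignored' rows pile up from 10 other maps. u_type is a hypothetical
// column on the staging row.
(function runTransformScript(source, map, log, target) {
    var type = String(source.u_type);
    if (type == 'windows') {
        target.sys_class_name = 'cmdb_ci_win_server';
    } else if (type == 'linux') {
        target.sys_class_name = 'cmdb_ci_linux_server';
    } else {
        ignore = true; // skip rows this map is not meant to handle
    }
})(source, map, log, target);
```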