11-17-2015 10:16 AM
Hello,
I'm new to ServiceNow. We have a daily feed of users coming in. It was set up by our ServiceNow partner, and it used to run in 1-3 hours but now takes up to 10+ hours. A new field called Last updated date was added to the CSV; it holds the date and time the record was last updated. Is there a way to create, say, an onBefore script that loads the record only if that date is today or yesterday, and skips it otherwise? That way only newly updated records come in, which would save many hours of processing time. I had ServiceNow support look at this, and they don't know why the time to load records from the staging table into the target table after the transform jumped so much. Any help is greatly appreciated.
thank you
Labels: Integrations, Scripting and Coding
11-17-2015 11:16 AM
I think you have the right idea - you can create an onBefore Transform Script (Transform Map Scripts - ServiceNow Wiki) that validates each row to decide whether to insert/update. As the link above describes, there is an ignore = true variable that, when set in an onBefore Transform Script, skips the entire row. So your condition would evaluate the date: when it falls outside today/yesterday, set ignore = true; otherwise proceed.
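For illustration, here is a minimal sketch of what that onBefore script could look like. The staging field name u_last_updated_date and the incoming date format are assumptions - adjust both to the actual import set column. The date check itself is plain JavaScript, so it can be tried outside the platform:

```javascript
// onBefore Transform Script sketch: skip rows whose Last updated date is
// older than yesterday. Inside the transform script you would write
// something like:
//
//   if (!isTodayOrYesterday(source.u_last_updated_date.toString())) {
//       ignore = true; // skip this row entirely
//   }

// Returns true when the given date/time string falls on or after the start
// of yesterday (local time). Assumes an ISO-like "YYYY-MM-DDTHH:MM:SS" value.
function isTodayOrYesterday(dateStr) {
    var parsed = new Date(dateStr);
    if (isNaN(parsed.getTime())) {
        return false; // unparseable dates are treated as stale and skipped
    }
    var startOfYesterday = new Date();
    startOfYesterday.setHours(0, 0, 0, 0);                    // midnight today...
    startOfYesterday.setDate(startOfYesterday.getDate() - 1); // ...minus one day
    return parsed >= startOfYesterday;
}
```

Note the range check (on or after the start of yesterday) instead of string equality - the field carries a time component, so an exact-match comparison would almost never be true.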
I would be curious, though, why the CSV cannot be refined at the source. That would seem to solve all the problems, rather than relying on ServiceNow to parse through the spreadsheet for you. The time your load takes also seems odd: it only takes 15-20 minutes to run our LDAP user load (over 180,000 users), so I really wonder how big this spreadsheet is.
Also, you may want to look into Data Sources (http://wiki.servicenow.com/index.php?title=Data_Sources#gsc.tab=0). We use Data Sources to retrieve data directly from the database, which lets us apply a query and filter the data before it ever reaches ServiceNow. Perhaps this is an option?
There would be a couple other ways to address this (e.g.: through coalescing, perhaps a business rule to prevent insert of entries outside the date range into the table in the first place, etc.), but the above directly answers your question about how to do it from within your Transform Map.
11-17-2015 11:28 AM
Trevor,
Thank you. For this feed they won't update the CSV file at the source or give us table access like they do for other feeds, so I am stuck with 75,000 records nightly, plus a few new ones each night. The reason it takes a while is that a few scripts run on each row to set the user ID field correctly, set the primary domain field, and scramble the password field. All of that takes time, which is why I was looking at using the Last updated field and processing only those rows. Not a great choice, but since I cannot get them to change the feed or grant me access, this is the only option available so far. Thank you for the points above.
11-17-2015 11:31 AM
ignore = true sounds like a good way to go then - it will at least stop processing all the records you don't want to update.
11-17-2015 11:34 AM
My problem is getting the script condition right: checking whether Last updated equals today's date minus one day. That is what I am currently trying to figure out; at least I know I am going in the right direction.
thanks
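One common reason that kind of check misbehaves: an equality test against "today minus one day" fails whenever the field carries a time-of-day component, because "2015-11-16 23:59:00" is never equal to "2015-11-16 00:00:00". A sketch of comparing only the date portion instead (the "YYYY-MM-DD HH:MM:SS" source format is an assumption; adjust the split if the feed differs):

```javascript
// Compare only the date portion of the incoming value against today and
// yesterday, ignoring the time-of-day part.

function toDateStr(d) {
    // Date object -> "YYYY-MM-DD" in local time
    var mm = ('0' + (d.getMonth() + 1)).slice(-2);
    var dd = ('0' + d.getDate()).slice(-2);
    return d.getFullYear() + '-' + mm + '-' + dd;
}

function rowIsFresh(dateTimeStr, now) {
    var datePart = dateTimeStr.split(' ')[0]; // drop "HH:MM:SS"
    var yesterday = new Date(now);
    yesterday.setDate(yesterday.getDate() - 1);
    return datePart === toDateStr(now) || datePart === toDateStr(yesterday);
}
```

In an onBefore Transform Script this would feed the same ignore = true pattern discussed earlier: if rowIsFresh(...) returns false, set ignore = true to skip the row.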