Has anyone successfully used "Use last run datetime:" on an Import Data Source?

hartr
Giga Contributor

I'm trying to query a large Oracle database using the "Use last run datetime:" function to get incremental data. The problem is that if I manually populate the Last Run datetime field with "to_date('2011-11-02','yyyy-mm-dd')", the query works (though only if I query all records, not if I specify the SQL). However, the Last Run datetime field doesn't then update with the date of the latest record from the previous run, as it's supposed to.

Has anyone got this feature working? Or does anyone know what controls how the Last run datetime: field is updated?

Thanks

10 REPLIES

dh4234
Kilo Expert

The last run time value is created by SN each time the import runs; it does not write any date back to your Oracle database. One issue I have seen is that when a record is updated, the date and time of that update isn't recorded in the Oracle database.


hartr
Giga Contributor

Got this to work with help from SNC - the field name was in the wrong case. So now the last run date is being populated.
My only remaining problem is that the last run function only works when doing a select-all-rows query, not with a specific SQL query.
I'd still like to know what constructs the SQL query when the last run function is used, and whether it can be customised.


The only area I'm aware of that lets you do some customization around "last run datetime" is the business rule named "Format last run datetime".

This lets you format the date time string a bit.
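For illustration, this is roughly the kind of Oracle-friendly literal that the formatted value needs to end up as in the generated comparison. This is only a sketch: the SOURCE_TABLE and LAST_UPDATE names are placeholders, not anything from a real data source.

    -- What the appended predicate needs to look like for Oracle;
    -- the table and column names below are purely illustrative.
    SELECT *
    FROM SOURCE_TABLE
    WHERE LAST_UPDATE >= to_date('2011-11-02 00:00:00', 'yyyy-mm-dd hh24:mi:ss');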


I found this post when we started changing some of our processes to use Last Run Datetime, so I thought I'd add a few hints for anyone else who finds it.

You can use a specific SQL query with the Last Run Date options. The key is that the only thing the compiler does is append 'WHERE [DateField] >= [DateTime]' to the end of whatever query you write. Because of that, you can't have a WHERE clause at the end of your own query. We solved this by putting the filters we wanted in a subquery (sketched below). Not ideal, but the database we're hitting can handle it better than all that extra data hitting ServiceNow.
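To make that concrete, here's a rough sketch of the subquery approach. The table, column, and filter names are made up for the example, and the trailing comment shows the sort of predicate the import appends - not the exact text ServiceNow generates.

    -- Putting our own filters inside an inline view keeps the top level free
    -- of a WHERE clause, so the appended predicate doesn't collide with ours.
    SELECT filtered.*
    FROM (
        SELECT src.*
        FROM SOURCE_TABLE src
        WHERE src.REGION = 'EMEA'      -- our own filter, tucked into the subquery
    ) filtered
    -- The import then tacks on something along these lines:
    -- WHERE LAST_UPDATE >= to_date('2011-11-02 00:00:00', 'yyyy-mm-dd hh24:mi:ss')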

The other thing we weren't happy about is the '>=' in that comparison. With the nature of the data pulls we've converted, our sources are only updated once daily, and all have the same update time, so each day's run pulls yesterday's data as well as today's. It's only a minor inconvenience, though, and I know that with some sources you might miss a record or two if it were just '>'.
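For anyone weighing the two operators, the difference is just the boundary rows (again, the names here are illustrative only):

    -- '>=' re-selects rows stamped exactly at the stored last-run datetime,
    -- which is why we see yesterday's batch again when every row shares one
    -- update time; '>' would skip them, at the risk of missing boundary rows.
    SELECT *
    FROM SOURCE_TABLE
    WHERE LAST_UPDATE >= to_date('2011-11-03 02:00:00', 'yyyy-mm-dd hh24:mi:ss');
    -- versus the stricter form:
    -- WHERE LAST_UPDATE >  to_date('2011-11-03 02:00:00', 'yyyy-mm-dd hh24:mi:ss')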