How to access a JDBC data source in a fix script, loop while changing the SQL each iteration, and trigger the data source and its associated transform map - the goal is to paginate the load and avoid the "max payload" error we are getting

sundar15
Kilo Contributor

When I try to execute a JDBC data source, it fails with a "max payload error". Our ServiceNow contact recommended creating a fix script that brings in a limited number of records at a time and loads them into our table in a loop until all records are loaded (using the SQL FETCH and OFFSET options). To do this I need to access the data source in the script (I think I can do that through the sys_data_source table), start a loop, and in each iteration alter the SQL, execute the data source to pull the data from Oracle, and execute the transform map associated with the data source to load the data into our ServiceNow table. The loop should continue until all records from the SQL result have been loaded into our table. A rough sketch of what I have in mind is below.
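To make the idea concrete, this is roughly the fix script I have in mind. It is only a sketch: the sql_statement field name, the GlideImportSetLoader methods, and the GlideImportSetTransformer usage are taken from community examples and still need to be verified on our release, and the sys_id, table, and column values are placeholders.

// Sketch of a paginated JDBC load (placeholders and assumptions noted above).
var DATA_SOURCE_ID = 'PUT_DATA_SOURCE_SYS_ID_HERE'; // sys_data_source record to drive
var PAGE_SIZE = 5000;   // rows per iteration; keep small enough to stay under the payload limit
var offset = 0;
var moreRows = true;

var ds = new GlideRecord('sys_data_source');
if (ds.get(DATA_SOURCE_ID)) {
    while (moreRows) {
        // 1. Rewrite the SQL for this page (Oracle 12c+ OFFSET/FETCH syntax).
        //    'my_table' and 'id' are placeholders; ORDER BY a stable key so paging is deterministic.
        ds.sql_statement = 'SELECT * FROM my_table ORDER BY id ' +
            'OFFSET ' + offset + ' ROWS FETCH NEXT ' + PAGE_SIZE + ' ROWS ONLY';
        ds.update();

        // 2. Load this page into a new import set (GlideImportSetLoader is not fully documented;
        //    verify these method names on your release).
        var loader = new GlideImportSetLoader();
        var importSetGr = loader.getImportSetGr(ds);
        loader.loadImportSetTable(importSetGr, ds);
        importSetGr.state = 'loaded';
        importSetGr.update();

        // 3. Run the transform maps attached to the import set's staging table.
        var transformer = new GlideImportSetTransformer();
        transformer.transformAllMaps(importSetGr);

        // 4. Count the rows staged for this import set to decide whether another page is needed.
        var agg = new GlideAggregate(importSetGr.getValue('table_name'));
        agg.addQuery('sys_import_set', importSetGr.getUniqueValue());
        agg.addAggregate('COUNT');
        agg.query();
        var rowsThisPage = agg.next() ? parseInt(agg.getAggregate('COUNT'), 10) : 0;

        moreRows = (rowsThisPage >= PAGE_SIZE);
        offset += PAGE_SIZE;
    }
}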

4 REPLIES

Maik Skoddow
Tera Patron

Hi

can you please rephrase your question in a way that can be understood?

And don't describe your solution but explain what your requirement / goal is.

Thanks and kind regards
Maik

DrewW
Mega Sage

Last I checked, you would need to create multiple import sets to do something like this. The system has a maximum file size and a row count at which it breaks an import up. So I'm wondering why they are proposing this as a solution.

On a side note, I have pulled 15.8 million rows from a database, and just increasing the available JVM memory to 14 GB fixed the out-of-memory issues I was having when creating all of the files that were needed for the import. So I'm wondering how much data you are trying to get and what the content of the data is, since I know you can pull very large sets of data without resorting to this kind of thing.
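In my case the JVM was the process building the import files, but if the JVM doing your JDBC pull is a MID Server, its heap is normally raised in agent/conf/wrapper-override.conf on the MID Server host, followed by a restart of the MID Server service. The values here are only examples:

# agent/conf/wrapper-override.conf (sizes in MB; pick values your host can actually spare)
wrapper.java.initmemory=2048
wrapper.java.maxmemory=14336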

sundar15
Kilo Contributor

I am trying to import an Oracle table with a BLOB column using a JDBC data source, and when I click "Load All Records" I get the following error ("Load 20 Rows" works fine). I also think it is not the number of records but the size of these BLOBs that is causing the error.

[screenshot: "Load All Records" error showing an encoded byte size limit]

Have you tried creating two different imports? One that gets all of the data except the BLOB column, and another that gets the BLOB column?

The docs say this:
https://docs.servicenow.com/bundle/rome-platform-administration/page/administer/import-sets/reference/r_MaximumRowSize.html

But your screenshot shows a different byte size limit. Have you looked to see if that limit can be changed?

 *** Edit
It just occurred to me that this limit may be the attachment size limit set on the system. Try adjusting that to see if the error goes away or the encoded byte limit changes.
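If it is the attachment limit, the property to check is com.glide.attachment.max_size (value in MB). I'm not certain that is the limit in your screenshot, so compare the numbers first. A quick background script to see the current value:

// Print the current attachment size limit; an empty value means the platform default applies.
gs.info('com.glide.attachment.max_size = ' + gs.getProperty('com.glide.attachment.max_size'));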