
Upload records to S3

vishwanmnai
Tera Contributor

Hi, we have a requirement to upload records from a log table to an S3 bucket. The integration works fine and the records upload successfully, but since there can be 100k records or more, it takes a long time: in sub-prod we are seeing about 84k records in 2.5 hours. I am following a sync model, and I created a sys_trigger record set to run daily on "active nodes". Even then, the speed is quite slow.
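For reference, the upload logic in the script include is essentially the following (simplified here with placeholder table, field, and endpoint names; the AWS authentication is omitted):

// Simplified sketch of the current per-record upload. 'u_app_log',
// 'u_s3_uploaded', the endpoint, and the auth handling are
// placeholders, not the real configuration.
var gr = new GlideRecord('u_app_log');
gr.addQuery('u_s3_uploaded', false);
gr.query();

while (gr.next()) {
    var body = JSON.stringify({
        sys_id: gr.getUniqueValue(),
        message: gr.getValue('u_message'),
        created: gr.getValue('sys_created_on')
    });

    // One outbound REST call per record, which I suspect is the
    // main bottleneck at 100k records.
    var rm = new sn_ws.RESTMessageV2();
    rm.setHttpMethod('PUT');
    rm.setEndpoint('https://my-bucket.s3.amazonaws.com/logs/' + gr.getUniqueValue() + '.json');
    rm.setRequestHeader('Content-Type', 'application/json');
    rm.setRequestBody(body); // AWS SigV4 signing omitted in this sketch

    var response = rm.execute();
    if (response.getStatusCode() == 200) {
        gr.setValue('u_s3_uploaded', true);
        gr.setWorkflow(false); // don't re-trigger business rules on the flag update
        gr.update();
    }
}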

The script include called by the sys_trigger queries the log table directly, and I am assuming this is an inefficient way to deal with things, because how would ServiceNow know whether a record has already been uploaded by the job running on another node?
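The only idea I've had so far is to "claim" each record with a state field before uploading, something like this (the u_s3_state field is made up for illustration):

// Hypothetical claim-before-upload pattern. 'u_s3_state' is an
// invented choice field on the log table: pending -> in_progress -> done.
var gr = new GlideRecord('u_app_log');
gr.addQuery('u_s3_state', 'pending');
gr.setLimit(1000); // work in batches per run
gr.query();

while (gr.next()) {
    // Claim the record first so a job on another node is less likely
    // to pick it up. This is not an atomic compare-and-swap, though,
    // so as far as I can tell a small race window remains.
    gr.setValue('u_s3_state', 'in_progress');
    gr.setWorkflow(false);
    gr.update();

    // uploadToS3() stands in for the existing upload logic above.
    if (uploadToS3(gr)) {
        gr.setValue('u_s3_state', 'done');
    } else {
        gr.setValue('u_s3_state', 'pending'); // release the claim on failure
    }
    gr.update();
}

But this still doesn't feel node-safe, hence the question.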


Is there a better way to do this? I thought about queuing an event for each record in the table, with the record's sys_id passed in parm1, and then having a Script Action pick up that sys_id and upload the record to S3, roughly as sketched below.
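Something like this, with a placeholder event name and table:

// 1) After-insert business rule on the log table queues one event per record.
//    'x_myapp.log.upload_s3' would be a new Event Registry entry.
(function executeRule(current, previous) {
    gs.eventQueue('x_myapp.log.upload_s3', current, current.getUniqueValue());
})(current, previous);

// 2) Script Action registered against 'x_myapp.log.upload_s3'.
//    Script Actions receive the queued event via the 'event' variable.
var logGr = new GlideRecord('u_app_log');
if (logGr.get(event.parm1)) {
    uploadToS3(logGr); // stands in for the existing upload logic
}

I'm not sure how well the event queue would cope with ~100k events a day, though. Is this feasible, or do you guys have any other ideas?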
