
Scheduled Job Memory Drain Problems

JayGervais
Kilo Sage

I am wondering if anyone has some advice on this situation?

I created a scheduled job to import laptop data for our organization (currently in Test). We have around 50k devices. The job is scheduled to run once a week on Friday to avoid slowing down the instance during work hours. It works by calling a REST Message to the Google API to collect data, then uses an import table to copy the devices to the cmdb_ci_computer table. There is also an onAfter script, which updates the states and substates on the alm_hardware table for the associated record.
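For context on the collection step: Google's directory-style APIs typically page results via a `nextPageToken`, so the REST call usually has to loop until the token runs out. The sketch below is plain, runnable JavaScript (not the actual REST Message setup); `fetchPage` is a stand-in stub for the real API call, and the canned pages are purely illustrative.

```javascript
// Illustrative paged-collection loop for an API that returns
// { devices: [...], nextPageToken: "..." }. fetchPage is a stub
// standing in for the real REST Message call.
function collectAllDevices(fetchPage) {
  const all = [];
  let token; // undefined on the first call → first page
  do {
    const page = fetchPage(token); // one REST call per page
    all.push(...page.devices);
    token = page.nextPageToken;   // null/undefined ends the loop
  } while (token);
  return all;
}

// Canned pages standing in for the Google API response.
const pages = new Map([
  [undefined, { devices: [{ id: "a" }, { id: "b" }], nextPageToken: "t1" }],
  ["t1", { devices: [{ id: "c" }], nextPageToken: null }],
]);

const devices = collectAllDevices(t => pages.get(t));
console.log(devices.length); // 3
```

One practical note: if each page is inserted into the import set as it arrives (rather than accumulating all 50k records in memory first), the job's memory footprint stays roughly constant per page.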

This has been working correctly, but it runs non-stop and takes nearly a week to get through all the records. It also triggered a P2 ticket for draining the instance's memory, but I was not able to get advice from ServiceNow because this is a custom creation.

Has anyone done anything like this before who could offer advice, or is there a better way to import large amounts of data than what I am doing?

5 REPLIES

AshishKM
Kilo Patron

Hi @JayGervais

ServiceNow rarely gives advice on custom items. 50K is not too big, but it seems like the processing on the ServiceNow side is taking a long time due to the transform map scripts.

Can you break the total record set into two smaller sets for two separate jobs (just thinking) and compare the results?
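The splitting idea above can be sketched generically: chunk the full device list into fixed-size batches so each batch can be handed to its own import run. This is plain JavaScript for illustration only; the batch size and the way batches map onto separate scheduled jobs are assumptions, not the actual instance configuration.

```javascript
// Generic chunking helper: split one large array into fixed-size
// batches so each batch can be processed as its own import run.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// 50k devices split into batches of 10k → 5 separate runs.
const allDevices = Array.from({ length: 50000 }, (_, i) => ({ id: i }));
const batches = chunk(allDevices, 10000);
console.log(batches.length);    // 5
console.log(batches[4].length); // 10000
```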

 

-Thanks,

AshishKM



Thanks Ashish. I will see if I can figure out a way to separate it.

James Chun
Kilo Patron

Hey @JayGervais,

 

50k doesn't sound too big (unless individual records are large), and running for a week is very odd.

Can you check the following:

  • How long does it take to finish invoking the Google API (i.e., how long does it take to import all records into the import set)?
  • How long does it take to finish the data transformation?
  • Is there a need to import all 50k devices every week? Is it possible to add a filter, such as retrieving only the ones that were updated or created?
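The third point (a delta import) can be sketched as a simple timestamp filter: keep only devices changed since the last successful run. This plain-JavaScript sketch is illustrative; the `lastSync` field name is an assumption, and in a real job the last-run timestamp would come from a stored property rather than a hard-coded date.

```javascript
// Delta-import sketch: keep only devices changed since the last
// successful run instead of reimporting all 50k every week.
// lastRunTime would come from a stored property in a real job.
function changedSince(devices, lastRunTime) {
  return devices.filter(d => new Date(d.lastSync) > lastRunTime);
}

const lastRunTime = new Date("2024-01-05T00:00:00Z");
const deviceList = [
  { id: "a", lastSync: "2024-01-04T12:00:00Z" }, // unchanged, skipped
  { id: "b", lastSync: "2024-01-06T09:00:00Z" }, // updated, kept
];
console.log(changedSince(deviceList, lastRunTime).map(d => d.id)); // [ 'b' ]
```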

Cheers

 

Thanks James. I agree that 50k doesn't seem like it should be too much. I could possibly filter, but I do need all of them to track state changes. I could maybe ignore devices that are already retired, though. Thanks for the comment!