
Transform script suggestion

sainath3
Giga Guru

Hello all,

I'm seeking your input on a scenario we're currently handling.

 

Current Flow

  • A third-party Talend application sends 5,000 records daily to a ServiceNow (SNOW) transform map.

  • In the onBefore script, we perform validation:

    • If any field is empty, we create an Incident (INC); a sketch of this check follows.
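For context, that validation might look like the following minimal sketch; the staging field names (u_product_number, u_source, u_reference_id, u_state, u_private_key) are assumptions, not the actual column names:

```javascript
// onBefore transform script: if any required field on the incoming row is
// empty, open an Incident and skip transforming that row.
(function runTransformScript(source, map, log, target) {

    var required = ['u_product_number', 'u_source', 'u_reference_id', 'u_state', 'u_private_key'];

    for (var i = 0; i < required.length; i++) {
        if (!source.getValue(required[i])) {
            var inc = new GlideRecord('incident');
            inc.initialize();
            inc.short_description = 'Talend import validation failed: ' + required[i] + ' is empty';
            inc.description = 'Import set row sys_id: ' + source.getUniqueValue();
            inc.insert();

            ignore = true; // skip this row; nothing is written to the target table
            break;
        }
    }

})(source, map, log, target);
```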

Sample Data

| Product Number | Source   | Reference ID      | State    | Private Key |
|----------------|----------|-------------------|----------|-------------|
| 2345           | LinkDin  | 2345LinkDin#abc   | Open     | 987098      |
| 8987           | Noukri   | 8987Noukri#efg    | Resolved | 453672      |
| 1212           | Freshers | 1212Freshers#asdf | Open     | 543878      |
 

As shown, the Source field can be one of: LinkDin, Noukri, or Freshers.

 New Requirement

  • Talend will continue sending records in the same format, but:

    • All records will have Source = Noukri.

    • Most fields will contain inaccurate data, except for the Private Key.

  • We need to:

    1. Extract the Private Key.

    2. Send it to another third-party application.

    3. If valid data is returned, update the import set record with the correct Source and other fields.

    4. If no data is returned, keep the existing data in the import set.

    5. Perform validation as usual (steps 1-4 are sketched just after this list).
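For illustration, steps 1-4 for a single row might look like the following (callable from an onBefore script as enrichImportRow(source)). The endpoint URL, response shape, and field names are all assumptions, not the real third-party contract:

```javascript
// Per-row enrichment sketch: extract the Private Key, look it up in the
// third-party system, and update the import set row only when data comes back.
function enrichImportRow(row) {
    var privateKey = row.getValue('u_private_key');              // step 1: extract the key

    var rm = new sn_ws.RESTMessageV2();                          // step 2: call the 3rd party
    rm.setEndpoint('https://thirdparty.example.com/api/lookup'); // hypothetical endpoint
    rm.setHttpMethod('get');
    rm.setQueryParameter('privateKey', privateKey);
    var resp = rm.execute();

    if (resp.getStatusCode() == 200 && resp.getBody()) {         // step 3: valid data returned
        var data = JSON.parse(resp.getBody());
        if (data && data.source) {
            row.setValue('u_source', data.source);               // corrected Source
            if (data.state)
                row.setValue('u_state', data.state);             // other corrected fields
            row.update();
        }
    }
    // step 4: if nothing is returned, the row simply keeps its original data
}
```

Calling this once per row is what makes 5,000 daily records a concern, which motivates the proposal below.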

 Proposed Solution

  • Use a Script Action triggered from the onStart script:

    • Fetch all 5,000 records from the import set.

    • Send them to the third-party application.

    • Based on the response, update the import set records.

    • Let onBefore/onAfter scripts handle validation and transformation (see the sketch after this list).
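As a rough sketch of that onStart approach, assuming the staging table is named u_talend_import, the third party exposes a hypothetical bulk-lookup endpoint, and its response is keyed by Private Key:

```javascript
// onStart transform script sketch: enrich all rows in one bulk call before
// the row-by-row transform begins. Table, field names, endpoint, and response
// shape are assumptions for illustration.
(function runTransformScript(source, map, log, target /*undefined onStart*/) {

    // Collect every Private Key in this import set.
    // (import_set should be available in onStart/onComplete transform scripts.)
    var keys = [];
    var row = new GlideRecord('u_talend_import');
    row.addQuery('sys_import_set', import_set.sys_id);
    row.query();
    while (row.next()) {
        keys.push(row.getValue('u_private_key'));
    }

    // One bulk call instead of 5,000 single calls, assuming the API allows it.
    var rm = new sn_ws.RESTMessageV2();
    rm.setEndpoint('https://thirdparty.example.com/api/bulk-lookup'); // hypothetical
    rm.setHttpMethod('post');
    rm.setRequestHeader('Content-Type', 'application/json');
    rm.setRequestBody(JSON.stringify({ privateKeys: keys }));
    var resp = rm.execute();
    if (resp.getStatusCode() != 200) {
        log.warn('Bulk lookup failed (' + resp.getStatusCode() + '); rows keep their original data');
        return;
    }

    // Assumed response shape: { "987098": { "source": "Noukri", ... }, ... }
    var data = JSON.parse(resp.getBody());
    row = new GlideRecord('u_talend_import');
    row.addQuery('sys_import_set', import_set.sys_id);
    row.query();
    while (row.next()) {
        var hit = data[row.getValue('u_private_key')];
        if (hit && hit.source) {
            row.setValue('u_source', hit.source); // plus any other corrected fields
            row.update();
        }
        // No hit: the row keeps its existing data, per requirement 4.
    }

})(source, map, log, target);
```

Even batched like this, the outbound call runs synchronously inside the transform, which is exactly the delay risk described under Challenge below.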

 Challenge

  • Processing all 5,000 records in the onStart script is time-consuming.

  • This may cause delays or stuck records in the Import Set, potentially cancelling the job.

 Request

Could anyone suggest a more efficient approach to handle this scenario? Open to ideas around batching, asynchronous processing, or architectural changes.

Thanks in advance!

 


1 ACCEPTED SOLUTION

Ankur Bawiskar
Tera Patron

@sainath3 

Validating a chunk of 5,000 records in the onStart transform script will hurt performance, since you would call the 3rd-party API for all 5,000 records and only then perform the further processing.

I would suggest asking the 3rd-party team whether they can send accurate data and validate it against the external API on their side, rather than having it done in the ServiceNow platform.

💡 If my response helped, please mark it as correct ✔️ and close the thread 🔒 — this helps future readers find the solution faster! 🙏

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader


4 REPLIES

Ricardo26
Giga Guru

Hi,

From what I understand, you don't need to send all 5,000 private keys in each import. You only need to send the request to the third-party system when a private key is updated or a new record is inserted.

If the API supports fetching the data for many private keys at once, you can create an array in an onStart script, then compare the current and previous private keys as each row is processed; if they differ, add the key to the array. Finally, in an onComplete script, trigger the integration and update the fields.
Another option would be an async business rule that triggers the integration whenever a private key changes.
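A sketch of the first option, assuming (a) transform scripts within a single run share scope, so an array declared in onStart is visible to the later scripts, and (b) a hypothetical bulk endpoint. The key comparison is shown in an onBefore script rather than onAfter, since there target still holds the previous value; field and endpoint names are placeholders:

```javascript
// --- onStart script: shared array for the whole transform run ---
var changedKeys = [];

// --- onBefore script (runs per row; target still holds the previous values) ---
// New records have an empty target key, so inserts are collected as well.
if (source.getValue('u_private_key') != target.getValue('u_private_key')) {
    changedKeys.push(source.getValue('u_private_key'));
}

// --- onComplete script: one integration call for all collected keys ---
if (changedKeys.length > 0) {
    var rm = new sn_ws.RESTMessageV2();
    rm.setEndpoint('https://thirdparty.example.com/api/bulk-lookup'); // hypothetical
    rm.setHttpMethod('post');
    rm.setRequestHeader('Content-Type', 'application/json');
    rm.setRequestBody(JSON.stringify({ privateKeys: changedKeys }));
    var resp = rm.execute();
    // Parse resp.getBody() and update the affected target records here.
}
```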


Thanks Ankur,

 

As mentioned, I've reached out to the 3rd-party team about the SNOW performance issue, and they've provided some filters. After applying these filters, the record count has been reduced to 1,000. However, the process we're following seems to have a recurring issue.

 

On day one, when we receive around 1,000 records from the 3rd party, approximately 950 are processed and the remaining 50 are ignored. The following day, the 3rd party sends around 1,000 records again, but 90% of the data is the same as the previous day's.

 

Here's the process I'm currently following. I would appreciate it if you could review it and let me know if there are any areas for improvement or if I am missing something:

 

Step 1:

A 'before' business rule on the import set (staging) table:

- Filter 1: Based on the filter provided by the 3rd party, mark the flag as true. This leaves about 1,000 records.

- Filter 2: Check whether the unique key already exists in the target table. If it does, mark the flag as true; otherwise, mark it as false. This results in just 50 records.

 

Step 2:

In the onStart transform script:

- Send those 50 records to the 3rd party via an API call. Based on the result, update or create the records in the import set table with accurate data.

- Then, the onBefore and onAfter scripts will execute as they did previously. (Both steps are sketched just below.)
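For reference, here is how the two steps might look. This is a sketch only: the staging table (u_talend_import), target table (u_target_table), and field names (u_unique_key, u_exists_flag, u_private_key) are hypothetical placeholders.

```javascript
// Step 1: "before insert" business rule on the staging table.
// Filter 2: flag = true when the unique key already exists in the target table.
(function executeRule(current, previous /*null when async*/) {

    var tgt = new GlideRecord('u_target_table');    // hypothetical target table
    tgt.addQuery('u_unique_key', current.getValue('u_unique_key'));
    tgt.setLimit(1);
    tgt.query();

    current.u_exists_flag = tgt.hasNext();          // true for the ~950 known keys

})(current, previous);
```

```javascript
// Step 2 (onStart transform script): send only the ~50 new, unflagged rows.
var row = new GlideRecord('u_talend_import');       // hypothetical staging table
row.addQuery('sys_import_set', import_set.sys_id);
row.addQuery('u_exists_flag', false);
row.query();
while (row.next()) {
    // Call the third-party API with row.u_private_key and update/create the
    // import set row with accurate data, as in the earlier onStart sketch.
}
```

One thing to watch: since the 3rd party resends mostly the same data each day, the Filter 2 lookup runs once per incoming row, so an index on the unique key column of the target table helps keep the business rule cheap.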

 

Could you please review this process and let me know if there are any improvements or adjustments needed? Your input and suggestions would be greatly appreciated.

 

Thank you for your attention to this matter. I look forward to your feedback.

@sainath3 

Looks fine to me.

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader