
Improving Workflow Automation for Dynamic Menu Data in ServiceNow

tanveerceo7

Hey everyone,

I’ve been experimenting with automating dynamic data updates in ServiceNow using a custom workflow triggered by external APIs.
As a practical test, I simulated an integration with a dataset similar to what a company like Dunkin' Donuts might use for its menu: frequently changing items, pricing, and availability.

Here’s the setup (a simplified sketch of the endpoint script follows the list):

  • 🔄 API integration through a scripted REST API in ServiceNow.

  • 🕒 Scheduled job every hour to sync menu changes.

  • 📊 Data stored in a custom table for menu analytics (items, categories, timestamps).
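
For context, the inbound endpoint looks roughly like the sketch below. The table (u_menu_item) and field names are simplified placeholders, not my actual schema:

```javascript
// Scripted REST API resource script (POST) that upserts menu items
// into a custom table. Table and field names are illustrative placeholders.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var items = request.body.data.items || [];
    var processed = 0;

    for (var i = 0; i < items.length; i++) {
        var item = items[i];

        // Match on the external item id so repeated syncs update
        // in place instead of creating duplicates
        var gr = new GlideRecord('u_menu_item');
        gr.addQuery('u_item_id', item.id);
        gr.query();

        var exists = gr.next();
        if (!exists) {
            gr.initialize();
            gr.setValue('u_item_id', item.id);
        }
        gr.setValue('u_name', item.name);
        gr.setValue('u_price', item.price);
        gr.setValue('u_category', item.category);
        gr.setValue('u_available', item.available);

        if (exists) {
            gr.update();
        } else {
            gr.insert();
        }
        processed++;
    }

    response.setStatus(200);
    response.setBody({ processed: processed });
})(request, response);
```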

While the automation works well for smaller updates, I’m seeing noticeable delays when syncing larger datasets (e.g., image-heavy content or nested JSON).
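
For concreteness, here’s the shape of a paged pull from the scheduled job that I’m considering to keep individual payloads small; the endpoint URL and paging parameters are made up:

```javascript
// Hourly Scheduled Script Execution: fetch the external menu feed in
// fixed-size pages so no single response carries the whole dataset.
// Endpoint and query parameters are placeholders.
var page = 1;
var pageSize = 200;
var more = true;

while (more) {
    var rm = new sn_ws.RESTMessageV2();
    rm.setEndpoint('https://api.example.com/menu?page=' + page + '&size=' + pageSize);
    rm.setHttpMethod('get');

    var resp = rm.execute();
    if (resp.getStatusCode() != 200) {
        gs.error('Menu sync failed on page ' + page + ' (HTTP ' + resp.getStatusCode() + ')');
        break;
    }

    var body = JSON.parse(resp.getBody());
    // Upsert body.items into u_menu_item, same pattern as the endpoint script

    // Stop when a short page signals the end of the feed
    more = body.items && body.items.length == pageSize;
    page++;
}
```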

So, my questions for the community are:

  1. What’s the best practice for optimizing ServiceNow REST API calls for large payloads?

  2. Has anyone tried queue-based data processing (like Flow Designer + IntegrationHub) for similar use cases?

  3. Is there a better caching or delta-sync strategy within ServiceNow to reduce sync times? (Rough sketch of what I mean after this list.)
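
On question 3, this is the rough delta-sync shape I have in mind, assuming the source API accepts an updated_since filter (the system property name is a placeholder):

```javascript
// Delta sync: only request items changed since the last successful run,
// tracked as a high-water mark in a system property (placeholder name).
var lastSync = gs.getProperty('x_menu.sync.last_run', '1970-01-01 00:00:00');
var runStart = new GlideDateTime().getValue();

var rm = new sn_ws.RESTMessageV2();
rm.setEndpoint('https://api.example.com/menu?updated_since=' + encodeURIComponent(lastSync));
rm.setHttpMethod('get');

var resp = rm.execute();
if (resp.getStatusCode() == 200) {
    var changed = JSON.parse(resp.getBody()).items || [];
    // Upsert only the changed records (same pattern as above)

    // Advance the high-water mark only after a clean run, so a failed
    // sync gets retried from the same point next hour
    gs.setProperty('x_menu.sync.last_run', runStart);
}
```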

Would appreciate insights or examples from anyone who’s tackled similar large-scale external data integrations.

Thanks in advance!
