Improving Workflow Automation for Dynamic Menu Data in ServiceNow
Hey everyone,
I’ve been experimenting with automating dynamic data updates in ServiceNow using a custom workflow triggered by external APIs.
As a practical test, I simulated an integration with the kind of dataset a company like Dunkin' Donuts might publish for its menu: frequently changing items, pricing, and availability.
Here’s the setup (a trimmed-down sketch of the sync script follows the list):
🔄 API integration through a scripted REST API in ServiceNow.
🕒 Scheduled job every hour to sync menu changes.
📊 Data stored in a custom table for menu analytics (items, categories, timestamps).
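For concreteness, here's roughly what the hourly sync script looks like. Table, field, and endpoint names are placeholders from my instance, and error handling is stripped down:

```javascript
// Scheduled Script Execution, runs hourly. 'u_menu_item' and the u_*
// fields are the custom analytics table mentioned above; the endpoint
// URL is a placeholder.
(function syncMenu() {
    var req = new sn_ws.RESTMessageV2();
    req.setHttpMethod('get');
    req.setEndpoint('https://example.com/api/menu'); // placeholder

    var resp = req.execute();
    if (resp.getStatusCode() != 200) {
        gs.error('Menu sync failed with HTTP ' + resp.getStatusCode());
        return;
    }

    var items = JSON.parse(resp.getBody()).items || [];
    for (var i = 0; i < items.length; i++) {
        // Upsert on the external id so re-runs stay idempotent
        var gr = new GlideRecord('u_menu_item');
        gr.addQuery('u_external_id', items[i].id);
        gr.query();
        var exists = gr.next();
        if (!exists) {
            gr.initialize();
            gr.setValue('u_external_id', items[i].id);
        }
        gr.setValue('u_name', items[i].name);
        gr.setValue('u_category', items[i].category);
        gr.setValue('u_price', items[i].price);
        gr.setValue('u_available', items[i].available);
        if (exists) {
            gr.update();
        } else {
            gr.insert();
        }
    }
})();
```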
While the automation works well for smaller updates, I’m seeing noticeable delays when syncing larger datasets (e.g., image-heavy content or nested JSON).
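My current suspicion is the inline per-record upsert loop, since the whole payload is processed in a single execution. One workaround I've been testing, before going the Flow Designer route, is chunking the payload and fanning it out over the event queue, with a Script Action doing the upserts per chunk. The event name and chunk size here are placeholders from my instance:

```javascript
// Replace the inline upsert loop from the sketch above with fan-out:
// queue one event per chunk and let a Script Action subscribed to
// 'x_menu.sync.chunk' (a custom event I registered; placeholder name)
// do the upserts. Caveat: parm1 is a bounded string field, so for big
// chunks it's safer to stage rows in a table and pass only a batch id.
var CHUNK_SIZE = 200; // tuning knob, not a magic number
for (var start = 0; start < items.length; start += CHUNK_SIZE) {
    var chunk = items.slice(start, start + CHUNK_SIZE);
    gs.eventQueue('x_menu.sync.chunk', null, JSON.stringify(chunk), String(start));
}
```

This spreads the work out, but serializing chunks through event parms feels brittle, hence my second question below.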
So, my questions for the community are:
What’s the best practice for optimizing ServiceNow REST API calls for large payloads?
Has anyone tried queue-based data processing (like Flow Designer + IntegrationHub) for similar use cases?
Is there a better caching or delta-sync strategy within ServiceNow to reduce load times? (A rough sketch of my current attempt is below.)
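On the delta-sync question, the furthest I've gotten is a watermark stored in a system property, assuming the external API can filter by an updated_since parameter. The property name, endpoint, and query parameter are all placeholders:

```javascript
// Delta sync: only pull records changed since the last successful run.
// 'x_menu.last_sync' is a custom system property (placeholder name);
// assumes the external API accepts an 'updated_since' filter.
var lastSync = gs.getProperty('x_menu.last_sync', '1970-01-01 00:00:00');

var req = new sn_ws.RESTMessageV2();
req.setHttpMethod('get');
req.setEndpoint('https://example.com/api/menu'); // placeholder
req.setQueryParameter('updated_since', lastSync);

var resp = req.execute();
if (resp.getStatusCode() == 200) {
    var items = JSON.parse(resp.getBody()).items || [];
    // ... upsert as in the first sketch ...
    // Only advance the watermark after a clean run, so a failed sync
    // gets retried naturally on the next hourly execution.
    gs.setProperty('x_menu.last_sync', new GlideDateTime().getValue());
}
```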
Would appreciate insights or examples from anyone who’s tackled similar large-scale external data integrations.
Thanks in advance!