Gen AI Controller output cuts off despite recursive summarization
12-11-2024 02:14 AM
Hi all,
I'm working with the Gen AI Controller on a Xanadu instance, following the steps in this blog post:
https://developer.servicenow.com/blog.do?p=/post/generative-ai-controller/
Additionally, as I expect the output of the LLM to be large, I have enabled recursive summarization as detailed in the documentation here (also see screenshot):
https://www.servicenow.com/docs/bundle/xanadu-intelligent-experiences/page/administer/generative-ai-...
Nonetheless, the LLM output is cut off/incomplete when I test the Generate Content functionality in Workflow Studio (see screenshot; it should be a full JSON object, but only about half of it is returned).
I know the full output is obtainable, because I have another version that makes an outbound REST request from a business rule script. Ideally, though, we would want to replace that with the OOB functionality of the Gen AI Controller.
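For reference, the working version is essentially an outbound REST call fired from a business rule, along the lines of the sketch below (the endpoint, API-key property, and target field are simplified placeholders, not our real names):

```javascript
// Simplified sketch of the current working approach: call the LLM endpoint
// directly via RESTMessageV2 from a business rule and store the full body.
(function callLlmDirectly(current) {
    var rm = new sn_ws.RESTMessageV2();
    rm.setEndpoint('https://llm.example.com/v1/generate');              // placeholder endpoint
    rm.setHttpMethod('POST');
    rm.setRequestHeader('Content-Type', 'application/json');
    rm.setRequestHeader('Authorization', 'Bearer ' + gs.getProperty('x_my_app.llm.api_key')); // placeholder property
    rm.setRequestBody(JSON.stringify({
        prompt: 'Generate the JSON object for record ' + current.getUniqueValue()
    }));

    var resp = rm.execute();
    if (resp.getStatusCode() == 200) {
        // The complete response body arrives here, no truncation.
        current.setValue('u_llm_output', resp.getBody());               // hypothetical field
    } else {
        gs.error('LLM call failed: ' + resp.getStatusCode() + ' ' + resp.getBody());
    }
})(current);
```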
Does anyone have an idea of how I can actually get and store the full output of the LLM?
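In case it helps frame the question: my understanding is that Gen AI Controller capabilities can also be invoked from a server-side script via OneExtendUtil, roughly as sketched below. The capability sys_id is a placeholder, the payload key and response structure are assumptions I haven't verified against the docs, and u_genai_output is a hypothetical field, but this is the kind of place where I'd want to store the full, untruncated output.

```javascript
// Rough, unverified sketch: invoke a Gen AI Controller capability from script
// and store the full response. Request/response shapes may differ per release.
(function generateAndStore(current) {
    var capabilityId = '<sys_id of the Generate Content capability>';   // placeholder
    var request = {
        executionRequests: [{
            capabilityId: capabilityId,
            payload: {
                prompt: 'Return the full JSON object for record ' + current.getUniqueValue() // assumed payload key
            }
        }]
    };

    var response = sn_one_extend.OneExtendUtil.execute(request);

    // Assumed response path; log the raw object first to confirm the structure.
    var output = response && response.capabilities && response.capabilities[capabilityId] ?
        response.capabilities[capabilityId].response : '';
    gs.info('GenAI output length: ' + output.length);

    current.setValue('u_genai_output', output);                         // hypothetical field
})(current);
```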
Cheers
08-11-2025 03:56 PM
I don't have a solution to your problem, but I wanted to get your thoughts on why you want to migrate to the Gen AI Controller. I struggle with the value proposition of the Gen AI Controller, as writing a simple outbound REST call is much easier than configuring the Gen AI Controller.