WHY convert existing NLU based virtual agent topics to LLM based?
3 weeks ago
We have around 500 NLU based virtual agent topics.
Customer is asking us to migrate all of them to LLM based topics.
I am not getting clarity on WHY we should convert existing NLU-based virtual agent topics to LLM-based ones. What benefit will I get if I convert?
PS - Please reply only if you have experience with or have worked on this MIGRATION.
TIA!!!
3 weeks ago
So when a customer asks to migrate all ~500 NLU topics, in my experience it's almost never because every topic truly needs to be LLM-based. It's usually a signal that the NLU estate has become painful to manage.
I've seen this happen a few times now. Once you cross a few hundred topics, teams spend more time maintaining intents than improving the Virtual Agent experience. Utterance tuning, overlap between intents, retraining, and missed matches become a constant background task. The "migrate all 500" request is often really saying:
“We’re tired of babysitting NLU.”
Another big driver is discovery. With that many topics, users don’t phrase things the way the intents expect. Even well-built NLU models start to miss, and users hit fallback too often. Customers always hear that LLM-based routing handles natural language better and assume converting everything will fix discovery across the board.
There's also usually topic sprawl, if you know what I mean... In every large estate I've ever worked with, only a fraction of the topics actually get meaningful traffic. The rest exist because they were added over time and never retired. LLM feels like a way to avoid cleaning that up.
That said, when we actually implemented migrations, we never moved all topics at once. The team and I focused on:
High-volume, discovery-heavy topics
Topics where NLU tuning was clearly not paying off
Areas with lots of phrasing variability or multi-language usage
Highly deterministic flows, like password resets and simple catalog orders, stayed on NLU initially because they were already working fine and didn't gain much from LLM routing. A rough sketch of how we shortlisted candidates is below.
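For what it's worth, here is a minimal sketch (in Python) of the kind of shortlisting logic that picks the first migration wave from usage data. The field names (monthly_conversations, fallback_rate) and the scoring rule are placeholders for whatever your own Virtual Agent conversation analytics export gives you, not any ServiceNow API.

# Hypothetical shortlisting sketch; field names and weights are assumptions,
# not a ServiceNow API. Inputs would come from your own VA analytics export.
from dataclasses import dataclass

@dataclass
class TopicStats:
    name: str
    monthly_conversations: int  # traffic volume
    fallback_rate: float        # share of sessions hitting fallback, 0..1
    deterministic: bool         # simple flows that already work on NLU

def migration_score(t: TopicStats) -> float:
    """Higher score = stronger LLM-migration candidate."""
    if t.deterministic:
        return 0.0  # e.g. password resets, simple catalog orders stay on NLU
    # Favour high-volume topics where NLU keeps missing the intent.
    return t.monthly_conversations * t.fallback_rate

topics = [
    TopicStats("reset_password", 4200, 0.03, True),
    TopicStats("laptop_request", 1800, 0.22, False),
    TopicStats("benefits_questions", 950, 0.35, False),
]

for t in sorted(topics, key=migration_score, reverse=True):
    print(f"{t.name:>20}  score={migration_score(t):.1f}")

In practice you could also fold in phrasing variability or multi-language usage from the list above; the point is to let the numbers pick the first wave instead of moving all 500 at once.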
So when a customer pushes for all 500, my response would usually be something like this:
“We can migrate a large number, but let’s be clear on the goal. In past implementations, migrating a smaller, targeted set delivered most of the value without the risk and effort of a full estate move.”
@Suggy - Please mark as Accepted Solution and give a Thumbs Up if you found this helpful!
3 weeks ago
Hi there @Suggy
From hands-on migration experience, there is no inherent or technical requirement to convert existing, well-performing NLU-based Virtual Agent topics to LLM-based topics. NLU topics are deterministic, cost-effective, and easier to govern, which makes them ideal for high-volume, structured, and compliance-sensitive workflows. LLM-based topics add value mainly when there is a need to handle highly unstructured, free-form user input, reduce ongoing intent-training effort, or support more contextual and conversational interactions. Migrating all topics purely for modernization delivers limited ROI, introduces higher cost and governance complexity, and can reduce predictability. A hybrid approach, retaining NLU where it works and introducing LLM-based topics only where they add clear value, is usually the better path.
But as always, try to explain the tradeoffs; ultimately, though, the customer/client wins the argument, haha.
Kind Regards,
Mohamed Azarudeen Z
Developer @ KPMG
yesterday
Hi @Its_Azar
My understanding is that the LLM-based assistant will use only the LLM topics, and there is no fallback to NLU topics.
So any topics that did not get converted, or had issues during conversion, will not be surfaced in the VA. And since there is no fallback to NLU, there will be a gap, as we can't use both NLU and LLM topics at the same time.
Isn't this an issue?
