Overview
The Optimize GRC Issue Resolution use case works out of the box, but every organization has different data volumes, resolution patterns, and process expectations. This article covers every supported customization: from tuning which past issues are used as reference data, to adjusting how the action plan and remediation tasks are generated, to modifying agent behavior and messaging.
The two agents and their skills are accessed from:
- Use case: All → Now Assist AI Agents → AI Use Cases
- Agents: All → Now Assist AI Agents → AI Agents
- Skills: All → Now Assist Skill Kit → Skill
- Indexed Sources: All → Indexed Sources
Quick Reference
| What you want to change | How |
|---|---|
| Which past issues are used as reference data | Edit the Issue closed filter condition in the GRC issues indexed source |
| Similarity threshold for including past issues | Edit SCORE_THRESHOLD in the GetSimilarRecords tool of the Issue Action Plan Generation skill |
| Number of past issues retrieved | Edit MATCH_COUNT in the GetSimilarRecords tool of the Issue Action Plan Generation skill |
| Agent messaging and conversational behavior | Edit instructions on the Issue Action Plan AI Agent and Remediation Tasks AI Agent |
| Relevance of historical matches | Enrich the search query in the GetSimilarRecords tool of the Issue Action Plan Generation skill |
| Additional filters during RAG search on closed issues | Add dynamic filter conditions in the GetSimilarRecords tool of the Issue Action Plan Generation skill |
1. Adjusting Which Issues Are Used as Reference Data
The action plan generation only draws from issues that are closed. By default, this means issues in Closed Complete or Closed Incomplete state. You can change this filter to expand or narrow the pool of reference issues.
Navigate to
All → Indexed Sources → search for GRC issues → open → find the Issue closed filter condition → edit.
Current filter
state = 3 (Closed Complete) OR state = 4 (Closed Incomplete)
What you can safely change
- Add additional closed states if your organization uses custom issue states
- Add further conditions to narrow the pool — for example, restrict to issues from a specific category or assigned group
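For example, assuming a hypothetical custom closed state with the value 7, the expanded encoded query on the indexed source filter would look like:

```
state=3^ORstate=4^ORstate=7
```

The value 7 is illustrative only; check the actual choice-list values configured on your sn_grc_issue state field before editing the filter.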
2. Adjusting the Similarity Threshold
The similarity threshold controls how closely a past issue must match the current one before it is included in the comparison set. The default is 80%. A past issue that scores below this threshold is excluded regardless of how many results are available.
Navigate to
All → Now Assist Skill Kit → Skill → search for Issue Action Plan Generation → open → Tools → open GetSimilarRecords tool → edit the script.
Example snippet
```javascript
var SCORE_THRESHOLD = 0.80; // default — lower to include more results, raise for stricter matching
```
| Value | Effect |
|---|---|
| 0.70 | More past issues included — useful when data is sparse but may introduce loosely related patterns |
| 0.80 (default) | Balanced for most environments |
| 0.90 | Only very closely matched issues included — higher precision but may return nothing on new issue types |
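To see the threshold's effect in isolation, here is a minimal plain-JavaScript sketch. It is independent of the ServiceNow APIs; the issue numbers and similarity scores are hypothetical, and `applyThreshold` is an illustrative helper, not part of the actual tool script:

```javascript
// Illustrative similarity scores for retrieved past issues (hypothetical data).
var results = [
    { number: 'ISS0001', score: 0.93 },
    { number: 'ISS0002', score: 0.82 },
    { number: 'ISS0003', score: 0.74 },
    { number: 'ISS0004', score: 0.61 }
];

// Keep only results at or above the threshold, mirroring the
// SCORE_THRESHOLD comparison in the GetSimilarRecords tool script.
function applyThreshold(results, threshold) {
    return results.filter(function (r) { return r.score >= threshold; });
}

console.log(applyThreshold(results, 0.80).length); // 2 issues survive the default
console.log(applyThreshold(results, 0.70).length); // 3 issues with a looser threshold
```

Lowering the threshold from 0.80 to 0.70 admits one more, loosely related, issue; raising it to 0.90 would leave only the single closest match.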
3. Adjusting the Number of Past Issues Retrieved
By default, up to 20 past issues are retrieved and passed to the AI model for synthesis. All retrieved issues must also meet the similarity threshold — so the actual number sent to the model may be lower.
Navigate to
All → Now Assist Skill Kit → Skill → search for Issue Action Plan Generation → open → Tools → open GetSimilarRecords tool → edit the script.
Example snippet
```javascript
var MATCH_COUNT = 20; // default — increase for richer synthesis, decrease to reduce latency
```
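The two settings interact: the similarity threshold filters first, and MATCH_COUNT then caps how many surviving results reach the model. A minimal sketch of that interaction, using plain JavaScript with hypothetical data rather than the actual tool script:

```javascript
var SCORE_THRESHOLD = 0.80;
var MATCH_COUNT = 3; // lowered from the default of 20 so the cap is visible

// Hypothetical retrieval results, sorted by similarity score.
var retrieved = [
    { number: 'ISS0101', score: 0.95 },
    { number: 'ISS0102', score: 0.91 },
    { number: 'ISS0103', score: 0.88 },
    { number: 'ISS0104', score: 0.84 },
    { number: 'ISS0105', score: 0.62 }
];

// Threshold first, then cap: the model may receive fewer than MATCH_COUNT.
var forSynthesis = retrieved
    .filter(function (r) { return r.score >= SCORE_THRESHOLD; })
    .slice(0, MATCH_COUNT);

console.log(forSynthesis.length); // 3: four issues pass the threshold, the cap keeps three
```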
4. Editing Agent Instructions
Each agent has its own instruction set that controls its conversational behavior — the messages it displays, how it presents options to the user, and how it handles each user choice. These are separate from the skill prompts and can be updated independently.
Issue Action Plan AI Agent
Navigate to:
All → Now Assist AI Agents → AI Agents → search for Issue action plan AI agent → open → edit the Instructions field.
| Part | What you can change |
|---|---|
| Accept / Edit / Dismiss prompt | The wording of the options presented to the user after the action plan is shown |
| Acceptance message | What the agent says when the user accepts — including the note about the plan being saved to the issue record |
| Dismissal message | What the agent says when the user dismisses — currently tells the user to populate the plan manually |
| Edit refinement behavior | How the agent asks for and incorporates feedback when the user chooses to edit |
| Error messages | What the agent says when the issue record cannot be found or the plan cannot be generated |
Remediation Tasks AI Agent
Navigate to:
All → Now Assist AI Agents → AI Agents → search for Remediation tasks AI agent → open → edit the Instructions field.
| Part | What you can change |
|---|---|
| Accept / Edit / Dismiss prompt | The wording of the options presented after task suggestions are shown |
| Task display format | How suggested tasks are displayed when the user edits — currently uses a Name / Description format |
| Acceptance message | The confirmation message after tasks are created — currently tells the user to assign owners manually |
| Dismissal message | What the agent says when the user dismisses — currently tells the user to create tasks manually from the issue page |
5. Enriching the Search Query with Additional Issue Details
The search query is what drives semantic matching against historical issues. By default, the GetSimilarRecords tool searches using the current issue's short_description — and optionally includes description when the use_description input is enabled. Adding more context from the issue record — such as entity name, issue type, or custom fields — can improve the relevance of the historical matches returned.
Navigate to
All → Now Assist Skill Kit → Skill → search for Issue Action Plan Generation → open → Tools → open GetSimilarRecords tool → edit the script.
Current code — how the search query is built
```javascript
var shortDescription = '';
var issueId = '';
if (issueGR.next()) {
    shortDescription = issueGR.getValue("short_description");
    if (inputs.use_description === 'true') {
        shortDescription += "\n" + issueGR.getValue("description");
    }
    issueId = issueGR.getValue("sys_id");
}
var references = getReferences(issueId, shortDescription,
    ['short_description', 'description'], ['body']);
```
Modified code — enrich the query with entity and issue type
```javascript
var shortDescription = '';
var issueId = '';
if (issueGR.next()) {
    shortDescription = issueGR.getValue("short_description");
    if (inputs.use_description === 'true') {
        shortDescription += "\n" + issueGR.getValue("description");
    }
    // Enrich the search query with additional issue context
    var entityName = issueGR.getDisplayValue("profile");
    var issueTypeLabel = issueGR.getDisplayValue("issue_type");
    if (entityName) {
        shortDescription += "\nEntity: " + entityName;
    }
    if (issueTypeLabel) {
        shortDescription += "\nIssue Type: " + issueTypeLabel;
    }
    issueId = issueGR.getValue("sys_id");
}
var references = getReferences(issueId, shortDescription,
    ['short_description', 'description'], ['body']);
```
What you can safely change
- Add any field from the sn_grc_issue record to the query string — use getDisplayValue() for reference fields to include the display name rather than the sys_id
- Include custom fields your organization has added to the issue table
- Adjust or remove the \n separator — the semantic search treats the entire string as context, so the separator style has minimal impact
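As a standalone illustration of the resulting query string, the sketch below uses a plain object in place of the GlideRecord; the field values are hypothetical, and in the real script they come from issueGR.getValue() and issueGR.getDisplayValue():

```javascript
// Stand-in for the issue record (hypothetical values).
var issue = {
    short_description: 'Access review overdue for finance systems',
    description: 'Quarterly access review not completed by deadline.',
    entityDisplay: 'Finance Department',   // would come from getDisplayValue("profile")
    issueTypeDisplay: 'Control issue'      // would come from getDisplayValue("issue_type")
};

// Mirrors how the enriched query string is assembled in the tool script.
function buildQuery(issue, useDescription) {
    var query = issue.short_description;
    if (useDescription) {
        query += '\n' + issue.description;
    }
    if (issue.entityDisplay) {
        query += '\nEntity: ' + issue.entityDisplay;
    }
    if (issue.issueTypeDisplay) {
        query += '\nIssue Type: ' + issue.issueTypeDisplay;
    }
    return query;
}

console.log(buildQuery(issue, true));
```

With use_description enabled, the query becomes a four-line string: the short description, the description, and the two enrichment lines.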
6. Adding Dynamic Filters to RAG Search
By default, the RAG search returns all closed issues that match the search profile's static filter (state=3^ORstate=4). If your organization needs to narrow results further at runtime — for example, to only compare against issues of the same type or issues related to a specific entity — you can add dynamic filter conditions using the .condition() method on the RAGRetrievalSource.
Navigate to
All → Now Assist Skill Kit → Skill → search for Issue Action Plan Generation → open → Tools → open GetSimilarRecords tool → edit the script.
Current code — source construction without dynamic filters
```javascript
request.sources([
    new sn_search.RAGRetrievalSource()
        .sourceTableName('sn_grc_issue')
        .fields(FIELDS)
        .filter(true)
]);
```
Example — filter by the current issue's type
This restricts results to historical issues that share the same issue type as the current issue. You will need to pass the issue type value into the getReferences function.
Step 1: Extract the issue type and pass it to getReferences
```javascript
// After querying issueGR, extract the issue type
var issueType = issueGR.getValue("issue_type");

// Pass it as an additional parameter
var references = getReferences(issueId, shortDescription,
    ['short_description', 'description'], ['body'], issueType);
```
Step 2: Update the getReferences function signature and source construction
```javascript
function getReferences(issueId, shortDescription, fields, indexNames, issueType) {
    ...
    request.sources([
        new sn_search.RAGRetrievalSource()
            .sourceTableName('sn_grc_issue')
            .fields(FIELDS)
            .filter(true)
            .condition('issue_type=' + issueType)
    ]);
    ...
}
```
Example — filter by a fixed set of issue types
To restrict results to a specific group of issue types (for example, all control-related types), use an encoded query with ^OR conditions:
```javascript
request.sources([
    new sn_search.RAGRetrievalSource()
        .sourceTableName('sn_grc_issue')
        .fields(FIELDS)
        .filter(true)
        .condition('issue_type=7^ORissue_type=8^ORissue_type=9^ORissue_type=11')
]);
```
Example — filter by entity
To only match historical issues from the same entity as the current issue:
```javascript
var entitySysId = issueGR.getValue("profile");

request.sources([
    new sn_search.RAGRetrievalSource()
        .sourceTableName('sn_grc_issue')
        .fields(FIELDS)
        .filter(true)
        .condition('profile=' + entitySysId)
]);
```
Example — combine multiple filter conditions
Dynamic conditions follow GlideRecord encoded query syntax. You can combine conditions using ^ (AND) and ^OR (OR):
```javascript
var entitySysId = issueGR.getValue("profile");
var issueType = issueGR.getValue("issue_type");

request.sources([
    new sn_search.RAGRetrievalSource()
        .sourceTableName('sn_grc_issue')
        .fields(FIELDS)
        .filter(true)
        .condition('profile=' + entitySysId + '^issue_type=' + issueType)
        .validateWithSearchProfile(true)
]);
```
The .condition() method adds a runtime filter on top of the search profile's existing filter. The static filter (state=3^ORstate=4) still applies; the dynamic condition narrows results further rather than replacing the base filter. Adding .validateWithSearchProfile(true) ensures the source configuration is validated against the search profile.

What you can safely change
- Filter by any field on the sn_grc_issue table — common choices include issue_type, profile (entity), and assigned_to
- Use hardcoded values for fixed filters (e.g., always restrict to control-related issues) or dynamic values from the current issue record for contextual filters
- Combine multiple conditions using encoded query syntax (^ for AND, ^OR for OR)
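As dynamic conditions grow, a small helper keeps the encoded query assembly readable. The sketch below builds only the condition string; the sys_id and issue type values are hypothetical, and `andConditions` is an illustrative helper, not part of the shipped script:

```javascript
// Join individual encoded-query clauses with ^ (AND), skipping empty ones.
function andConditions(clauses) {
    return clauses.filter(Boolean).join('^');
}

// Hypothetical values read from the current issue record.
var entitySysId = '46d44a5dc0a8010e0000b9d4';
var issueType = '7';

var condition = andConditions([
    'profile=' + entitySysId,
    'issue_type=' + issueType
]);

console.log(condition); // profile=46d44a5dc0a8010e0000b9d4^issue_type=7
```

Skipping empty clauses also guards against a malformed query when an optional field on the current issue is blank.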
Tradeoffs at a Glance
| Customization | Benefit | Risk |
|---|---|---|
| Broaden issue index filter | More reference data available for plan generation | Issues without action plans are excluded at runtime — adding low-quality data doesn't help |
| Lower similarity threshold | More past issues included; helpful on sparse datasets | Loosely related patterns may dilute the generated plan |
| Raise similarity threshold | Only closely matched issues inform the plan | May return no results for new or uncommon issue types |
| Increase match count | Richer synthesis from more historical context | Larger prompt sent to LLM; slower response time |
| Edit agent instructions | Tailor messages and flow to your org's tone and process | LLM is sensitive to wording — test thoroughly before deploying |
| Enrich search query | More relevant historical matches when additional context is included | Overly long queries may dilute semantic matching precision |
| Add dynamic filters | Results restricted to a comparable subset of historical issues | Overly narrow filters may return too few matches on sparse datasets |
⚠ Important: Testing and Upgrade Compatibility
All customizations described in this article modify components that are part of the out-of-the-box use case delivery. Before deploying any change to production, test thoroughly in a sub-production instance using a representative set of real issue records — including issues with rich action plans, sparse action plans, and no historical matches.
When ServiceNow ships upgrades to this use case — including changes to skill prompts, agent instructions, datasource scripts, or indexed source configuration — those updates will reflect the out-of-the-box defaults. Any customizations you have made will need to be reviewed, reconciled, and re-applied against the upgraded version. This includes:
- Filter changes on the GRC issues indexed source
- Threshold and match count changes in the GetSimilarRecords tool script
- Search query enrichments in the GetSimilarRecords tool script
- Dynamic filter conditions in the GetSimilarRecords tool script
- Instruction changes in either agent
Maintain clear documentation of every customization your organization applies, including the original out-of-the-box value and the reason for the change, to make upgrade reconciliation straightforward.
