By Sunil Bemarkar, Sr. Partner Solutions Architect – AWS,
Ashley Snyder, Principal Outbound Product Manager – ServiceNow
Overview
ServiceNow customers can now power Now Assist with Amazon Bedrock in two distinct ways:
- Customer-managed Amazon Bedrock: Connect any large language or multimodal model hosted in your AWS account via the Generative AI Controller and the Amazon Bedrock Spoke. Supported for custom skills and select out-of-the-box Now Assist skills.
- ServiceNow-managed Anthropic Claude via Amazon Bedrock: Toggle an option in Now Assist to use Anthropic Claude hosted in ServiceNow’s own AWS tenancy, with Amazon Bedrock usage costs covered by ServiceNow through assist consumption. Supported for most out-of-the-box Now Assist skills.
Both paths are secure, compliant, and serverless by design, but they optimize for different organizational priorities around flexibility, governance, and time-to-value. This post explains the architecture, compares the two models, and outlines steps to get started.
The challenge
Enterprise service teams want to infuse generative AI into incident resolution, knowledge management, employee experiences, and other workflows without creating governance gaps or new budget lines. They also vary widely in:
- Cloud maturity: Some already run hundreds of workloads in AWS and have fine-tuned models; others prefer a fully managed SaaS experience.
- Data-residency mandates: Regulated industries often require prompts to stay inside a designated AWS account or region.
- Cost controls: Finance teams need predictable billing models that map to existing ServiceNow SKUs or AWS consumption budgets.
Solution overview
| Decision area | Customer-managed Amazon Bedrock | ServiceNow-managed Anthropic Claude on AWS |
|---|---|---|
| Model catalog | Any model exposed by Amazon Bedrock (Anthropic Claude, Amazon Titan, Meta Llama 3, Mistral, Amazon Nova, etc.) | Anthropic Claude 3.7 Sonnet |
| Compute tenancy | The customer’s AWS account | ServiceNow’s managed AWS tenancy |
| Who pays for Amazon Bedrock | Customer (pay-as-you-go or provisioned throughput on their AWS bill) | ServiceNow (included with Now Assist entitlement) |
| Setup effort | Install the Amazon Bedrock Spoke, create an IAM role, store credentials via Connections and Credentials in ServiceNow, connect via the Generative AI Controller, and choose a model when developing custom skills in Now Assist Skill Kit | One-click enablement in the Now Assist Admin console via skill groups or skills; no AWS credentials required |
| Data residency | Prompts and responses remain in the customer’s AWS account | Governed by ServiceNow platform controls and data center regions |
| Fine-tuning | Supported for eligible Amazon Bedrock models | Not available; fully managed by ServiceNow for performance |
| Best fit | Organizations needing model diversity, regional control, compliance, or existing Amazon Bedrock spend | Teams seeking the fastest adoption, zero AWS setup, and predictable costs |
How it works
Customer-managed Amazon Bedrock (Bring-Your-Own-Model)
Architecture flow
- Admin installs the Amazon Bedrock Spoke alongside the Generative AI Controller on the ServiceNow instance.
- An AWS IAM role with the bedrock:InvokeModel permission is created and exposed via a secure credential alias.
- When an end-user invokes a supported out-of-the-box Now Assist or custom skill, ServiceNow calls Amazon Bedrock directly in the customer’s AWS account.
- Amazon Bedrock returns the model inference, which Now Assist post-processes via the Generative AI Controller before displaying it to the user (a minimal sketch of this call follows Figure 1).
Figure 1 - Customer-managed Amazon Bedrock integration
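To make the flow concrete, here is a minimal sketch (Python with boto3) of the kind of InvokeModel call the Generative AI Controller issues against your account once the Spoke’s IAM role and credential alias are in place. The region, model ID, and prompt are illustrative assumptions, not ServiceNow’s internal implementation.

```python
# Hedged sketch: an Amazon Bedrock InvokeModel call of the kind made on your
# behalf in the customer-managed path. Region, model ID, and prompt are
# illustrative only.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # any Bedrock model enabled in your account
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [
            {"role": "user", "content": "Summarize this incident for the resolution notes."}
        ],
    }),
)

# The Anthropic Messages response body contains a list of content blocks.
print(json.loads(response["body"].read())["content"][0]["text"])
```

Because the call executes under your own IAM role, the IAM, logging, and cost controls you already apply to other Amazon Bedrock workloads apply to Now Assist traffic as well.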
Why choose this path?
- Maximum model choice and the option to fine-tune privately hosted LLMs.
- Easy alignment with existing AWS Organizations, Service Control Policies, and Cost Explorer guardrails (a spend-tracking sketch follows this list).
- Ideal for regulated workloads where data must never leave an enterprise-owned AWS account.
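Because Amazon Bedrock charges stay on your own AWS bill in this model, Now Assist-driven spend can be tracked with tooling you already use. A minimal Cost Explorer sketch, assuming the standard "Amazon Bedrock" service dimension and an illustrative date range:

```python
# Hedged sketch: month-to-date Amazon Bedrock spend via Cost Explorer, so
# generative AI usage driven by Now Assist shows up in existing AWS budgets.
import boto3

ce = boto3.client("ce")

result = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-08-01", "End": "2025-08-31"},  # illustrative window
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    Filter={"Dimensions": {"Key": "SERVICE", "Values": ["Amazon Bedrock"]}},
)

for day in result["ResultsByTime"]:
    print(day["TimePeriod"]["Start"], day["Total"]["UnblendedCost"]["Amount"])
```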
ServiceNow-managed Anthropic Claude on AWS via Amazon Bedrock
Architecture flow
- ServiceNow provisions a managed Amazon Bedrock alias for Anthropic Claude in its AWS environment.
- ServiceNow stores a tenant-isolated API key in ServiceNow’s secret manager; customers never manage credentials.
- Now Assist routes prompts to the managed Anthropic Claude on AWS endpoint for inference.
- All Amazon Bedrock charges are handled by ServiceNow, and customers pay through Now Assist assist consumption.
Figure 2 - ServiceNow-managed Anthropic Claude on AWS via Amazon Bedrock integration
Why choose this path?
- Minutes to activate; no AWS account, IAM, or network configuration.
- Predictable costs are included with Now Assist licensing, based on assist usage.
- Built-in responsible AI guardrails and performance protections, managed entirely by ServiceNow.
Customer spotlight – “Regional Bank”
A large regional bank runs over 800 microservices in AWS and must keep regulated data within its EU region. For its customer-facing chatbots, it selected customer-managed Amazon Bedrock and fine-tuned an Anthropic Claude 3.5 Haiku model on anonymized call transcripts. Internally, the bank’s HR team wanted rapid generative knowledge articles with no AWS onboarding, so they enabled ServiceNow-managed Anthropic Claude on AWS for Now Assist. The dual approach gave the bank data sovereignty where needed and instant productivity elsewhere, with a single AI skill kit across ServiceNow workflows.
Getting started
- Evaluate requirements
- Identify use-case sensitivity, regional constraints, and budget ownership.
- Pilot the right option
- Customer-managed Amazon Bedrock: Follow the Amazon Bedrock Spoke setup guide
- ServiceNow-managed Anthropic Claude on AWS: Upgrade to Yokohama Patch 6, update Now Assist applications, and configure model provider policies in AI Control Tower and the Now Assist Admin console.
- Define guardrails
- Customer-managed Amazon Bedrock: Configure Amazon Bedrock guardrail templates (see the sketch after this list) or Now Assist Guardian guardrails to log or block prompt injection and offensive content. Configure Data Privacy in Now Assist to mask sensitive data during GenAI inference.
- ServiceNow-managed Anthropic Claude on AWS: Configure Now Assist Guardian guardrails to log or block prompt injection and offensive content. Configure Data Privacy in Now Assist to mask sensitive data during GenAI inference.
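For the customer-managed path, the sketch below shows one way to create an Amazon Bedrock guardrail with prompt-attack and content filters that can be attached to the models Now Assist invokes in your account. The guardrail name, filter set, and strengths are illustrative assumptions; Now Assist Guardian is configured separately inside ServiceNow.

```python
# Hedged sketch: creating an Amazon Bedrock guardrail that blocks prompt
# injection and offensive content. Name, filters, and strengths are illustrative.
import boto3

bedrock = boto3.client("bedrock")

guardrail = bedrock.create_guardrail(
    name="now-assist-guardrail",
    description="Blocks prompt injection and offensive content for Now Assist skills",
    contentPolicyConfig={
        "filtersConfig": [
            # PROMPT_ATTACK filters apply to input only, so outputStrength is NONE.
            {"type": "PROMPT_ATTACK", "inputStrength": "HIGH", "outputStrength": "NONE"},
            {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
            {"type": "INSULTS", "inputStrength": "MEDIUM", "outputStrength": "MEDIUM"},
        ]
    },
    blockedInputMessaging="This request was blocked by policy.",
    blockedOutputsMessaging="The response was blocked by policy.",
)

print(guardrail["guardrailId"], guardrail["version"])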
- Measure impact
- Customer-managed Amazon Bedrock: View metrics in AI Control Tower and the Now Assist Admin console via Now Assist Analytics, together with Amazon CloudWatch metrics (see the sketch after this list), to baseline time-to-resolution (TTR) and knowledge creation velocity.
- ServiceNow-managed Anthropic Claude on AWS: View metrics in AI Control Tower and the Now Assist Admin console via Now Assist Analytics.
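For the customer-managed path, Amazon Bedrock publishes invocation metrics to the AWS/Bedrock CloudWatch namespace, which you can pull alongside Now Assist Analytics when baselining TTR. A minimal sketch with an illustrative model ID and a seven-day window:

```python
# Hedged sketch: daily Amazon Bedrock invocation counts from CloudWatch
# (AWS/Bedrock namespace). Model ID and time window are illustrative.
from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/Bedrock",
    MetricName="Invocations",
    Dimensions=[{"Name": "ModelId", "Value": "anthropic.claude-3-5-sonnet-20240620-v1:0"}],
    StartTime=datetime.now(timezone.utc) - timedelta(days=7),
    EndTime=datetime.now(timezone.utc),
    Period=86400,  # one datapoint per day
    Statistics=["Sum"],
)

for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"].date(), int(point["Sum"]))
```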
- Scale organization-wide
- Customer-managed Amazon Bedrock: Roll out supported out-of-the-box Now Assist skills or custom skills via UI Actions, the Now Assist Panel, Flow Designer, and Virtual Agent Designer; track adoption in AI Control Tower and the Now Assist Admin console via Now Assist Analytics.
- ServiceNow-managed Anthropic Claude on AWS: Configure the model provider for out-of-the-box skills and set skill availability and display options in the Now Assist Admin console; track adoption in AI Control Tower and the Now Assist Admin console via Now Assist Analytics.
Conclusion
Whether you need full control over every token or zero-touch generative AI at SaaS speed, Now Assist with Amazon Bedrock offers a deployment model that meets you where you are on your AI journey. ServiceNow on AWS is available as a SaaS offering in AWS Marketplace.
Next steps
- Explore the Generative AI Controller documentation to connect your first Amazon Bedrock model.
- Contact your AWS Partner Solutions Architect or ServiceNow representative to enable ServiceNow-managed Anthropic Claude on AWS or view the product documentation to self-implement.
- Dive into the Now Assist Skill Kit to embed custom generative AI across ITSM, HR, and customer workflows.
Start building today and let your teams focus on innovation, not integration.