07-31-2025 07:40 AM - edited 08-04-2025 11:52 AM
This article covers FAQs for our new model provider flexibility offering in the Yokohama Patch 6 release.
For more information on our latest offerings, see this article: Introducing Model Provider Flexibility: Optimize Every AI Workflow
For deeper dives on data handling, security, and responsible AI, see this article.
FAQs
Q1: Why is a "one-size-fits-all" AI strategy no longer sufficient for enterprises?
A1: Enterprise AI has outpaced one-size-fits-all strategies because a single model cannot satisfy the diverse and unique demands of every business workflow. Different departments have distinct goals, workflows, and risk profiles, requiring tailored AI solutions.
Q2: What is "Model provider flexibility" and how does it benefit my business?
A2: Model provider flexibility allows you to align the right AI with the unique demands of every workflow within your business. This delivers smarter, safer outcomes tailored to specific business goals by optimizing AI performance, accelerating innovation, and ensuring deployment with confidence and control through built-in governance.
Q3: Which external AI models can I integrate with ServiceNow through this flexibility?
A3: You can integrate with trusted providers such as OpenAI (GPT-4.1-mini, GPT-4.1 via Azure OpenAI), Google (Gemini 2.5 Flash and Pro), and Anthropic (Claude 3.7 Sonnet via Amazon Bedrock).
Q4: How does ServiceNow handle the management of these integrated models? Do I need to manage APIs or contracts?
A4: ServiceNow manages the hosting, scaling, and integrations of these models, meaning you do not need to manage contracts, infrastructure, or APIs. This significantly simplifies your AI adoption and reduces overhead.
Q5: How does ServiceNow ensure the secure and compliant use of AI models?
A5: ServiceNow provides built-in governance that is native to the platform. AI Control Tower allows the definition of enterprise-wide AI policies for model provider availability, data routing, and fallback behavior. Data privacy and processing controls can be configured in the Now Assist Admin console, and Now Assist Guardian enforces guardrails for harmful and toxic content.
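To make the fallback idea concrete, here is a minimal sketch of policy-driven model routing, assuming a policy defines a primary model and an ordered fallback list. All names are hypothetical illustrations of the concept, not ServiceNow APIs.

```python
# Providers currently enabled by the governance policy (hypothetical names).
AVAILABLE = {"gpt-4.1", "gemini-2.5-flash"}

# A routing policy: preferred model plus ordered fallbacks.
POLICY = {
    "primary": "claude-3.7-sonnet",
    "fallback": ["gpt-4.1", "gemini-2.5-flash"],
}

def route_request(policy, available):
    """Return the first policy-approved model that is currently available."""
    for model in [policy["primary"], *policy["fallback"]]:
        if model in available:
            return model
    raise RuntimeError("No policy-approved model is available")

print(route_request(POLICY, AVAILABLE))  # falls back past the unavailable primary
```

The point of the sketch is that fallback behavior is declared once in policy and enforced centrally, rather than hard-coded into each skill.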
Q6: What is the pricing model for using different AI providers within ServiceNow?
A6: ServiceNow offers a simple and predictable licensing model where consumption is measured via "assists". You pay the same rate per assist regardless of which integrated model you choose, giving you the freedom to innovate without unpredictable costs.
Q7: How does ServiceNow ensure AI models speak my users' language, regardless of the model provider selected?
A7: ServiceNow combines the native language capabilities of each provider's models with Now Assist Dynamic Translation. If a selected model does not natively support a specific language, the platform's translation service steps in automatically in real time, ensuring a consistent, high-quality global experience for your employees and customers.
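The translation-fallback flow described above can be sketched as follows. This is a conceptual illustration only; the function names, language sets, and placeholder translation format are hypothetical, not ServiceNow APIs.

```python
# Languages each model supports natively (illustrative data).
NATIVE_LANGUAGES = {"model-a": {"en", "es", "fr"}}

def translate(text, source, target):
    # Stand-in for a dynamic-translation service; tags the text so the
    # flow is visible in the output.
    return f"[{source}->{target}] {text}"

def run_model(model, prompt):
    # Stand-in for an actual model call.
    return f"reply to: {prompt}"

def answer(model, prompt, language):
    """Use the model directly when it supports the language natively;
    otherwise translate the prompt in and the reply back out."""
    if language in NATIVE_LANGUAGES.get(model, set()):
        return run_model(model, prompt)
    english_prompt = translate(prompt, language, "en")
    english_reply = run_model(model, english_prompt)
    return translate(english_reply, "en", language)
```

For a natively supported language the request goes straight through; otherwise the translation service brackets the model call transparently.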
Q8: Can you explain the underlying architecture for AI model integration and data handling?
A8: ServiceNow's AI platform has strategically expanded its AI portfolio, with all AI capabilities built in and natively re-platformed onto ServiceNow, utilizing the same architecture and data model as the entire platform. This deep platform integration optimizes for ServiceNow AI Platform workflows and data structures. For a detailed understanding of architecture and data flows, please consult our Support articles available on Now Support.
Q9: What are the main advantages of using ServiceNow's platform-native Now LLM Service?
A9: The Now LLM Service provides platform-native models that are purpose-built for your workflows with integrated governance and scale. Key strengths include stable, consistent performance, full integration with ServiceNow workflows, and compliance with data residency, security, and regulatory standards. They are ideal for accelerating routine tasks and streamlining complex workflows.
Q10: What are the key advantages of leveraging leading integrated model providers like Azure OpenAI, Google Gemini, and AWS Anthropic Claude through the ServiceNow AI Platform?
A10: By integrating these leading model providers directly into the ServiceNow AI Platform, you gain several key advantages:
- Access to best-in-class AI: You can tap into cutting-edge AI models for advanced reasoning, multimodal inputs, and complex task execution.
- Seamless integration and managed overhead: ServiceNow manages all the complexity, including contracts, licensing, infrastructure, API keys, and model versioning, so you don't have to. This allows you to focus purely on driving business outcomes.
- Centralized governance and control: All integrated models are governed through AI Control Tower, just like Now LLM, allowing you to set policies, monitor usage, and manage fallback logic from a single place.
- Flexible and scalable AI: The platform supports enterprise growth across teams, needs, and outcomes, enabling experimentation and innovation without vendor lock-in.
- Accelerated innovation: Simplified adoption and managed updates enable rapid experimentation and deployment of new AI skills without complex rewrites.
Q11: How does ServiceNow help me choose the right AI model for a specific business need?
A11: ServiceNow guides model selection by asking three key questions:
- What model strengths does the skill or Agentic workflow require (e.g., advanced reasoning, multi-turn Q&A, application generation)?
- What input type or context will it need (e.g., long documents, images, tools, code inputs)?
- Does the use case require tone sensitivity or regulated response (e.g., HR, legal, compliance-sensitive skills)?
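The three questions above can be encoded as a simple checklist. The sketch below is purely illustrative: the requirement tags and category names are hypothetical examples, not a ServiceNow feature or its actual selection logic.

```python
def suggest_model_category(needs):
    """Map a set of requirement tags (drawn from the three questions)
    to a broad model category (illustrative categories only)."""
    # Question 3: regulated or tone-sensitive use cases come first.
    if needs & {"regulated", "tone-sensitive"}:
        return "platform-native model with strict guardrails"
    # Questions 1-2: demanding strengths or rich input types.
    if needs & {"advanced-reasoning", "long-documents", "multimodal"}:
        return "frontier integrated model"
    # Default: routine tasks.
    return "platform-native model for routine tasks"
```

Working through the questions in this order (compliance first, then capability, then default) mirrors the guidance in the answer above.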
Q12: How does ServiceNow help simplify the adoption of AI models, particularly for new integrations?
A12: ServiceNow simplifies AI adoption through fully managed, seamless integration via the ServiceNow AI Platform, eliminating infrastructure or API overhead. All configurations are done directly within the Now Assist Admin console, and governance is centralized through AI Control Tower.
Q13: What is the benefit of ServiceNow managing the full model lifecycle, including updates and deprecation?
A13: ServiceNow managing the full model lifecycle means that your teams are relieved of the burden of monitoring performance, managing version updates, re-evaluating prompts, and handling deprecation. This allows you to take advantage of new models without worrying about breaking your workflows or the associated operational overhead.
Q14: How does ServiceNow help ensure responsible AI use with its model provider flexibility?
A14: ServiceNow's platform is built on responsible AI principles and is aligned to global compliance standards. It offers robust governance and security controls throughout the lifecycle, with centralized control via AI Control Tower and data privacy features like Data Privacy for Now Assist and guardrails from Now Assist Guardian. This ensures that AI decisions are trusted, flexible, and governed by design.
Q15: How does the ServiceNow AI Platform ensure the security and privacy of my sensitive data when interacting with AI models, especially third-party providers?
A15: Data security and privacy are paramount. The ServiceNow AI Platform is designed with enterprise-grade security controls from the ground up. For integrated third-party models, your data remains within our secure platform and is not used to train external models. AI Control Tower allows you to set granular data routing policies, enforcing compliance with data residency requirements and safeguarding sensitive information through features like Data Privacy for Now Assist, which includes data pattern anonymization. This means your data is protected and governed by your defined policies, giving you peace of mind.
See our Support articles and community article on Data Privacy for Now Assist for more information.
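As a minimal sketch of what data-pattern anonymization means in practice, the snippet below masks sensitive values before text leaves a trusted boundary. The patterns and placeholder format are illustrative assumptions, not ServiceNow's actual anonymization rules.

```python
import re

# Illustrative sensitive-data patterns mapped to typed placeholders.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def anonymize(text):
    """Replace each matched sensitive pattern with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

print(anonymize("Contact jo@example.com, SSN 123-45-6789"))
```

Typed placeholders (rather than blanket redaction) preserve enough structure for the model to produce a useful answer while keeping the raw values out of the prompt.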
Q16: Given the rapid pace of innovation in AI, how does the ServiceNow AI Platform help me stay agile as AI technology evolves rapidly?
A16: Our model provider flexibility is designed to support your agility and adaptation. By offering a hybrid approach that integrates both our platform-native Now LLMs and leading external model providers, the platform helps you access evolving AI technology. ServiceNow continuously manages updates, versioning, and compatibility testing for these integrated models. This means your AI solutions can evolve as the technology does, significantly minimizing the need to re-engineer your core workflows or incur substantial refactoring efforts. You can confidently embrace new AI capabilities as they emerge, leveraging a platform built to support continuous innovation.
Q17: How does ServiceNow’s new integrated model provider approach differ from configuring AI spoke options in the Generative AI Controller?
A17: Configuring AI spokes does allow customers to select models beyond ServiceNow's integrated model providers, and you can configure BYO LLM for select out-of-the-box skills and custom skills. However, the integrated model provider approach significantly simplifies and enhances the experience for customers utilizing leading integrated models like Azure OpenAI, Google Gemini, and Anthropic Claude on AWS. While spokes are the underlying connections used by the Generative AI Controller to communicate with external generative AI services, our integrated approach goes a step further by abstracting away nearly all the management overhead for these specific providers.
With the integrated approach, ServiceNow handles all integrated provider contracts, licensing, infrastructure provisioning, and API key management on your behalf. You don't need to manage authentication, negotiate with vendors, or worry about model version updates and compatibility testing. This contrasts with the broader concept of spoke options, which, especially for custom spokes, can still require your teams to manage external infrastructure, API keys, and other credentials directly. With integrated model provider flexibility, you get the power of choice without the operational drag.
Q18: How can I upgrade my instance to use the new integrated model providers?
A18: Upgrade your instance to Yokohama Patch 6, and update your Now Assist apps such as Now Assist for ITSM, CSM, HRSD, etc.
Ensure the Now Assist Admin Console and Generative AI Controller apps are up to date as well if they were not automatically updated with your Now Assist apps.
Hey @Ashley Snyder ,
Could you please check? It seems this link redirects to the edit article page instead of the published version of the article.
Thanks @Vivek Verma the link has been updated.