on 01-25-2022 04:01 AM
This article will help you enable and configure the Google custom search integration so that your public ServiceNow knowledge articles become discoverable by external search engines.
1. Request Google custom search integration plugin
Currently, this plugin is only available by request.
There are two ways to request the plugin:
- Access the Now Support Service Catalog directly by clicking Service Catalog > Activate Plugin on Now Support.
- Access the Now Support Service Catalog through the All Applications page on your instance.
For more information, visit Request Google custom search integration | ServiceNow Docs
2. Create a Dynamic Sitemap
A sitemap is a file that lists the pages of a site so that search engines can discover and crawl them; in this case, it will contain the links to your public KB articles. To create a dynamic sitemap we will use a ServiceNow Processor, which provides a customizable URL endpoint that can execute server-side JavaScript and produce output such as text, JSON, XML, or HTML.
Please note: creating custom processors is deprecated, but existing processor functionality still works and can be leveraged here. ServiceNow may provide a different approach in the future; Scripted REST APIs are the recommended replacement.
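If you would rather avoid the deprecated mechanism, the same XML can be served from a Scripted REST API resource instead. The snippet below is only a minimal sketch, not the approach used in the rest of this article: it assumes you have created a Scripted REST service with a GET resource (the path /api/<your_scope>/sitemap is just an example) and unchecked Requires authentication on the resource so it is publicly reachable. The sitemap-building logic itself would be the same as in the processor script further down.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    // Build the sitemap XML here, exactly as in the processor script below;
    // only a single static URL is included to keep this sketch short.
    var instanceName = gs.getProperty('instance_name');
    var xmlString = '<?xml version="1.0" encoding="UTF-8"?>' +
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' +
        '<url><loc>https://' + instanceName + '.service-now.com/sp</loc></url>' +
        '</urlset>';

    // Return the XML with the proper status and content type
    response.setStatus(200);
    response.setContentType('application/xml');
    response.getStreamWriter().writeString(xmlString);
})(request, response);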
To create the processor:
Navigate to System Definition > Processors
Click New
You will most likely need to modify the following ACL to enable Admin overrides:
https://<yourinstance>.service-now.com/sys_security_acl.do?sys_id=8bc9bc150a0a0b4401a2b8324296ab12
Give the processor a Name, set the Type to script, and set the Path to sitemap - the path determines the /sitemap.do URL used later
Ensure that the public role is selected in the Roles field
Enter the following in the Script field, modifying as needed:
(function process(g_request, g_response, g_processor) {

    // Get the instance name for dynamic generation of URLs
    var instanceName = gs.getProperty("instance_name");

    // Enter static/manual URLs here, along with the sitemap header.
    // In this example, we're indexing the default Service Portal
    // home, knowledge home and community home pages.
    var xmlString = '<?xml version="1.0" encoding="UTF-8"?>';
    xmlString = xmlString + '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' +
        '<url>' +
        '  <loc>https://' + instanceName + '.service-now.com/sp</loc>' +
        '</url>' +
        '<url>' +
        '  <loc>https://' + instanceName + '.service-now.com/sp?id=kb_home</loc>' +
        '</url>' +
        '<url>' +
        '  <loc>https://' + instanceName + '.service-now.com/sp?id=community_home</loc>' +
        '</url>';

    // -- Dynamic URLs --
    // Note: "&" must be escaped as "&amp;" inside the XML, or the sitemap will not parse.

    // Knowledge Articles - published KB articles
    // (add further filters here if only some of your knowledge bases are public)
    var grKK = new GlideRecord('kb_knowledge');
    grKK.addEncodedQuery("workflow_state=published");
    grKK.query();
    while (grKK.next()) {
        xmlString = xmlString +
            '<url>' +
            '  <loc>https://' + instanceName + '.service-now.com/sp?id=kb_article&amp;sys_id=' + grKK.getValue('sys_id') + '</loc>' +
            '  <lastmod>' + grKK.getValue('sys_updated_on').split(' ')[0] + '</lastmod>' +
            '</url>';
    }

    // Community Blogs
    var grSCB = new GlideRecord('sn_communities_blog');
    grSCB.query();
    while (grSCB.next()) {
        xmlString = xmlString +
            '<url>' +
            '  <loc>https://' + instanceName + '.service-now.com/sp?id=community_blog&amp;sys_id=' + grSCB.getValue('sys_id') + '</loc>' +
            '  <lastmod>' + grSCB.getValue('sys_updated_on').split(' ')[0] + '</lastmod>' +
            '</url>';
    }

    // Community Videos
    var grSCVAC = new GlideRecord('sn_communities_video_as_content');
    grSCVAC.query();
    while (grSCVAC.next()) {
        xmlString = xmlString +
            '<url>' +
            '  <loc>https://' + instanceName + '.service-now.com/sp?id=community_video&amp;sys_id=' + grSCVAC.getValue('sys_id') + '</loc>' +
            '  <lastmod>' + grSCVAC.getValue('sys_updated_on').split(' ')[0] + '</lastmod>' +
            '</url>';
    }

    // Community Questions
    var grKSQQ = new GlideRecord('kb_social_qa_question');
    grKSQQ.query();
    while (grKSQQ.next()) {
        xmlString = xmlString +
            '<url>' +
            '  <loc>https://' + instanceName + '.service-now.com/sp?id=community_question&amp;sys_id=' + grKSQQ.getValue('sys_id') + '</loc>' +
            '  <lastmod>' + grKSQQ.getValue('sys_updated_on').split(' ')[0] + '</lastmod>' +
            '</url>';
    }
    // -- End Dynamic URLs --

    // Close the XML document root tag
    xmlString = xmlString + '</urlset>';

    // Convert the XML string into an XMLDocument
    var xmldoc = new XMLDocument(xmlString);

    // Write the XML document to the output in text/xml format
    g_processor.writeOutput("text/xml", xmldoc);

})(g_request, g_response, g_processor);
Click Submit
Now create a Public Page record so that the sitemap endpoint can be reached without logging in
Type sys_public.list in the application navigator and press Enter
Click New
Set the Page value to sitemap (the Path of the processor) and click Submit
For more information visit https://www.kevin-custer.com/blog/create-a-dynamic-sitemap-in-service-now
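At this point you can sanity-check the endpoint by opening https://<yourinstance>.service-now.com/sitemap.do in a private/incognito browser window; it should return the generated XML without prompting for login. If you prefer to check from a script, the snippet below is a rough sketch you could run as a background script, assuming outbound REST calls from your instance to itself are allowed (replace <yourinstance> with your actual instance name):
// Fetch the public sitemap endpoint and log the result
var request = new sn_ws.RESTMessageV2();
request.setEndpoint('https://<yourinstance>.service-now.com/sitemap.do');
request.setHttpMethod('get');

var response = request.execute();
gs.info('Status: ' + response.getStatusCode());                        // expect 200
gs.info('Body starts with: ' + response.getBody().substring(0, 100));  // expect the <?xml ... header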
3. Create/Update Robots.txt file
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site.
To create one:
Navigate to Custom Search Integration > Robots.txt Definitions
Click New
Add the following to the Text field and modify as needed:
User-agent: *
Allow: /sp
Allow: /sitemap.do
Sitemap: https://<yourInstance>.service-now.com/sitemap.do
This allows all user agents (crawlers) to access everything under /sp as well as /sitemap.do, and tells them where to find your sitemap.
For more information visit https://developers.google.com/search/docs/advanced/robots/create-robots-txt
4. Add SEO information to Service Portal pages
Adding SEO information helps your ServiceNow portal pages rank higher in search results.
For more information visit https://docs.servicenow.com/bundle/quebec-servicenow-platform/page/build/service-portal/concept/seo-...
5. Use search engines' search console tools to confirm and monitor that web crawlers can find and crawl your site
To use Google Search Console, visit https://search.google.com/search-console/about and sign up
After logging in:
5.1 Add property and verify your site ownership
You can use the HTML tag method to verify your site ownership
In order to do that, first navigate to Service Portal > Pages
Open your homepage/index page and ensure that the Public checkbox is checked
Under Related Lists, click on the Meta Tags section
Create a Meta Tag with the Name google-site-verification and the Content set to the verification code that Search Console gives you (a quick way to double-check the Public flag is sketched below)
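Since Google's crawler visits the page anonymously, it is worth confirming that the page you tagged really is public before requesting verification. The following is a minimal sketch you could run as a background script, assuming the stock Service Portal homepage with page ID index - adjust the ID if your portal uses a different landing page:
// Confirm that the portal homepage is flagged as Public
var page = new GlideRecord('sp_page');
if (page.get('id', 'index')) {
    gs.info('Page "' + page.getValue('title') + '" public = ' + page.getValue('public'));
} else {
    gs.info('No sp_page record found with id = index');
}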
5.2 Add a new sitemap
Use your public sitemap.do link, the same one you referenced in your Robots.txt file
After successful submission, you'll need to wait until Google indexes your articles and they start appearing in Google search results. In my case, it took a couple of days for articles to appear on the 2nd/3rd page of results; the timeframe may vary in your case.
For more information visit https://developers.google.com/search/docs/advanced/crawling/overview
