08-28-2019 04:00 AM
Hello everyone,
I am trying to locate my robots.txt so I can edit it to allow search engine bots to index my page.
When I open https://myinstance.service-now.com/robots.txt I see a page containing:
User-agent: *
Disallow: /
All I am trying to do is change the Disallow directive so that search engine bots are permitted to index my page, but I couldn't locate where the file is defined. In the docs I found information for the Geneva release saying you can go to Custom Search Integration > Robots.txt, but in the Madrid release there is no such module. Any help would be appreciated, thank you very much.
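For reference, here is how crawlers interpret those two directives, sketched with Python's standard-library robots.txt parser (this is generic robots-exclusion behavior, not ServiceNow-specific; the instance URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The instance's current robots.txt blocks every path for all crawlers.
blocking = RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])
print(blocking.can_fetch("Googlebot", "https://myinstance.service-now.com/"))  # False

# An empty Disallow value permits all paths instead.
allowing = RobotFileParser()
allowing.parse(["User-agent: *", "Disallow:"])
print(allowing.can_fetch("Googlebot", "https://myinstance.service-now.com/"))  # True
```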
Labels: Marketing Service Management

08-28-2019 04:23 AM
Hi Yagiz,
- Go to Custom Search Integration > Robots.txt Definitions.
- Click New.
- Enter the contents of the robots.txt file in the Text field.
If you want this file to be active, select the Active field. There can only be one robots.txt file active at any time. Setting a file to active automatically sets all other files to inactive. Web crawlers will recognize the contents of the active robots.txt file and honor the robots exclusion protocol.
- Click Submit.
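For example, the Text field could contain a minimal robots.txt that permits all crawlers to index everything (a sketch; tighten the rules to match your own indexing policy):

```
User-agent: *
Disallow:
```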
Also refer to the link below,
If my reply helps you at all, I’d really appreciate it if you click the Helpful button and if my reply is the answer you were looking for, it would be awesome if you could click both the Helpful and Accepted Solution buttons! 🙂
Regards,
Pratiksha

09-04-2019 05:01 AM
Hi Pratiksha,
Thank you for your answer, but functionally this doesn't work on my instance.
I successfully activated Custom Search Integration from the HI Portal, but defining a new robots.txt for my instance doesn't seem to take effect.
I navigate to Custom Search Integration > Robots.txt Definitions and add a new definition there, but whenever I open instancename.service-now.com/robots.txt the output is still:
User-agent: *
Disallow: /

10-14-2022 05:48 AM
How did you activate the Custom Search Integration from HI? Was it a plugin?