Where is the Robots.txt?

Yagiz
Kilo Contributor

Hello everyone,

I am trying to locate my robots.txt file so I can edit it to allow search engine bots to index my page.

When I type https://myinstance.service-now.com/robots.txt, I see a page with the following content:

User-agent: *

Disallow: /


 

All I am trying to do is change the Disallow directive to allow crawling, which would give search engine bots permission to index my page, but I couldn't locate the file. In the docs I found information for the Geneva version saying you can go to Custom Search Integration > Robots.txt, but in the Madrid version there is no such module. Any kind of help would be appreciated, thank you very much.

1 ACCEPTED SOLUTION

Pratiksha Kalam
Kilo Sage

Hi Yagiz,

  1. Go to Custom Search Integration > Robots.txt Definitions.
  2. Click New.
  3. Enter the contents of the robots.txt file in the Text field. If you want this file to be active, select the Active field. Only one robots.txt file can be active at a time; setting a file to active automatically sets all other files to inactive. Web crawlers read the contents of the active robots.txt file and honor the robots exclusion protocol.
  4. Click Submit.
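For reference, a minimal robots.txt that permits all crawlers could be entered in the Text field; under the robots exclusion protocol, an empty Disallow value means nothing is blocked. The exact contents depend on what you want indexed:

```
User-agent: *
Disallow:
```

Many crawlers (including Google's) also accept an explicit `Allow: /` directive, but the empty Disallow form is the most widely supported way to allow everything.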

Also refer to the link below:

https://community.servicenow.com/community?id=community_question&sys_id=07940fa9dbd8dbc01dcaf3231f96...

 

If my reply helps you at all, I’d really appreciate it if you click the Helpful button and if my reply is the answer you were looking for, it would be awesome if you could click both the Helpful and Accepted Solution buttons! 🙂


Regards,
Pratiksha



Yagiz
Kilo Contributor

Hi Pratiksha,

 

First of all, thank you for your answer. However, when it comes to functionality, it doesn't work on my instance.

 

I have successfully managed to activate Custom Search Integration from the HI Portal, but defining a new robots.txt for my instance doesn't seem to work.

 

What I am doing is navigating to Custom Search Integration > Robots.txt Definitions and adding a new definition there, but whenever I type instancename.service-now.com/robots.txt the output is still:

User-agent: *
Disallow: /
So I am in need of more advanced help. Thank you!
 
 

How did you activate the Custom Search Integration from HI? Was it a plugin?