robots.txt / Search Indexing

Telling Search Engines What to Index and Omit

MODX Cloud allows you to manage robots.txt serving per hostname. If you are experiencing unexpected indexing results, make sure custom robots.txt serving is enabled for the problem URL.

MODX Cloud will always serve robots.txt instructions that exclude sites from being indexed and spidered on internal MODX Cloud URLs (those beginning with "c" followed by four digits, such as c9999.paas7.tor.modxcloud.com). This cannot be disabled, and there is no physical robots.txt file on the filesystem that can be removed. Even if you upload a custom robots.txt file, MODX Cloud will still serve deny-all instructions to robots and search spiders at these URLs.
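
The exact file MODX Cloud serves may differ, but the effect is that of the standard deny-all robots.txt:

    User-agent: *    # applies to all crawlers
    Disallow: /      # exclude the entire site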

For your own internal site URLs, like sitename.your-account.modxcloud.com, MODX Cloud again defaults to denying robot indexing. You can, however, enable search engine indexing and serve a custom robots.txt file if you upload one to your site root.

When you add a custom URL to a website in MODX Cloud, MODX Cloud defaults to enabling search spiders and custom robots.txt files. You can elect to disable this if your situation warrants it.

Learn more about robots.txt files and their use at robotstxt.org.

Enable Serving a Custom robots.txt

To turn on custom robots.txt serving for one or more hostnames, which allows your site to be indexed by search engines, toggle the option from the Web Server tab of the Cloud Overview.

Hostnames and robots.txt in MODX Cloud

A MODX Cloud project can answer at multiple hostnames, and by default the MODX Cloud Dashboard prevents them from being indexed and spidered. When you add a custom domain name to a project, you can choose which URLs should have the custom robots.txt file served.

If you need to edit this later, do so from the "Web Server" tab of the Cloud Overview page.


Add a Custom robots.txt File with MODX

  1. Log in to the Manager of your project.

  2. From the Manager Tree Menu, use the Files tab to show the Filesystem Media Source.

  3. Right-click on the "Filesystem" header and choose "Create File".

  4. Create a file named "robots.txt" with whichever rules you would like; a sample is shown after this list.
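
For example, a minimal robots.txt that allows all crawlers but keeps them out of the MODX Manager might look like the following. The /manager/ path is the MODX default, and the Sitemap URL is a placeholder; adjust both to match your site:

    User-agent: *
    Disallow: /manager/    # default MODX Manager path; change if yours differs

    Sitemap: https://www.example.com/sitemap.xml    # placeholder, use your own URL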

You can also connect to your project over SSH or SFTP and upload the file from there.
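
For instance, assuming hypothetical SSH credentials of cXXXX@cXXXX.paas7.tor.modxcloud.com (substitute the values shown in your Dashboard) and the default www web root, an scp upload might look like:

    # replace the user and host with your Cloud's actual SSH details
    scp robots.txt cXXXX@cXXXX.paas7.tor.modxcloud.com:www/robots.txt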
