Robots.txt on Pressidium

We help maintain your SEO ranking during development.

Written by Admin account
Updated over a week ago

Google enforces an SEO penalty on duplicated content and URLs. Your WordPress site's staging and production environments are both hosted on our platform's domain as well as under your mapped domain, so the duplicated content could potentially create SEO problems.

To address this, we restrict all web crawlers from indexing anything that originates from * (this includes your staging environment). For good measure, we also inject an X-Robots-Tag HTTP header into every response.
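In practice, crawler blocking of this kind typically combines a restrictive robots.txt with a response header. The fragment below is a sketch of the general technique, not necessarily Pressidium's exact configuration:

```
# robots.txt served on non-mapped (e.g. staging) domains:
# tell all crawlers not to fetch anything
User-agent: *
Disallow: /
```

The injected header, `X-Robots-Tag: noindex, nofollow`, additionally instructs search engines not to index pages they have already fetched and not to follow links from them, which covers crawlers that reached a URL without consulting robots.txt.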

Only the WordPress site that is mapped to your own domain will get indexed, so you can freely break things on staging without any fear of impacting your SEO score.

Open a support request if you have any questions, or a particular use case you would like us to discuss.
