Effective SEO for Test Websites

Enterprises routinely create publicly accessible sub-domains for internal purposes, mostly testing. For example, a company at site.com might create sub-domains like beta.site.com, test.site.com, demo.site.com, or qa.site.com. SEOs should incorporate a few SEO checks before the company launches these test or beta sites.

It is always recommended that you propose a few SEO requirements for beta websites so that these sites don't affect your main site's SEO. The following are some of the ways they can harm your main website's SEO:

  • Duplicate content
  • PageRank sharing
  • Excessive crawl activity
  • Incorrect indexation

Here are some tips for controlling web crawlers' access to these internal websites so they don't cause SEO issues:

Implement the correct blocking rules in the robots.txt file

I recommend you add the following rules to the sub-domain's robots.txt file:

User-agent: *
Disallow: /
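
This blocks all compliant crawlers from the entire sub-domain. Note that robots.txt is fetched per host, so the file must live at the sub-domain's own root (for example, beta.site.com/robots.txt); rules in the main domain's robots.txt do not apply to the sub-domain.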

Implement canonical tags

Make sure the pages under the sub-domain contain a canonical tag pointing to the primary domain. For example, the canonical URL for the page www.beta.site.com/test-url/ is www.site.com/test-url/. This way, if pages inside the beta site are crawled, search engines will index the correct URL.
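
As a minimal sketch, the <head> of the beta page would contain a link element like the following (the https scheme is an assumption here; use whatever scheme your main site actually serves):

<link rel="canonical" href="https://www.site.com/test-url/" />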

Add noindex, nofollow to internal pages

If the internal website has pages that are not present on the main website, add the "robots" meta tag to block search engines, so these pages don't get indexed. The tag is <meta name="robots" content="noindex, nofollow">, placed before the </head> tag.
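
For example, a beta-only page would carry the tag in its <head> like this (the title and body content are illustrative):

<html>
<head>
  <title>Internal test page</title>
  <!-- Tells crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  Internal content not present on the main site.
</body>
</html>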

Register the sub-domain in Google Webmaster Tools (GWT)

If you register the sub-domain in GWT, you can see whether any crawl activity is happening on the site.
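
One way to register is GWT's HTML-tag verification method, which asks you to place a meta tag on the sub-domain's home page. The content value below is a placeholder for the token GWT generates for your site:

<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />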
