How to Prevent or Enable Indexing

Site indexing allows a website to be visible in search engine results. A site can also be de-indexed (removed from search results) with or without the site owner's knowledge.

Block a Page from Indexing

To block a page from being indexed, add a meta tag with the noindex directive to the <head> section of the page, like so:
<meta name="robots" content="noindex">
Alternatively, the directive can be sent as an HTTP response header:
HTTP/1.1 200 OK
X-Robots-Tag: noindex
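As an example, if your site runs on Apache with mod_headers enabled, you could send this header for non-HTML files (where a meta tag isn't possible) with a rule like the following sketch; the PDF file pattern is illustrative:

```apache
# Send a noindex directive for every PDF served by the site
# (requires mod_headers; adjust the pattern to match your files)
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
```

This is typically placed in the site's configuration or an .htaccess file, depending on how the server is set up.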

Enable a Webpage to be Indexed

Fixing a page that is blocked from indexing can be as simple as removing the noindex directive from the page's meta tag or HTTP header (referenced above).
You may also need to edit the website's sitemap so that the page is listed.
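A sitemap is an XML file listing the URLs you want search engines to crawl. A minimal entry looks like this (the domain and path are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/page-to-index/</loc>
  </url>
</urlset>
```

The sitemap is usually placed at the site root (e.g., /sitemap.xml) and can be submitted to search engines through their webmaster tools.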
Note: A noindex directive will not be effective if the page is blocked by a robots.txt file. If crawling of the page is disallowed in robots.txt, the crawler never fetches the page and never sees the noindex directive, so the page can still appear in search results if other pages link to it.
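For instance, a robots.txt rule like the following (the path is illustrative) would stop crawlers from fetching the page at all, so any noindex directive on it would go unseen:

```txt
# Blocks all crawlers from fetching anything under /private-page/
User-agent: *
Disallow: /private-page/
```

To let the crawler see the noindex directive, remove the Disallow rule covering that page.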

Even an indexable page can still be de-indexed by Google or other search engines if it is found to have low-quality content.

Still need help? Contact Us