How to Prevent or Enable Indexing
Site indexing allows a website to appear in search engine results. A site can also be de-indexed (removed from search results) with or without the site owner's knowledge.
Block a Page from Indexing
1. To block a page from being indexed, add a meta tag with the noindex directive to the page's <head> section, like so:
<meta name="robots" content="noindex">
2. The noindex directive can also be sent as an X-Robots-Tag HTTP response header (a server configuration sketch follows the example):
HTTP/1.1 200 OK
(…)
X-Robots-Tag: noindex
(…)
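How this header gets added depends on the web server or application, which the article does not specify. As a hedged sketch, on an Apache server with mod_headers enabled, a rule like the following in an .htaccess file would add the header; the PDF-only scope is purely an illustration:

# Example .htaccess rule (Apache, mod_headers): add noindex to every PDF response
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

Nginx and most application frameworks provide equivalent ways to set response headers.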
Enable a Webpage to be Indexed
1. Fixing a page that is blocked from indexing can be as simple as removing the noindex directive from the page's meta tag or HTTP header (referenced above).
2. You may also need to edit the website's sitemap; a minimal example follows.
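The sitemap should list the pages you want crawled. Below is a minimal sitemap in the standard sitemap protocol format; the URL is a placeholder, not one taken from the article:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder URL: replace with the page you want indexed -->
  <url>
    <loc>https://example.com/page-to-index/</loc>
  </url>
</urlset>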
Note: A noindex directive will not be effective if the page is blocked by a robots.txt file. If the page is disallowed in robots.txt, the crawler never fetches it and so never sees the noindex directive, and the page can still appear in search results by way of a link from another page.
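For example, a robots.txt rule like the following (the path is hypothetical) prevents crawlers from fetching the page at all, so an on-page noindex directive is never read:

User-agent: *
# Hypothetical path used for illustration only
Disallow: /private-page/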
A webpage can still be de-indexed by Google or other search engines if it is found to have low-quality content.