In search engine optimization, the typical goal is to get as many pages on your website as possible crawled and indexed by search engines like Google. The common misconception is that doing so results in better SEO rankings. However, that may not always be the case. Oftentimes, it is necessary to deliberately prevent search engines from indexing certain pages on your website in order to boost SEO. One study found that organic search traffic increased by 22% after removing duplicate web pages, while Moz reported a 13.7% increase in organic search traffic after removing low-value pages.

Web Pages That Don't Need to Be Indexed

As mentioned, not all pages on your website need to be indexed by search engines. Typically, these include, but are not limited to, the following:

- Duplicate pages (e.g., similar content posted across multiple websites owned by one company).
- Low-value pages (e.g., outdated content from years back that is still valuable enough not to be deleted from your website).

Prior to de-indexing, it's important to conduct a thorough content audit of your website so you have a systematic approach to determining which pages to include and which to exclude.

How to Prevent Google from Indexing Certain Web Pages

There are four ways to de-index web pages from search engines: a "noindex" meta tag, an X-Robots-Tag HTTP header, a robots.txt file, and Google Webmaster Tools (now Google Search Console). The most effective and easiest of these is the "noindex" meta tag.
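To use it, place the tag in the <head> of any page you want removed from search results. The next time Google crawls the page, it sees the directive and drops the page from its index. A minimal example:

    <meta name="robots" content="noindex">

You can also target a single crawler by replacing "robots" with its user agent name, such as "googlebot".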
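The X-Robots-Tag achieves the same effect through an HTTP response header instead of page markup, which makes it the right choice for non-HTML resources such as PDFs and images that have no <head> section. As a sketch, assuming an Apache server with mod_headers enabled, the following configuration sends the header for every PDF file:

    <Files ~ "\.pdf$">
      Header set X-Robots-Tag "noindex"
    </Files>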
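A robots.txt file works differently: it asks compliant crawlers not to fetch the listed URLs rather than de-indexing them, so a page Google already knows about (for example, through external links) can still appear in results. Use it to control crawling, and prefer "noindex" when the goal is removal. A sketch, using a hypothetical /drafts/ directory standing in for the section you want to keep crawlers out of:

    User-agent: *
    Disallow: /drafts/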