When you start building a new website, you usually want to prevent Google from indexing it. Once Google has discovered a URL, it will keep trying to crawl that URL for years. So if you later change your URL structure or relocate some articles, you are asking for trouble. It is therefore best to “forbid” search engines from indexing your site until you are satisfied with its content and structure.

Disable indexing: Change WordPress Search Engine Visibility setting

Log in to your administrator panel (/wp-admin) and go to Settings > Reading. Here you will find a setting called “Search Engine Visibility” with the checkbox “Discourage search engines from indexing this site”.


Enabling this setting adds the following meta tag to every page:

<meta name="robots" content="noindex,follow"/>

This tells Google and other search engines not to index your website, while the “follow” part still allows them to follow the links on it.
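If you want to confirm the setting took effect, you can fetch a page's HTML and look for the tag. Below is a minimal sketch using Python's standard `html.parser`; the `RobotsMetaParser` class and `is_noindex` helper are just illustrative names, not part of WordPress:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on the page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

def is_noindex(html):
    """Return True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d.lower() for d in parser.directives)

page = '<html><head><meta name="robots" content="noindex,follow"/></head></html>'
print(is_noindex(page))  # True
```

In practice you would feed `is_noindex` the HTML you downloaded from your own site (for example with `urllib.request`).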

Disable indexing: Add a robots.txt file

If you really want to be certain, you can go a step further and block crawling entirely with a robots.txt file. The robots.txt file tells search engines not to visit (crawl) your website at all. Let’s get started…

First create an empty .txt file (in Windows you can do this from the right-click menu in any folder). Then add the following two lines and save the file as robots.txt:

User-agent: *

Disallow: /

Now upload this file to the root directory of your website (usually public_html). You can do this with your favorite FTP software. If you are unsure how to do this, you can check out this tutorial.
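Before (or after) uploading, you can sanity-check that the rules really block everything using Python's built-in `urllib.robotparser`. This sketch parses the same two lines locally; `example.com` is just a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the robots.txt file above.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /" no path may be crawled by any user agent.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # False
print(parser.can_fetch("*", "https://example.com/"))                  # False
```

`can_fetch` returning False for every path confirms the file disallows all crawling for all user agents.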


By adding the noindex meta tag and a robots.txt file to your website, you have ensured that it will not be indexed by Google. Of course, remember to remove the robots.txt file and disable the setting once your site is finished.

Questions? Feel free to ask them in the comments.