Are You Killing Your Own WordPress SEO?

On one of my WordPress sites, I was perplexed as to why it wasn't being indexed by Google or any other search engine.

I verified my site in Google Webmaster Tools and tried to submit a sitemap. The webmaster tools panel informed me that my robots.txt file was blocking Google from fetching my sitemap.

This confused me, as I knew I would never knowingly create a robots.txt file that blocks search engines.

Read on to find out what the problem was, how to fix it, and how it happened in the first place…

The Problem

I checked out my robots.txt file and saw the following:

User-agent: *
Disallow: /

As if disallowing all bots wasn't enough, I saw that every page of my site contained the following meta tag.

<meta name='robots' content='noindex,nofollow' />

Okay, so my SEO was being killed on two different levels: even if a crawler somehow got past the robots.txt, the meta tag told it not to index the page or follow any of its links.
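You can confirm this kind of blocking yourself before ever opening Webmaster Tools. Here's a small sketch using Python's standard-library robots.txt parser; the rules and the Googlebot user agent are just the example from above, not anything WordPress-specific:

```python
# Check whether a set of robots.txt rules blocks a given crawler from a path.
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt: str, user_agent: str, path: str) -> bool:
    """Return True if the robots.txt rules forbid user_agent from fetching path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(user_agent, path)

# The rules I found on my site: disallow everything, for every bot.
broken_rules = "User-agent: *\nDisallow: /"
print(is_blocked(broken_rules, "Googlebot", "/sitemap.xml"))  # True: the sitemap is blocked
```

An empty `Disallow:` line, by contrast, blocks nothing, which is why the same check returns False once the file is fixed.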

The Fix

How did I remedy this crucial mistake? It’s a simple fix.

[Screenshot: WordPress privacy settings]

I navigated to my privacy settings in the WordPress administration panel and selected "I would like my blog to be visible to everyone, including search engines (like Google, Sphere, Technorati) and archivers."
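Once that setting is switched, WordPress stops emitting the blocking directives. The robots.txt it serves should then look roughly like this (the exact output can vary between WordPress versions):

User-agent: *
Disallow:

Note the empty Disallow line: it permits everything. The noindex,nofollow meta tag also disappears from your pages, since WordPress only outputs it when the blog is set to be hidden from search engines.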

Prevention

When installing WordPress, make sure you don’t deselect the checkbox that reads: Allow my blog to appear in search engines like Google and Technorati.

Conclusion

You should always verify your site(s) in Google Webmaster Tools to see if there are any problems with indexing. I hope you found this quick WordPress tip informative, because although this is a simple mistake, it’s something to be aware of when SEO’ing WordPress sites.
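Alongside Webmaster Tools, it's easy to spot-check a page's HTML for the noindex tag directly. This sketch uses Python's standard-library HTML parser; the sample page string is just an illustration of the tag shown earlier:

```python
# Quick sanity check: does a page's HTML carry a robots meta tag with "noindex"?
from html.parser import HTMLParser

class RobotsMetaScanner(HTMLParser):
    """Collects the content attribute of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """Return True if any robots meta tag in the page contains 'noindex'."""
    scanner = RobotsMetaScanner()
    scanner.feed(html)
    return any("noindex" in d for d in scanner.directives)

page = "<html><head><meta name='robots' content='noindex,nofollow' /></head></html>"
print(has_noindex(page))  # True: this page tells crawlers not to index it
```

In practice you would fetch your live page's HTML (for example with urllib) and pass it to has_noindex; if it returns True after you've fixed the privacy setting, something is still injecting the tag.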


About the editorial team

Editorial Team at IsItWp is a team of WordPress experts led by Syed Balkhi.