My site went from #1 in Google rankings for several keyword searches (a position it has held since I built it in November) to nonexistent in Google searches. I am getting an error message that Googlebot can't access my site. This has been going on since March 18th. Is anyone else having these problems? Two things changed around that time: my host installed a firewall, and our third-party online reservation system was upgraded. Could one of these changes be the culprit?
site is http://www.sanjuanski.com
The error message I am receiving is:
Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.
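A 100% robots.txt error rate usually means Googlebot cannot fetch the file at all (a firewall blocking its requests would do exactly that), rather than the file disallowing anything. It's still worth ruling out the rules themselves. As a hedged sketch, Python's standard-library `robotparser` can check what a given robots.txt would allow; the file contents below are hypothetical, not the actual rules at sanjuanski.com:

```python
from urllib import robotparser

# Hypothetical robots.txt contents for illustration only -- the real file
# at http://www.sanjuanski.com/robots.txt may differ.
ROBOTS_TXT = """\
User-agent: *
Disallow: /reservations/admin/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under "User-agent: *" here, so ordinary pages are
# allowed while the disallowed path is not.
print(parser.can_fetch("Googlebot", "http://www.sanjuanski.com/"))
print(parser.can_fetch("Googlebot", "http://www.sanjuanski.com/reservations/admin/page"))
```

If the rules check out, the next step is confirming the server actually returns the file (e.g. fetching http://www.sanjuanski.com/robots.txt in a browser and from another network) and asking the host whether the new firewall blocks Googlebot's user agent or IP ranges.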
I think search engines will see two different sites for you. A couple of options: use a WordPress.org install to drive your whole site, or map your blog here as a subdomain of your real HTML site.
Either way, there is more content for search engines to see under the same domain name.
The topic ‘Googlebot can not longer access site’ is closed to new replies.