My site went from #1 in Google for several keyword searches (a ranking it has held since I built it in November) to nonexistent in Google search results. I am also getting an error message saying Googlebot can't access my site. This has been going on since March 18th. Is anyone else having these problems? Two things changed around that time: my host installed a firewall, and our third-party online reservation system was upgraded. Could one of these changes be the culprit?
site is http://www.sanjuanski.com
The error message I am receiving is:
Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
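In case it helps anyone diagnosing the same error: the first thing worth checking is what HTTP status the server returns for /robots.txt (e.g. `curl -I http://www.sanjuanski.com/robots.txt`). A 100% robots.txt error rate usually means the server (or a firewall in front of it) is answering with an error or timing out rather than serving the file. Here is a rough sketch of how Googlebot reacts to each status, per Google's documentation; the function name and return strings are my own:

```python
def robots_fetch_outcome(status_code: int) -> str:
    """Rough summary of how Googlebot reacts to the HTTP status
    returned for /robots.txt (labels are illustrative, not Google's)."""
    if 200 <= status_code < 300:
        return "parsed"           # robots.txt fetched and its rules obeyed
    if status_code in (404, 410):
        return "crawl allowed"    # treated as if no robots.txt exists
    if 500 <= status_code < 600:
        return "crawl postponed"  # the "100.0% error rate" case: Google
                                  # stops crawling until the file is reachable
    return "check manually"       # redirects, 403s, timeouts, etc.

print(robots_fetch_outcome(503))  # a firewall blocking Googlebot often looks like a 5xx
```

If the firewall is blocking Googlebot's requests (or returning 5xx), that alone would explain both the error message and the drop out of the index, since a postponed crawl means nothing gets recrawled.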
Any advice would be greatly appreciated.
The blog I need help with is sanjuanski.wordpress.com.