
Googlebot Site Crawling Issue (robots.txt)

  1. Happy Holidays to all!

    After a year of good crawling with no apparent issues,
    I decided I needed to get away from my blog for a while,
    so I turned it all off as best I could without deleting it.

    I set my public visibility to private and blocked search
    engines from indexing my site -- amongst other things.

    I've spent the last day attempting to get it all back up
    online, with public visibility and bot crawling.

    But Google says: "Googlebot is blocked from" and
    the current robots.txt file shows nothing but Disallow
    rules, about a dozen line items that disallow
    everything (see the sketch at the end of this post).

    How can I fix this? Thanks.

    The blog I need help with is
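
    For reference, a fully blocking robots.txt has roughly this
    shape (a sketch of the typical pattern, not the poster's actual
    file; hosted blogging platforms generally regenerate this file
    automatically from the site's privacy setting):

        # Applies to every crawler, including Googlebot
        User-agent: *
        # "/" matches every path, so the whole site is off limits
        Disallow: /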

  2. Hey, this is what Google says:

    Googlebot is blocked from

    Blocked by line 3: Disallow: /
    Detected as a directory; specific files may have different restrictions
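
    For comparison, a minimal robots.txt that permits full crawling
    looks like this (standard robots.txt semantics, not any
    platform's exact output; where the file is auto-generated,
    switching the site back to public visibility should produce
    the equivalent):

        # Applies to every crawler
        User-agent: *
        # Empty Disallow value: nothing is disallowed
        Disallow: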

  3. Thanks.

    Y'all do this to me every friggin' time.



Topic Closed

This topic has been closed to new replies.
