
Googlebot Site Crawling Issue - (robots.txt)

  1. Happy Holidays to all!

    After a year of good crawling with no apparent issues,
    I decided I needed to get away from my blog for a while
    and turned it all off as best I could without deleting it
    entirely.

    I set my public visibility to private and blocked search
    engines from indexing my site -- amongst other things.

    I've spent the last day attempting to get it all back up
    online, with public visibility and bot crawling.

    But Google says: "Googlebot is blocked from http://thedirtylowdown.wordpress.com/", and
    the current robots.txt file shows nothing but "Disallow"
    rules all the way down the page, about a dozen line items:
    disallow everything.

    How can I fix this? Thanks.

    The blog I need help with is thedirtylowdown.wordpress.com.

  2. Hey, this is what Google says:
    _______________________________

    Googlebot is blocked from http://thedirtylowdown.wordpress.com/

    Blocked by line 3: Disallow: /
    Detected as a directory; specific files may have different restrictions
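
    For reference, a robots.txt whose line 3 reads "Disallow: /" is a blanket
    block: the wildcard user-agent rule denies every crawler access to every
    path. This is a sketch of what such a file typically looks like, and of the
    permissive form it should take once the privacy setting is public again;
    the exact file WordPress.com serves for this blog may differ.

    ```
    # Blocking form: the first matching rule denies all paths to all bots.
    User-agent: *
    Disallow: /

    # Permissive form (what the file should look like once the blog is
    # public again): an empty Disallow value allows everything.
    # User-agent: *
    # Disallow:
    ```

    On WordPress.com, robots.txt is generated automatically from the blog's
    privacy setting, so the fix is to change the setting rather than edit the
    file directly; Google then picks up the new file on its next crawl.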

  3. Thanks.

    Y'all do this to me every friggin' time.

    Every...

    Time.

Topic Closed

This topic has been closed to new replies.
