Googlebot Site Crawling Issue – (robots.txt)

  • #1099236

    Happy Holidays to all!

    After a year of good crawling with no apparent issues,
    I decided I needed to get away from my blog for a while
    and turned it all off as best I could without deleting it
    entirely.

    I set my public visibility to private and blocked search
    engines from indexing my site — amongst other things.

    I’ve spent the last day attempting to get it all back up
    online, with public visibility and bot crawling.

    But Google says: “Googlebot is blocked from http://thedirtylowdown.wordpress.com/”, and
    the current robots.txt file is nothing but “Disallow”
    rules all the way down the page, roughly a dozen lines,
    disallowing everything.

    How can I fix this? Thanks.

    The blog I need help with is thedirtylowdown.wordpress.com.

    #1099344

    Hey, this is what Google says:
    _______________________________

    Googlebot is blocked from http://thedirtylowdown.wordpress.com/

    Blocked by line 3: Disallow: /
    Detected as a directory; specific files may have different restrictions
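
    For reference, a file that blocks Googlebot like this generally looks something like the sketch below. This is an assumption about what WordPress.com auto-generates when a site is set to private; the exact lines and their order may differ from what the site actually serves:

    ```
    # Sketch: robots.txt served while the site is private.
    # "Disallow: /" tells every crawler to skip the whole site.
    User-agent: *
    Disallow: /

    # Sketch: what the file should look like once the site is
    # public again, an empty Disallow permits full crawling.
    User-agent: *
    Disallow:
    ```

    On WordPress.com the robots.txt is generated automatically from the site's privacy setting, so the fix is to change the setting (Settings → Reading → Site Visibility) rather than edit the file directly, and then allow time for Google to re-fetch it.
    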

    #1099482

    Thanks.

    Y’all do this to me every friggin’ time.

    Every…

    Time.

The topic ‘Googlebot Site Crawling Issue – (robots.txt)’ is closed to new replies.