Googlebot Site Crawling Issue – (robots.txt)

  • #1099236

    Happy Holidays to all!

    After a year of good crawling with no apparent issues,
    I decided I needed to get away from my blog for a while,
    so I turned it all off as best I could without deleting it.

    I set my public visibility to private and blocked search
    engines from indexing my site — amongst other things.

    I’ve spent the last day attempting to get it all back up
    online, with public visibility and bot crawling.

    But Google reports “Googlebot is blocked from”, and
    the current robots.txt file shows nothing but
    “Disallow” rules all the way down the page:
    about a dozen lines, disallowing everything.

    How can I fix this? Thanks.
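    For reference, a minimal crawl-friendly robots.txt looks something like the sketch below (an assumption about the eventual fix, not the poster's actual file): an empty Disallow under the wildcard user-agent permits all crawlers, whereas the reported "Disallow: /" blocks them entirely.

```
# Crawl-friendly: empty Disallow means "block nothing".
User-agent: *
Disallow:

# What the broken file reportedly contains instead:
# User-agent: *
# Disallow: /     <- blocks every crawler from every path
```

    On hosted platforms such as WordPress.com, robots.txt is generated from the site's privacy/visibility setting, so the usual fix is restoring public visibility rather than editing the file by hand.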

    The blog I need help with is


    Hey, this is what Google says:

    Googlebot is blocked from

    Blocked by line 3: Disallow: /
    Detected as a directory; specific files may have different restrictions
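    The message above is consistent with a blanket "Disallow: /" rule. A minimal sketch using Python's standard urllib.robotparser (with a hypothetical example.com URL standing in for the blog) shows how that rule blocks Googlebot, while an empty Disallow allows it:

```python
import urllib.robotparser

# A robots.txt like the one Google is reporting: "Disallow: /"
# under "User-agent: *" blocks every crawler from every path.
blocking = urllib.robotparser.RobotFileParser()
blocking.parse([
    "User-agent: *",
    "Disallow: /",
])
print(blocking.can_fetch("Googlebot", "https://example.com/"))  # False

# A crawl-friendly robots.txt: an empty Disallow allows everything.
allowing = urllib.robotparser.RobotFileParser()
allowing.parse([
    "User-agent: *",
    "Disallow:",
])
print(allowing.can_fetch("Googlebot", "https://example.com/"))  # True
```

    The same parser can be pointed at a live robots.txt with set_url() and read() to verify the fix once the site is public again.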



    Y’all do this to me every friggin’ time.



The topic ‘Googlebot Site Crawling Issue – (robots.txt)’ is closed to new replies.