404 Error when Googlebot indexing


I was wondering why http://wiki.ironchariots.org rarely or never appears in Google searches. I enabled the Google webmaster console for the wiki site, and it shows that Googlebot gets many 404 errors. Occasionally pages are retrieved correctly, but the majority seem to fail even though they definitely exist. I also tried fetching pages with http://web-sniffer.net/, which works with the default user agent but fails with a 404 if the user agent is set to Googlebot. It appears as if Googlebot's access to the wiki is being blocked by GoDaddy.
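
For anyone who wants to reproduce this without web-sniffer, here is a rough Python sketch of the check I'm describing: it fetches the wiki front page twice, once with a browser-style user agent and once with Google's published Googlebot string, and prints the HTTP status code for each. The browser user-agent string is just an example; only the difference in status codes matters.

# Fetch the same URL with two different User-Agent headers and compare
# the HTTP status codes. A 404 for the Googlebot request alongside a 200
# for the browser request reproduces what web-sniffer.net shows.
import urllib.request
import urllib.error

URL = "http://wiki.ironchariots.org/"
USER_AGENTS = {
    "browser":   "Mozilla/5.0 (X11; Linux x86_64)",   # example browser UA
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",  # Google's published crawler UA
}

for name, ua in USER_AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(request, timeout=15) as response:
            print(f"{name}: HTTP {response.getcode()}")
    except urllib.error.HTTPError as error:
        # urlopen raises HTTPError for 4xx/5xx responses; the code is what we want.
        print(f"{name}: HTTP {error.code}")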

I first suspected the request rate limits set by GoDaddy were the problem, so I reduced the crawl rate to the minimum Google allows. However, this has not improved the situation.
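
One way to tell a request-rate limit apart from a plain user-agent block: a rate limit should only kick in after a burst of requests, whereas the 404s here show up on the very first Googlebot-style request. Below is a rough sketch of that test; the burst size and pause are arbitrary choices on my part, not anything GoDaddy documents.

# Send a small burst of requests with a Googlebot-style User-Agent and
# record the status codes. If the first request is already a 404, a
# request-rate limit is unlikely to be the cause; if early requests
# succeed and later ones start failing, throttling is more plausible.
import time
import urllib.request
import urllib.error

URL = "http://wiki.ironchariots.org/"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

codes = []
for _ in range(10):        # arbitrary burst size
    request = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(request, timeout=15) as response:
            codes.append(response.getcode())
    except urllib.error.HTTPError as error:
        codes.append(error.code)
    time.sleep(0.5)        # arbitrary pause between requests

print("Status codes in order:", codes)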

This problem is strikingly similar to https://uk.godaddy.com/community/Managing-Web-Hosting/Temporory-Unreachable-in-google-fetch/m-p/3185...

Thoughts, anyone?


Re: 404 Error when Googlebot indexing

@TimSC,

There are arguments for and against what you say.

But I had no problem finding your site, though that's 5 minutes I'll never get back. 🙂


Re: 404 Error when Googlebot indexing

I can only see http://www.ironchariots.org/ in search results, not http://wiki.ironchariots.org/. Also, there are several hundred articles on the site, yet Google reports only 15 pages in its index. There does seem to be a problem. If people are serious about looking into it, please at least try http://web-sniffer.net/ with different user agents.
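
To give a sense of how widespread it is, here is a rough sketch that checks several pages with the Googlebot user agent and counts the 404s. The article paths below are placeholders (Main_Page is just MediaWiki's default front page); substitute real page URLs from the wiki.

# Check several wiki pages with a Googlebot User-Agent and count how
# many return 404. The paths are placeholders; swap in real page URLs.
import urllib.request
import urllib.error

BASE = "http://wiki.ironchariots.org"
PATHS = ["/wiki/Main_Page", "/wiki/Example_article_1", "/wiki/Example_article_2"]  # placeholders
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

not_found = 0
for path in PATHS:
    request = urllib.request.Request(BASE + path, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(request, timeout=15) as response:
            print(f"{path}: HTTP {response.getcode()}")
    except urllib.error.HTTPError as error:
        print(f"{path}: HTTP {error.code}")
        if error.code == 404:
            not_found += 1

print(f"{not_found} of {len(PATHS)} pages returned 404 to the Googlebot user agent")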

Is there any published information about GoDaddy's request rate limits?


Re: 404 Error when Googlebot indexing