Haloweb

Well-Known Member
Jul 2, 2004
88
0
156
Hi there

I was busy with Google Sitemaps, tried to verify them, and got the following error regarding
my server:

"We've detected that your 404 (file not found) error page returns a status of 200 (OK) in the header."

Does anyone have an idea how to correct this?

thanks for your help in advance
 

Haloweb

Well-Known Member
Jul 2, 2004
88
0
156
Thanks Andy

But Google actually needs it to give a 404 instead of a 200 OK; it's a requirement for Google Sitemaps, which I have been trying to work with for their verification process.
 

Izzee

Well-Known Member
Feb 6, 2004
469
0
166
Google gobbledygook! :rolleyes:

The file you have is probably named 404.shtml, and when you call it directly from your browser it will always give a 200 OK in the header. There is no way of altering that, and it is perfectly normal. The file is presented to a browser that calls a file that does not exist.

Try renaming the file to anything you like, so you don't have a 404.shtml file, and see if Google gives you a '404 file not found' in the header instead of '200 OK'. That is also very normal.

So I don't know what Google expects, unless they are reinventing the HTML language and the HTTP error codes. Of course, one should expect anything from Google :rolleyes:
:)
 

Haloweb

Well-Known Member
Jul 2, 2004
88
0
156
hehe :)

Well, I had actually removed the .shtml file totally but was still getting the
same error from Google, perhaps they are changing HTML after all hehe ;)

I just thought there was something in the httpd.conf I was missing.
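
If the 404 page is wired up in httpd.conf (or .htaccess) with an ErrorDocument directive, one thing worth checking is whether it points at a full URL instead of a local path, because Apache then sends a redirect and the error page itself comes back as 200 OK. A rough sketch (the filename and domain are just examples):

Code:
# Pointing at a full URL makes Apache redirect, so the client ends up with 200 OK:
ErrorDocument 404 http://www.example.com/404.shtml

# Pointing at a local path serves the page with the real 404 status in the header:
ErrorDocument 404 /404.shtml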

Thanks for your replies
 

Izzee

Well-Known Member
Feb 6, 2004
469
0
166
That's good you removed it, or them if you have more than the 404 server-side error page. They should not be included in a sitemap, period. Perhaps that's what balked Google, or perhaps they don't clear their browser cache :D

I would like to add here that Google bots will sniff every directory and file in your domain, as they have no way of knowing what not to sniff unless you tell them.

So in your sitemap you put all the files that are for public consumption, so the Google sitemap brain-dead bot will know what to sniff.
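
For reference, a bare-bones sitemap file looks something like this (the URL is just a placeholder; you list only the pages meant for public consumption):

Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/index.html</loc>
  </url>
</urlset>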

Conversely, those files (including your error pages) and the directories you don't want in the public arena should be included in the robots exclusion file, robots.txt, so that Google's other brain-dead bot will know to leave them alone.
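
A minimal robots.txt along those lines might look like this (the file and directory names are only examples; use whatever error pages and private directories you actually have):

Code:
User-agent: *
Disallow: /404.shtml
Disallow: /500.shtml
Disallow: /private/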

Betwixt the two, and I am sure they will compare notes, the bots will have it the way you want it and not the way they and Google want it. This applies to all the good-guy bots that use the robots.txt file.
:)
 