Hi Deepsand. I think it was a Google issue more than anything else. I just got crawled on 12/15 and started showing up in the SERPs.
Thanks for your help.
I am still seeing 301 and 406 headers when accessing the site via www and non-www respectively from any server.
Same results as posted here: http://www.webproworld.com/webmaster...l=1#post542340
savant: Email your host's support department and tell them to run lynx on your site with a full head check like this:
lynx -head 'http://savantcreativegroup.com'
Ask them why in the heck your site is returning a 406 (not acceptable) header instead of the proper 200 (success) header.
I have a feeling that they are doing this based on User-Agent, and if so, Google very well may run into more issues 'at times' while re-spidering your site. Which is why I said to correct it before it comes back to bite you in the arse.
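The suspicion above — a server answering 406 only to certain User-Agent strings — is easy to reproduce locally. The sketch below is purely illustrative (the block list, port, and handler are made up, not the real host's configuration): a stub HTTP server returns 406 when the User-Agent matches a hypothetical block list, and a small client does the same kind of head check lynx would.

```python
# Illustrative sketch: User-Agent-based 406 filtering (all names/rules are assumptions).
import http.server
import threading
import urllib.request
import urllib.error

BLOCKED_AGENTS = ("Lynx",)  # hypothetical block list, not the real host's rules

class UAFilterHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        ua = self.headers.get("User-Agent", "")
        # Return 406 when the User-Agent matches the block list,
        # mimicking the suspected filtering; otherwise 200.
        status = 406 if any(b in ua for b in BLOCKED_AGENTS) else 200
        self.send_response(status)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def head_status(url, agent):
    """HEAD the URL with a given User-Agent and return the status code."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": agent})
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as e:
        return e.code  # urllib raises on 4xx/5xx; the code is what we want

server = http.server.HTTPServer(("127.0.0.1", 0), UAFilterHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

ok = head_status(url, "Mozilla/5.0")
blocked = head_status(url, "Lynx/2.8.6")
print(ok, blocked)  # prints: 200 406
server.shutdown()
```

If the real host behaves like this, a browser check would show 200 while lynx (and possibly Googlebot) gets 406 — exactly the inconsistency described above.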
I.e., I can only duplicate the 406 when using "TRACE." The 406 error appears to have been an artifact of the Lynx request having used a request method of "TRACE"; using "GET" and "HEAD" both yield a 200 response code. As the site has a custom 406 page, the result of the 406 is the file named 406.shtml.
Last edited by deepsand; 12-18-2010 at 03:26 PM.
Nope, I checked that the first time you said it, and trace is set to off by default. I also rechecked just now with trace off, with the same results; even forcing a GET comes back 406. Not to mention that, if you look at the command line I posted both times, I already had lynx set to force a HEAD request ('-head').
I just now rechecked it on http://web-sniffer.net/ , using every User-Agent provided there, with the same results.
Twilight Zone? Or, Outer Limits?
Addendum: Did a check using www.seoconsultants.com/tools/headers with User-Agent Lynx 2.8.6, and there got 406 for "GET," "HEAD," and "POST."
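For anyone trying to reproduce this kind of per-method check without an external tool, here is a minimal sketch. The server below is a stand-in modeled on what the thread describes (200 for GET/HEAD, 406 with a custom 406.shtml-style body for TRACE) — it is an assumption for illustration, not the real site's configuration.

```python
# Sketch: probe one URL with several request methods and compare status codes.
# The stub server's behavior (200 for GET/HEAD, 406 for TRACE) is assumed
# from the thread, not taken from the real host.
import http.client
import http.server
import threading

CUSTOM_406 = b"<html>custom 406.shtml page</html>"  # hypothetical error page

class MethodHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def do_HEAD(self):
        self.send_response(200)
        self.end_headers()

    def do_TRACE(self):
        # Hand back the custom 406 page, as the real site appeared to.
        self.send_response(406)
        self.send_header("Content-Length", str(len(CUSTOM_406)))
        self.end_headers()
        self.wfile.write(CUSTOM_406)

    def log_message(self, *args):  # keep the demo quiet
        pass

def status_for(host, port, method):
    """Send one request with an arbitrary method and return the status code."""
    conn = http.client.HTTPConnection(host, port)
    conn.request(method, "/")
    status = conn.getresponse().status
    conn.close()
    return status

server = http.server.HTTPServer(("127.0.0.1", 0), MethodHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

statuses = {m: status_for("127.0.0.1", server.server_port, m)
            for m in ("GET", "HEAD", "TRACE")}
print(statuses)  # prints: {'GET': 200, 'HEAD': 200, 'TRACE': 406}
server.shutdown()
```

Running the same three methods against the live site (as the header tools above do) is the quickest way to tell whether the 406 is tied to the request method, the User-Agent, or both.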
Last edited by deepsand; 12-18-2010 at 03:44 PM. Reason: added Addendum