Our website uses ASP to generate thousands of dynamic pages. Because spiders could not read our dynamically generated pages, we've relied on a third-party product called XQASP, which rewrites the query string into a URL that spiders can crawl. Back in '03, Google Answers advocated XQASP as a reliable tool for enabling Google's spiders to read and index dynamically generated pages.
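To illustrate the kind of rewrite involved, here is a rough sketch in Python. The exact rules XQASP applies are configurable and may differ; this just shows the common pattern of turning a query string into path-style segments so the URL contains no "?" for spiders to choke on:

```python
from urllib.parse import urlsplit, parse_qsl

def rewrite_query_string(url):
    """Rewrite a dynamic ASP URL into a path-style URL.

    Hypothetical scheme for illustration only -- XQASP's actual
    rewrite rules are configurable and may differ.
    """
    parts = urlsplit(url)
    if not parts.query:
        return url
    segments = []
    for key, value in parse_qsl(parts.query):
        segments += [key, value]
    return parts.path + "/" + "/".join(segments)

print(rewrite_query_string("/product.asp?cat=5&id=17"))
# -> /product.asp/cat/5/id/17
```

On the server side, the original query string is then recovered from PATH_INFO before the page runs, so the ASP code itself doesn't change.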
Two weeks ago, two things happened simultaneously: 1) a PageRank drop on our site, and 2) in the diagnostics section of Google's Webmaster Tools, we now have over 2,000 unreachable URLs. The reason given for the unreachable URLs is a 500 internal server error.
Interestingly, none of the unreachable URLs listed by Google appear in our website's sitemap. Some, but not all, of the unreachable URLs show the rewritten query strings generated by XQASP.
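The sitemap comparison can be automated rather than eyeballed. A small sketch that cross-checks the exported unreachable URLs against the sitemap's <loc> entries (assuming a standard sitemaps.org XML sitemap):

```python
import xml.etree.ElementTree as ET

# Standard sitemaps.org namespace used by <urlset> sitemaps.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Parse sitemap XML and return the set of <loc> URLs it lists."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def not_in_sitemap(unreachable, sitemap_xml):
    """Return the unreachable URLs that the sitemap does not list."""
    known = sitemap_urls(sitemap_xml)
    return [u for u in unreachable if u not in known]
```

If every unreachable URL falls outside the sitemap, that supports the idea that Google is finding them somewhere else, e.g. via external links.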
My programmer feels that all of these unreachable URLs are backlinks; I tend to disagree. I've spoken to XDE, the makers of XQASP, who have given a couple of suggestions for rewriting our code, but nothing specific to Google updates.
My question is this: have recent Google algorithm changes perhaps made XQASP an obsolete product by tightening the spider traps?