Thanks to semAdvance and Hollyhttp for your good information about how the search bots work. I mostly use the full URL in my links anyway, so I suppose I was lucky in doing so!
To sum up, then: I think the information I had about needing a robots.txt file was erroneous. As long as my links are full URLs, I can just let the bots sort out which content belongs to which site.
When I am ready, I can get a new domain name, use the pointing manager to point it at the correct folder, and I should be set. If the pointing manager creates a .htaccess file for me, I need not do anything else. If not, I create one using Speed's code and put it in the add-on domain's folder.
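In case anyone has to build that file by hand, here is a minimal sketch of the kind of redirect being discussed (my own reconstruction, since Speed's exact code isn't quoted in this post; maindomain.com and addondomain.com are placeholders, and it assumes mod_rewrite is enabled):

# Goes in the add-on domain's folder.
RewriteEngine On
# Match requests that arrived under the main domain's host name...
RewriteCond %{HTTP_HOST} ^(www\.)?maindomain\.com$ [NC]
# ...and 301-redirect them to the same path on the add-on domain.
RewriteRule ^(.*)$ http://www.addondomain.com/$1 [R=301,L]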
Plus... for an extra bonus, I check the .htaccess file in my root to see whether it redirects non-www.maindomain to www.maindomain. If not, I add the code (sketched below) to gather up all my link juice!
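The usual non-www to www rewrite is only a few lines; a sketch, again with maindomain.com as a placeholder:

RewriteEngine On
# Catch requests for the bare domain...
RewriteCond %{HTTP_HOST} ^maindomain\.com$ [NC]
# ...and permanently redirect them to the www host, keeping the path,
# so inbound links all count toward a single set of URLs.
RewriteRule ^(.*)$ http://www.maindomain.com/$1 [R=301,L]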
Just keep one thing in mind. When you host multiple domains on the same IP address, the search engines will recognize it whether you want it or not, and the cross-promotion between those sites will suffer.
I faced the same situation as you, and recently Google downgraded my web site by two PageRank points in a single day!
The Cyber Teacher. http://www.rtek2000.com - Discounted Self-Study packages for IT certs
http://www.800-webdesign.com/free-we...resources.html - Web Master's Resources
Just for clarification: Are you heavily promoting one site from the other? I can see Google penalizing internal cross-promotion, but if you get backlinks from OUTSIDE sources (outside your own domains), then I would think there should be no penalty for simply having several sites hosted in one account. Can you clarify your findings for everyone's benefit, please?
Just as an aside: Google does funny things at times, so when you see sudden changes in rank, it could be for any number of reasons and it is sometimes difficult to be sure exactly what that reason is. Often, those changes are only temporary, so you need to see what happens over a period of time.
They are user agents which, like any other agent, must make specific requests for a specific resource on a server, one at a time. And in order to have such a request filled, the sought-after resource must be one which is marked as being available to the public.
They do not have access privileges that would let them read a (sub-)directory's entries and thereby identify every resource that directory points to. (A notable exception is where unrestricted access is allowed, such as an FTP server which allows anonymous log-in.)
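On an Apache host, the web-side analogue of that exception is the automatic directory listing; if you want to be sure bots can't browse a folder that lacks an index file, one directive turns the listing off (a sketch for a .htaccess file, assuming the host permits Options overrides):

# With no index file present, -Indexes makes Apache answer a bare
# directory request with 403 Forbidden instead of a list of its files.
Options -Indexes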
I was following this thread with some interest. I use add-on domains for almost all of my separate websites. I also use WordPress as my content management tool. In order to mask the sub-domain, I change the setting in WordPress so that it shows only the new site's domain.
Is this a good idea?
Otherwise, just use the .htaccess file I gave further up in each add-on folder, and it will redirect visitors and bots that try to access an add-on via a folder of the main domain.
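And if you also want to catch hits that come in on the sub-domain the add-on created, rather than only the folder path, a similar rule handles that (a sketch; addon.maindomain.com and www.addondomain.com are placeholder names for the auto-created sub-domain and the real add-on domain):

RewriteEngine On
# Requests arriving via the auto-created sub-domain host...
RewriteCond %{HTTP_HOST} ^addon\.maindomain\.com$ [NC]
# ...get a permanent redirect to the add-on domain itself, same path.
RewriteRule ^(.*)$ http://www.addondomain.com/$1 [R=301,L]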