.htaccess / ModRewrite not allowed for security reasons?
10-28-2003, 08:32 PM
I tried using a .htaccess file containing mod_rewrite rules, and my hosting provider doesn't allow this particular file for "security reasons" - it caused my site to be down for two days before I realized the problem. I'm trying to block web strippers and bad bots with it. I don't understand what would cause a security problem, since I'm not a programmer; I collected the information from various sources on the internet.
Is there an alternative way to block the nasty bots without using the mod_rewrite lines?
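One alternative that avoids mod_rewrite entirely is the standard mod_setenvif and access-control directives, which most hosts leave enabled. A minimal sketch, assuming Apache 1.3/2.x syntax; the user-agent strings below ("WebStripper", "HTTrack") are just illustrative examples - substitute the bots you actually see in your logs:

```apache
# Tag requests whose User-Agent matches a known stripper (case-insensitive)
SetEnvIfNoCase User-Agent "WebStripper" bad_bot
SetEnvIfNoCase User-Agent "HTTrack"     bad_bot

# Allow everyone except requests carrying the bad_bot flag
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Whether this works still depends on the host's AllowOverride setting, so check with your provider first.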
10-29-2003, 02:57 PM
Because with mod_rewrite you could have it redirect to any world-readable file on the server.
10-29-2003, 03:18 PM
Thanks for that explanation.
So, security-wise, would a robots.txt file listing those bots give the same result?
I don't see the difference between the robots.txt file and the .htaccess file, except that some robots ignore the robots.txt file; is that correct?
10-29-2003, 03:31 PM
In some ways you are correct. The .htaccess file can also control many other things on the server, and if something is wrong in it, that may lead to instability on the server and not just your site.
10-29-2003, 04:07 PM
So, if I'm using .htaccess files (for example, I have one that points to a custom 404 page) on my hosting company's server - not my own - I could cause problems on THEIR server? And would this cause problems for everyone else on the particular server, as well?
ooohh...man. I didn't realize the capabilities or possibilities associated with a simple htaccess file!
10-30-2003, 12:59 PM
There is not much, if anything at all, in a .htaccess file that would affect server stability. Its directives apply only to the folder it lives in and that folder's subfolders. So if you put a .htaccess in your document folder, it will affect that folder plus every subfolder, unless you create another .htaccess within a subfolder to override it.
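That inheritance can be sketched with two files; the paths and filenames here are hypothetical examples, not anything from the thread:

```apache
# /public_html/.htaccess -- applies to this folder and every subfolder
DirectoryIndex index.html

# /public_html/blog/.htaccess -- overrides the parent setting,
# but only for /blog/ and its own subfolders
DirectoryIndex blog.html
```

Directives not repeated in the deeper file are simply inherited from the parent.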
Even silly things like pointing the 404 ErrorDocument at a document that doesn't exist could cause an infinite loop of 404 errors, but Apache is not stupid: it will catch it on the first iteration and output something like "Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument".
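For reference, the directive in question is a one-liner; the path here is a made-up example:

```apache
# If /errors/404.html itself doesn't exist, Apache stops after one
# attempt and appends its "Additionally, a 404 Not Found error..." note
# instead of looping.
ErrorDocument 404 /errors/404.html
```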
robots.txt is a file that a robot should read, and obey.
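A robots.txt file is nothing more than a plain-text request placed at the site root; the bot names below are illustrative:

```
# robots.txt -- purely advisory; a bot chooses whether to obey it
User-agent: WebStripper
Disallow: /

User-agent: *
Disallow: /private/
```

A well-behaved crawler fetches this before anything else and skips the disallowed paths; a badly behaved one just ignores it, which is exactly the weakness being discussed here.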
.htaccess is something that directs Apache itself what to do. If you block a bot there, the bot simply receives a Forbidden response and can't even view your site.
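The mod_rewrite approach the original poster tried typically looks something like this (again with illustrative user-agent patterns); the [F] flag is what makes Apache answer 403 Forbidden instead of serving the page:

```apache
RewriteEngine On
# Match the User-Agent header case-insensitively; OR chains the conditions
RewriteCond %{HTTP_USER_AGENT} WebStripper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack    [NC]
# For any URL, return 403 Forbidden rather than rewriting anywhere
RewriteRule .* - [F]
```

This is enforced by the server, which is the practical difference from robots.txt: the bot never gets the content, whether it "obeys" anything or not.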
Things like PHP and CGI scripts can output any world-readable file too, unless the server is set up to prevent it.