
Thread: Possible to give access to site code but also prevent download of files?

  1. #1
    Junior Member
    Join Date
    Apr 2009
    Posts
    4

    Possible to give access to site code but also prevent download of files?

    Hi,
    I have to let someone study my site's architecture in order to come up with the best scenarios for load and performance testing, but I don't want them to be able to just drag and drop all my files via FTP. Is there a way to achieve this? Maybe there is a program that I am unaware of? Thanks in advance!

  2. #2
    Junior Member
    Join Date
    Apr 2009
    Posts
    4

    Re: Possible to give access to site code but also prevent download of files?

    I believe I have figured it out. You can probably put everything in a folder and give the user write-only permission, or write-execute.
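    For illustration, a rough sketch of what those Unix modes look like, assuming the FTP account maps to the folder's group permission bits (the paths are hypothetical):
    PHP Code:
    <?php
    // 0730 = owner rwx, group -wx (write/enter but no read), others none
    chmod('/path/to/sandbox', 0730);
    // 0620 = owner rw-, group -w- (write but no read), others none
    chmod('/path/to/sandbox/upload.log', 0620);
    ?>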

  3. #3
    Administrator weegillis's Avatar
    Join Date
    Oct 2003
    Posts
    5,771

    Re: Possible to give access to site code but also prevent download of files?

    They could use a program to capture the website. WinHTTrack comes to mind, but I don't know if it's still around. They wouldn't need FTP access at all.

    Another option would be to drop in a PHP script and get it to spider the site to create a site map.
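    Something along these lines would do it; a minimal sketch, assuming the script is dropped in the web root and that only .htm/.html/.php documents matter (the domain is a placeholder):

    PHP Code:
    <?php
    // Minimal directory spider: walks the web root and prints a
    // Google-style site map. Domain and file extensions are assumptions.
    $root   = rtrim($_SERVER['DOCUMENT_ROOT'], '/');
    $domain = 'http://www.yoursite.tld';

    function crawl($dir, $base, $domain, &$urls) {
        foreach (scandir($dir) as $entry) {
            if ($entry === '.' || $entry === '..') continue;
            $path = $dir . '/' . $entry;
            if (is_dir($path)) {
                crawl($path, $base, $domain, $urls);
            } elseif (preg_match('/\.(html?|php)$/i', $entry)) {
                $loc = $domain . substr($path, strlen($base));
                $urls[] = "<url><loc>" . htmlspecialchars($loc) . "</loc>"
                        . "<lastmod>" . date('Y-m-d', filemtime($path)) . "</lastmod>"
                        . "<priority>0.5</priority></url>";
            }
        }
    }

    $urls = array();
    crawl($root, $root, $domain, $urls);

    header('Content-Type: text/xml');
    echo "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<urlset>\n";
    echo implode("\n", $urls) . "\n</urlset>";
    ?>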

    Or you could create a Google site map, then port it over to an XSLT style sheet to render the links.

    Code:
    <?xml version="1.0" encoding="utf-8"?>
    <!DOCTYPE xsl:stylesheet  [
     <!ENTITY nbsp   "&#160;">
     <!ENTITY copy   "&#169;">
     <!ENTITY reg    "&#174;">
     <!ENTITY trade  "&#8482;">
     <!ENTITY mdash  "&#8212;">
     <!ENTITY ldquo  "&#8220;">
     <!ENTITY rdquo  "&#8221;">
    ]>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <xsl:output method="html" encoding="utf-8" doctype-public="-//W3C//DTD XHTML 1.0 Strict//EN" doctype-system="http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"/>
    <xsl:variable name="header">
        <tr bgcolor="#cccccc">
          <th align="left">Location</th>
          <th align="left">Priority</th>
          <th align="left">last Modified</th>
        </tr>
    </xsl:variable>
    <xsl:template match="/">
    
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
    <title>XML Site Map Template</title>
    </head>
    
    <body style="text-align:center">
    
     <h2>Site Map Mirror</h2>
        <table id="urls" style="border-collapse:collapse;border:double;width:80%;margin:0 auto;text-align:left">
        <xsl:copy-of select="$header" />
        <xsl:for-each select="urlset/url">
        <tr style="line-height:150%">
      <td><a style="margin-left:1em" href="{loc}"><xsl:value-of select="loc"/></a></td>
          <td><xsl:value-of select="priority"/></td>
          <td><xsl:value-of select="lastmod"/></td>
        </tr>
        </xsl:for-each>
        </table>
    
    </body>
    </html>
    
    </xsl:template>
    </xsl:stylesheet>
    Create a duplicate of the Google site map and convert from this,

    Code:
    <urlset
      xmlns="http://www.google.com/schemas/sitemap/0.84"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84
                          http://www.google.com/schemas/sitemap/0.84/sitemap.xsd">
    to this,
    Code:
    <?xml-stylesheet type="text/xsl" href="site_map.xsl" ?>
    <urlset>
    To keep it tucked nicely away, create a folder called 'sitemap' and add an index page (html, php, etc.) with a link to the XML document:
    PHP Code:
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
    <head>
    <title>Site Map</title>
    </head>

    <body>
    <?php
    $str = "<p><a href=\"index.xml\">Open XML Site Map</a></p>\n";
    echo $str;
    ?>
    <p><a href="/">Home</a></p>
    </body>
    </html>
    Mine is called index.php. The PHP in the code is really just there 'to be PHP' in the document; no special reason otherwise.

    Place the three files (index, xml, xsl) in the new directory and access it via http:

    www.yoursite.tld/sitemap

    This will only give you the document structure, and not the path to all the images, but that shouldn't matter. It's the document structure you're concerned with, right?

  4. #4
    Junior Member
    Join Date
    Apr 2009
    Posts
    4

    Re: Possible to give access to site code but also prevent download of files?

    Well Weegillis, I have to say that your detailed reply was a bit more technical than my current understanding of the situation, but after careful review it seems like the perfect solution. I just don't want them to be able to steal my entire site if they wanted to. Also, if they need to test HDD/storage performance, we may need to grant root SSH access to the system. Will they then have access to everything?

    I'm assuming you didn't approve of the FTP method of just granting that particular user "write only" permission to everything?

  5. #5
    Administrator weegillis's Avatar
    Join Date
    Oct 2003
    Posts
    5,771

    Re: Possible to give access to site code but also prevent download of files?

    First, I must confess to knowing practically nothing about servers, so any comments given are on a 'could be' or perhaps 'common sense' basis. I've only ever worked with shared hosting accounts.
    Quote Originally Posted by sadie8686 View Post
    Also, they may need to test the HDD/Storage performance, we may need to grant root ssh access to the system. Will they then have access to everything?
    At that point, I should think yes. When you say root SSH, may I assume this is your own server? A lot of shared hosting plans don't allow SSH. For a tech to test your server, would they need to navigate through your server's control panel? Once logged in, they would definitely have complete control.

    If you're performance testing your own server, give the IT folks whatever access they need to test with and to apply any prescribed remedies. You must trust them to let them under the hood in the first place.

    If, on the other hand, your site is in a shared hosting environment, any performance testing will be pointless, since the servers host hundreds or thousands of websites and you cannot possibly know the full load, let alone your own, with any accuracy. Your host can provide you with the range and parameters you should expect.

    If all you really need to test is the performance of the site itself, then HTTP is the way to go. Tweaks to image size, page size, script library size, and CSS size for optimal use will be the only real performance mods you can make, and it doesn't take an IT guy to do most of this. The development/editing team (you?) can handle it.
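    A rough page-fetch timer goes a long way here; a minimal sketch in PHP, assuming allow_url_fopen is enabled (the URL is a placeholder):
    PHP Code:
    <?php
    // Time one HTTP fetch of a page; crude, but enough to spot slow pages.
    $url   = 'http://www.yoursite.tld/';
    $start = microtime(true);
    $page  = file_get_contents($url);
    if ($page === false) die("Fetch failed\n");
    $ms    = (microtime(true) - $start) * 1000;
    printf("Fetched %d bytes in %.1f ms\n", strlen($page), $ms);
    ?>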

    Poorly written scripts will necessitate the watchful eye of a knowledgeable coder to spot minuscule errors or wasteful use of memory in the code. To benchmark and troubleshoot the tiny issues in JavaScript that can slow a page down would require at least FTP access to the script folder. CGI and PHP scripts, however, can only be accessed through FTP.


    Quote Originally Posted by sadie8686 View Post
    I'm assuming you didn't approve of the ftp method of just granting that particular user "write only" permission to everything?
    FTP access is Read, Execute, Write on all directories in the initial folder. How much authority do you need to assign?

    HTTP access is Read and Execute, but not Write, on all directories that are accessible; i.e., all directories not restricted by .htaccess.
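    For instance, a directory can be closed off to HTTP entirely with a tiny .htaccess file; a sketch, assuming an Apache server (2.2-era syntax):
    Code:
    # block all HTTP access to this directory
    Order deny,allow
    Deny from all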

    Your idea of creating a duplicate of the site and putting it in a sandbox folder is a good one. Granting FTP access with a separate password to that folder would block all access to the actual site. This folder could be the path to your testing server, I believe. One would need to restrict the folder in robots.txt to keep it from being accidentally indexed.
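    The robots.txt part is simple; a sketch, with 'sandbox' standing in for whatever the folder is actually called:
    Code:
    User-agent: *
    Disallow: /sandbox/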

    In the meantime, the development crew could have at it until you're ready to implement the changes, which could be done by simply copying the site from the testing server to the live one (after sufficient backing up, of course).

    To put a finer edge on your question: no, I do not approve of unwarranted FTP access. Temporary access should be granted only under strict scrutiny and with a good deal of trust. When the work is done, change the password. The fewer people with full control, the better. Chalk it up to paranoia, I guess.

  6. #6
    Junior Member
    Join Date
    Apr 2009
    Posts
    4

    Re: Possible to give access to site code but also prevent download of files?

    Cool, that all makes sense. The only other thing I can add to this thread is that if you have to grant FTP access to someone for any reason, hopefully the "write only" permission would be all you need. There are also products like PA File Sight that can monitor any actions someone takes while accessing files on your server. You can even set it to SMS or email you with file-auditing alerts based on user behavior, such as someone reading all the files in a directory (which might indicate a copy operation). Of course, you don't necessarily have to have the program, and an NDA is always nice too.

  7. #7
    Administrator weegillis's Avatar
    Join Date
    Oct 2003
    Posts
    5,771

    Re: Possible to give access to site code but also prevent download of files?

    I gather that you don't want your site's material copied and duplicated on other sites. Are you letting total strangers do the server work? Do you really need performance testing, assuming your site and server are functioning normally?

    Can an online tool do the trick (Accessibility, SEO, Validation, etc.)? You need not expose any more of your site than each tool asks for, and if login access is needed, it is usually launched from a secure server, so no 'humans' are actually logging on, just computers.

    I'd be less concerned about the ripping part and more concerned with what a person who has FTP access might PUT on the server, or what downloadable files might be altered, be they web pages, PDF files, applications, and so on. They cannot deface a site if no FTP access is allowed, and they cannot surreptitiously create access accounts, back doors, et cetera if no control panel access is allowed. Databases are another thing to protect as well.

