View Full Version : htaccess referer blocking




124151155
Jul 6, 2010, 08:06 PM
Hi,

So basically I want to restrict access to a particular subdomain (which just points to a subdirectory on the website) to only traffic that has come from this website (trapdoor.com.au). The purpose is that it can only be accessed through a link on the website, not when it is typed in directly or linked to from elsewhere.

The .htaccess file I'm editing is in the subdirectory (which is where the subdomain points to), but there is also the main one in the root directory of the site.

This is the code I have tried, but I have not been successful with it:

SetEnvIf Referer trapdoor\.com.au internal
SetEnvIf Referer ^$ internal
#
<Files *>
Order Deny,Allow
Deny from all
Allow from env=internal
</Files>


By the looks of things, that should do what I'm trying to achieve, but it doesn't seem to do anything. I can access the files within the subdirectory or subdomain even without being referred from trapdoor.com.au.

Any suggestions?
I'm pretty new to modifying the htaccess file, so it could be something very simple for all I know.

Cheers

Tim



angelwatt
Jul 6, 2010, 09:11 PM
It looks pretty good. Here's an alternative to try.

SetEnvIfNoCase Referer "trapdoor.com.au" internal
SetEnvIf Referer ^$ internal

<Limit GET POST HEAD>
Order Deny,Allow
Deny from all
Allow from env=internal
</Limit>
I generally do this stuff to block rather than accept individuals though.

As a note, this type of blocking isn't foolproof, since you allow an empty referrer through and the referrer information can easily be faked.

124151155
Jul 6, 2010, 09:24 PM
Brilliant, I'll give this a shot.

The content isn't super sensitive, so a simple type of referer blocking like this would be sufficient. If someone really wants to fake a referer to see the content, they can, but they really won't be gaining much.

You mentioned a blank referer could get to this - does this mean if someone enters a URL from their browser rather than clicking a link on another website, they could access it? If so, can the blank referer be blocked?

Thanks for your help.

angelwatt
Jul 6, 2010, 09:30 PM
You mentioned a blank referer could get to this - does this mean if someone enters a URL from their browser rather than clicking a link on another website, they could access it? If so, can the blank referer be blocked?

Correct, entering the URL directly would get to the page without a referrer, and thus it would be let in. The second line of the .htaccess given here is what allows an empty referrer, so removing that line would require the referrer from the first line to be present.

124151155
Jul 6, 2010, 09:58 PM
As in this?


SetEnvIfNoCase Referer "trapdoor.com.au" internal
#SetEnvIf Referer ^$ internal

<Limit GET POST HEAD>
Order Deny,Allow
Deny from all
Allow from env=internal
</Limit>

It doesn't seem to work... =/

EDIT: Hold on, now I'm trying it in another browser, and it seems to be working differently. I guess Chrome and Firefox are just using a cached copy of the document or something...
EDIT: Nope, it appears to work exactly the same whether the second line is on or off =(

124151155
Jul 6, 2010, 10:28 PM
I removed the ^$ from the second line (does this mean empty referrer?) but this just results in an internal server error (500) when accessed from any referrer (other site, empty and trapdoor).

Any ideas how this could be fixed?

Thanks for your help.

angelwatt
Jul 7, 2010, 06:50 AM
I removed the ^$ from the second line (does this mean empty referrer?) but this just results in an internal server error (500) when accessed from any referrer (other site, empty and trapdoor).

In regular expressions, ^ represents the start of the string and $ represents the end of the string. It may need quotes around it ("^$"). .htaccess files can be very picky sometimes (The error 500 generally points to a syntax issue in the .htaccess file). Alternatively, you could use "" to indicate an empty string.
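For example, the SetEnvIf approach from earlier with the empty-string pattern quoted might look like this (a sketch of the quoting suggestion above, not tested on the poster's server):

```apache
SetEnvIfNoCase Referer "trapdoor.com.au" internal
# quoting the empty-string pattern can avoid .htaccess parse problems
SetEnvIf Referer "^$" internal
```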

As part of my .htaccess where I block image hot-linking, I do a check to make sure the referrer isn't blank, !^$. I do this within some rewrites. You may be able to use rewrites for what you're doing. Below is what I would try out.

RewriteEngine On
# referrer must not be blank
RewriteCond %{HTTP_REFERER} !^$
# List OK urls
RewriteCond %{HTTP_REFERER} !trapdoor\.com\.au [nocase]
# Send them to home page
RewriteRule .* / [last]

124151155
Jul 7, 2010, 06:54 AM
Thanks for the help, I'll connect to my mac at home and try it out now (No FTP client here...) :)

124151155
Jul 7, 2010, 07:16 AM
Ok, so I tried out your code, angelwatt, I changed the tag to [F] to produce a forbidden error instead. It seems to block empty and other referrers but it is blocking trapdoor.com.au as well. Is something missing from line 5?

Thanks

angelwatt
Jul 7, 2010, 05:13 PM
Ok, so I tried out your code, angelwatt, I changed the tag to [F] to produce a forbidden error instead. It seems to block empty and other referrers but it is blocking trapdoor.com.au as well. Is something missing from line 5?

So essentially, it's blocking everything. I'm not sure what else to try; it matches the type of stuff I have in my own .htaccess. Have you tried multiple browsers and checked that the referrer info is being passed correctly? Browsers can block the referrer using add-ons.

124151155
Jul 7, 2010, 08:03 PM
Yeah, I've tried in many browsers - all are clean installs without any addons, too.

I might play around with a few things to see what I can find that will work.

Thanks :)

124151155
Jul 7, 2010, 08:29 PM
I don't know much about htaccess, but in the 5th line:
RewriteCond %{HTTP_REFERER} !trapdoor\.com\.au [nocase]
Does this actually specify that this referrer is allowed? It looks to be much the same format as line three:
RewriteCond %{HTTP_REFERER} !^$
So it appears the code specifies two conditions and then says they should be given a forbidden message. Is this correct, or am I missing something?

angelwatt
Jul 7, 2010, 09:02 PM
The logic can be a little confusing. Below I have written it in a way that may be clearer. Doing so makes me realize that the !^$ line isn't needed at all. For my hot-linking rules, I actually allow an empty referrer; I just hadn't updated the comment in my code, so it confused me.
if (referrer != "" && referrer != "trapdoor.com.au") {
access forbidden
}
else {
carry on
}
So when you remove that line, the logic becomes:
if (referrer != "trapdoor.com.au") {
access forbidden
}
else {
carry on
}
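In .htaccess terms, that simplified logic would look something like this (a sketch using the [F] flag mentioned earlier in the thread, which returns a 403 Forbidden):

```apache
RewriteEngine On
# if the referrer does not contain trapdoor.com.au
# (an empty referrer also fails the match, so it is caught here too)...
RewriteCond %{HTTP_REFERER} !trapdoor\.com\.au [nocase]
# ...serve nothing and return 403 Forbidden
RewriteRule .* - [forbidden,last]
```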

124151155
Jul 8, 2010, 02:33 AM
Isn't that forbidding trapdoor.com.au ??

Technology confuses me sometimes!! >_<

124151155
Jul 8, 2010, 02:52 AM
Is there such a thing as a "does not equal" in htaccess files?
Could I have something that achieves this:
RewriteCond %{HTTP_REFERER} ≠ !trapdoor\.com\.au [nocase]

EDIT: Oh, is that what the ! does?

EDIT: Gah, I've removed my .htaccess file because none of the changes seemed to be working, and it seems that the whole directory still says forbidden... I might contact the host, as perhaps something has been overriding the htaccess changes and something actually would have worked...

angelwatt
Jul 8, 2010, 06:44 AM
Is there such a thing as a "does not equal" in htaccess files?
Could I have something that achieves this:
RewriteCond %{HTTP_REFERER} ≠ !trapdoor\.com\.au [nocase]

EDIT: Oh, is that what the ! does?
Yup, the ! negates the expression. I guess I assumed that was obvious.
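As a hypothetical illustration (not from the thread), here is the same condition written both ways; each line would be used on its own, not both together:

```apache
# without ! : the condition is true when the referrer DOES contain trapdoor.com.au
RewriteCond %{HTTP_REFERER} trapdoor\.com\.au [nocase]

# with ! : the condition is true when the referrer does NOT contain trapdoor.com.au
RewriteCond %{HTTP_REFERER} !trapdoor\.com\.au [nocase]
```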

EDIT: Gah, I've removed my .htaccess file because none of the changes seemed to be working, and it seems that the whole directory still says forbidden... I might contact the host, as perhaps something has been overriding the htaccess changes and something actually would have worked...

Removing the .htaccess completely would have been a good debugging step earlier. While you're FTP'd into your host, you can check the permissions on the folder. It should be 755 (rwxr-xr-x). Files should be 644 (rw-r--r--), including the .htaccess file itself.
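If the host allows shell access rather than just FTP, those permissions could be set from the command line like this; the `public_html` path here is just an assumption for illustration (the snippet creates a stand-in directory so it can be run safely anywhere):

```shell
# create a stand-in docroot so the commands can be run safely anywhere
mkdir -p public_html
touch public_html/.htaccess

chmod 755 public_html            # directory: rwxr-xr-x
chmod 644 public_html/.htaccess  # file: rw-r--r--

# verify the result (GNU stat; on macOS use `stat -f '%Lp %N'` instead)
stat -c '%a %n' public_html public_html/.htaccess
```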

124151155
Jul 8, 2010, 07:17 AM
Yup, the ! negates the expression. I guess I assumed that was obvious.

I don't come from a coding background, but my father (who does) mentioned that != means "does not equal" in many languages, so that's the only reason I suspected it.

I contacted the host and they've acknowledged that something has buggered up, but it hasn't been fixed yet. I'm sure one of the codes you supplied will do the trick :D

Cheers!

needlnerdz
Jul 8, 2010, 07:18 AM
Chiming in with an alternative solution. I'm not sure how many pages/content you have on your subdomain site, or whether you are programming in PHP.

But in the case that you are using PHP, something I have often done to prevent hijacking or direct entering of script/function pages is to set a session value on the initial page, and then on the secondary page include a script at the top that checks for that session value: if it's set, show them the page/content; if not, redirect them elsewhere. I would like to think this makes it harder for someone to fake the referrer business. However, it all depends on whether you're using PHP and whether this subdomain is serving many or just a few pages. Hope that helps!

124151155
Jul 8, 2010, 07:42 AM
Thanks, needlnerdz. The site is PHP, because it uses the Joomla CMS, but I'm not a programmer, I only really do the design, so I don't really have the expertise.
I am however, about to start learning PHP, so I'll keep this technique in mind for when I know what I'm doing XD

needlnerdz
Jul 9, 2010, 07:06 AM
Hmm, I have no idea how easy/hard it is to customize aspects of the Joomla CMS, but if you have access and feel comfortable adding a bit of PHP, then hopefully this gets you started with how to implement such a thing (it took me a while to figure out and create, so I'm happy to save you the time if it's the right solution):

<?php

// On the page that 'must' be visited first:
// start a session and create two session variables, the IP address + a password
session_start();
$_SESSION['ip_address'] = $_SERVER['REMOTE_ADDR'];
$_SESSION['password'] = 'blah';


// On the landing page:
// check for both of those before doing anything else...
session_start();
$remote_address = $_SERVER['REMOTE_ADDR'];
$remote_pass = 'blah';

// Then check whether the stored 'ip_address' matches the one we just grabbed,
// and likewise for the password
// (the isset() guard avoids undefined-index notices when no session was set)
if (isset($_SESSION['ip_address'], $_SESSION['password'])
    && $_SESSION['ip_address'] == $remote_address
    && $_SESSION['password'] == $remote_pass) {
    // proceed with the actual content;
    // it can be written directly here, or stored in a variable that is echoed below
} else {
    header('Location: http://www.somewhereelse.com');
    // perhaps just use that initial page they should have clicked from...
    exit; // stop the script so the protected content below is never sent
}

?>

- I'm curious if a better php programmer sees any major flaws with such a technique?

124151155
Jul 9, 2010, 08:05 AM
The hosts for this site are so dumb -_- the directory was stuffed, so they disabled the root .htaccess by renaming it to htaccess.txt - which made the site go down because all the (SEO) links broke, and it made the htaccess file temporarily accessible to hackers =S

Nevertheless, I was able to use the second code you gave me with a few modifications and it now works perfectly!!

Thank you very much for your help :) It's greatly appreciated.

Tim

EDIT: needlnerdz, thanks for the code :) I'll keep it handy for the next time I need to do this!

angelwatt
Jul 9, 2010, 08:31 PM
I'm curious if a better php programmer sees any major flaws with such a technique?

Well, it can be beaten, but depending on whether you're really aiming for security or just blocking most people, it would be enough. The session ID is transmitted along with the user's requests, and the user could analyze their own HTTP traffic, read what's being sent, and beat it if they so desire. But for casually enforcing people to come from a certain page, it's likely good enough.

needlnerdz
Jul 10, 2010, 07:41 AM
...But for casually enforcing people to come from a certain page, it's likely good enough.

Great - good to know it's not 100%, but likely enough for all low-risk stuff. Thanks-