View Full Version : How to password restrict web pages




Moof1904
May 19, 2008, 10:37 AM
The family has just registered a domain for use in sharing photos among family members. We don't really want to use Flickr or .mac to do photo sharing, we'd rather just put them on the family's domain. But, we don't want random people accessing the site. Is there an easy way for a person with limited web knowledge to password restrict web pages to prevent strangers from accessing and also prevent the pages from being indexed by google and the like?



angelwatt
May 19, 2008, 11:55 AM
There are two decent ways I can recommend offhand. One: if the web server being used is Apache, it can be set up using a .htaccess file. Here's a decent tutorial on that (http://www.javascriptkit.com/howto/htaccess3.shtml), and another good one (http://www.ilovejackdaniels.com/apache/password-protect-a-directory-with-htaccess/).
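A minimal sketch of what those tutorials walk through, assuming your photos live in a directory called /photos and the password file is kept outside the web root at /home/user/.htpasswd (both paths are placeholders — use your own):

```apache
# /photos/.htaccess — require a login for everything in this directory
AuthType Basic
AuthName "Family Photos"
AuthUserFile /home/user/.htpasswd
Require valid-user
```

You'd create the password file itself with Apache's htpasswd tool, e.g. `htpasswd -c /home/user/.htpasswd yourname`, which prompts for a password and stores it hashed. Keeping the .htpasswd file outside the web-accessible directories means visitors can never download it directly.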

The second way is with a PHP script like this one (http://www.zubrag.com/scripts/password-protect.php), though it's more for password-protecting a single page and isn't as good for protecting a whole directory. So it'll depend on your needs. Most people will recommend the .htaccess way, though, as it's more secure and reliable.

Moof1904
May 19, 2008, 12:04 PM
Does this do anything to prevent the pages from being indexed by the various search engines?

angelwatt
May 19, 2008, 12:36 PM
Does this do anything to prevent the pages from being indexed by the various search engines?

No, it won't. For that you'll want to add meta tags to the individual pages (http://www.robotstxt.org/meta.html). Bad robots may ignore them, and there's no real way around that, but it shouldn't be an issue once you put password protection in place.
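For reference, the tag described at that link goes in the head of each page you want kept out of search results:

```html
<!-- Inside <head>: ask crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Well-behaved crawlers like Google's honor this; it's advisory only, which is why the password protection is still the real barrier.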

Moof1904
May 19, 2008, 01:24 PM
Great! Thanks!

Darth.Titan
May 19, 2008, 02:19 PM
To do everything you can to stop web crawlers, in addition to meta tags in the pages, you should also look into adding a robots.txt file to your site.
robots.txt (http://www.robotstxt.org/)
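A minimal robots.txt for this situation — placed at the root of the domain — asks all crawlers to stay out of the whole site:

```
# robots.txt — disallow all well-behaved crawlers from the entire site
User-agent: *
Disallow: /
```

Like the meta tag, this is a polite request rather than an enforcement mechanism, so it complements the password protection rather than replacing it.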