Website Security Beginners Tutorial

In this tutorial we are going to walk through some techniques and common use cases for website security.

Website security is often overlooked, and that’s understandable, but a basic level of security comes down to a few simple techniques.

Today we will be going through some basic security measures that you can take with your website hosted on an Apache Web Server.

File Permissions

File permissions can prevent unauthorized access to specific directories and files (including scripts). Permissions allow users to read, write or execute files, and they can be assigned separately to the file or directory owner, the group and the world (public users). A website’s root directory should, in most cases, have its permissions set to 755; this gives the owner full access, while group and public users get read and execute privileges only. In most cases, group and public users should have limited access and should not be able to ‘write’ anything to your directory. File permissions can be set using an FTP client such as FileZilla, or from the command line (Command Prompt on Windows, Terminal on macOS, a shell on Linux). Special files like .htaccess deserve extra care; a common choice is 644, so that only the owner can write to them.
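
For example, on a typical Linux host the permissions above could be applied from a shell like this (the paths are placeholders for your own site’s layout):

chmod 755 /var/www/html            # owner: read/write/execute; group and world: read/execute
chmod 644 /var/www/html/.htaccess  # owner: read/write; everyone else: read only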

PHP vs. HTML

Websites written in the PHP scripting language can provide additional privacy and security for the contents of their files. Unless a user has access to the scripts themselves (directly on the server), they can only ‘inspect’ the output that has been ‘echoed’ or ‘printed’ by the script. HTML (HyperText Markup Language) files, by contrast, deliver their complete contents to any user who ‘inspects’ the page with Firebug (a Firefox add-on) or an equivalent tool.
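
As a small sketch of the difference (the values here are made up for illustration), only the echoed line in the PHP file below ever reaches the browser; the other variable never leaves the server:

<?php
$dbPassword = 'secret';             // stays on the server; never sent to the browser
$welcome = 'Welcome to our site!';
echo $welcome;                      // only this output appears in the page source
?>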

Encoding

PHP scripts can also be encoded or encrypted using a program such as Zend Guard, which compiles the scripts into a binary format and then applies obfuscation to mask their contents, making them harder, but not impossible, to read. Encoding a PHP script with the basic base64_encode() function can prove effective against inexperienced web users: the function ‘garbles’ the string passed to it by converting its contents into base64-compliant characters. (This kind of encoding is traditionally used for transporting data, such as code sent via email, where the original bytes might otherwise be corrupted by certain transport protocols.) Anything produced by base64_encode() can easily be reversed with base64_decode(), so this is obfuscation rather than real security; the main benefit is that a user who gains access to the encoded source cannot instantly read its contents.

Encoding/Decoding Example:

<?php
$string = 'This string of text will be encoded'; // string to be encoded
echo base64_encode($string); // echoes out the encoded text
?>

<?php
$string = 'VGhpcyBzdHJpbmcgb2YgdGV4dCB3aWxsIGJlIGVuY29kZWQ='; // the encoded output from the example above
echo base64_decode($string); // outputs the decoded string
?>

.htaccess

.htaccess files are very important when dealing with directories that should not be open to public users. They allow the webmaster to set specific rules for particular files, file types or users; an administrator can set a ‘deny’ or ‘allow’ rule with a few lines of code.

.htaccess Example:
The following code will deny all HTTP access to the directory that the .htaccess file is placed in, in this case the root directory of the server. This would actually block all access to the website; it is just an example of the power that .htaccess has over your site.

Order allow,deny
Deny from all

For administration areas, for example a directory that only a specific group of users should be able to access via HTTP, we can put something like this in the .htaccess file inside the admin directory:

Order deny,allow
Deny from all
Allow from your.ip.address

As you can see, we are denying access to everyone and then allowing only the specified IP address to reach the directory via HTTP.

You would change ‘your.ip.address’ to the actual IP address that you wish to allow HTTP access from. Note that this only works when the permitted users connect from a dedicated (static) IP address; otherwise a dynamic IP will eventually change and end up locking out the very users you meant to allow.
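
If a handful of trusted users each have their own static address, you can simply stack several Allow lines; the addresses below are documentation placeholders:

Order deny,allow
Deny from all
Allow from 203.0.113.42
Allow from 198.51.100.7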

We can also deny all HTTP access to the .htaccess file itself:

<Files .htaccess>
Order allow,deny
Deny from all
</Files>

We could also do this with other files, such as php.ini:

<Files php.ini>
Order allow,deny
Deny from all
</Files>
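
If several sensitive files need protecting, a single FilesMatch rule can cover them all; config.php below is just a hypothetical example of another file you might want to hide:

<FilesMatch "^(\.htaccess|php\.ini|config\.php)$">
Order allow,deny
Deny from all
</FilesMatch>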

Prevent ‘Listing’ Directories

A little trick to stop any user ‘listing’ a directory’s contents is to put a blank index.php file in that directory.

Prevent Directory Listing Example:
To prevent users from ‘listing’ the /includes/ directory, a PHP file titled index.php containing:

<?php
// This is a blank PHP script; it exists only to prevent directory listing.
?>

can be put inside the /includes/ directory.

When www.example.com/includes/ is requested in a browser’s address bar, the user will get a blank page rather than a list of the directory’s contents. This technique comes in handy when dealing with important files, e.g. copyright-protected images, but it will not stop a user from accessing a specific file within that directory if they already know its name.

Instead of creating a blank .php script in every directory that you would not like to have ‘listed’, you can add the following to your root web directory’s .htaccess file:

Options -Indexes

When a directory is listed, the web server’s type and version are also displayed at the bottom of the page. Adding this line to our .htaccess turns that signature off on our Apache web server:

ServerSignature Off
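
If you also control the main Apache configuration, the related ServerTokens directive trims version details from the Server response header as well; note that it belongs in httpd.conf rather than .htaccess:

ServerTokens Prod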

Robots.txt

Some pages may be private, for members only, or simply not meant to be found via a search engine query. Specific files and directories can be hidden from search engine ‘spiders’ or ‘robots’ with a simple robots.txt file placed in the root web directory.

Robots.txt Example:
The following robots.txt file asks search engine ‘spiders’ not to crawl any files or directories listed after a ‘Disallow’ directive.

User-agent: *
Disallow: /main/css/
Disallow: /includes/

By saying

User-agent: *

we are telling all search engine ‘spiders’ not to index the specified directories and the files contained in those directories.
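
Rules can also be aimed at a single crawler by naming it in the User-agent line; for example, to keep only Google’s spider out of /includes/ while leaving other crawlers unaffected:

User-agent: Googlebot
Disallow: /includes/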

It is solely up to each search engine’s ‘spiders’ whether they will actually honour your request. Also bear in mind that robots.txt is itself publicly readable, so avoid listing truly sensitive paths in it.

BONUS: File Transfer Tip

If you have SSH access to your web server, you should be able to use SFTP (the SSH File Transfer Protocol, also called Secure File Transfer Protocol) rather than plain old FTP. Unlike FTP, which sends your credentials and files in plain text, SFTP encrypts the whole session, providing additional security when you upload files to your web server.
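
Here is a quick sketch of what an SFTP session looks like from the command line (the user and host names are placeholders):

sftp user@www.example.com
sftp> put index.php
sftp> exit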

BONUS: General Tip

Never, never rely solely on client-side technologies (such as JavaScript and HTML) to validate any form of user input. Client-side scripts and validation can easily be manipulated by the end user via their browser.

You should always use server-side technologies to validate and process user input!
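
As a minimal server-side sketch (the ‘email’ form field is hypothetical), PHP’s built-in filter functions make this straightforward:

<?php
// Server-side validation: never trust what the browser sent.
$email = isset($_POST['email']) ? $_POST['email'] : '';
if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
    die('Invalid email address.');
}
echo 'Thanks, ' . htmlspecialchars($email) . '!'; // escape before echoing back
?>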

I hope you found this resource useful!