14 golden rules for site optimization


1: Minimize HTTP Requests

80% of the end-user response time is spent on the front-end. Most of this time is tied up in downloading all the components in the page: images, stylesheets, scripts, Flash, etc. Reducing the number of components in turn reduces the number of HTTP requests required to render the page. This is the key to faster pages.

2: Use a Content Delivery Network

The user’s proximity to your web server has an impact on response times. Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user’s perspective.

3: Add an Expires Header

Web page designs are getting richer and richer, which means more scripts, stylesheets, images, and Flash in the page. A first-time visitor to your page may have to make several HTTP requests, but by using the Expires header you make those components cacheable. This avoids unnecessary HTTP requests on subsequent page views. Expires headers are most often used with images, but they should be used on all components including scripts, stylesheets, and Flash components.
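In Apache this is typically done with mod_expires; here is a minimal sketch, assuming the module is enabled (the lifetimes are illustrative, not prescriptive):

```apache
<IfModule mod_expires.c>
ExpiresActive On
# Far-future expiry for components that rarely change (illustrative values)
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
</IfModule>
```

Remember that once a far-future Expires header is set, you must rename a component (e.g. with a version number in the filename) to force clients to fetch a new copy.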

4: Gzip Components

The time it takes to transfer an HTTP request and response across the network can be significantly reduced by decisions made by front-end engineers. It’s true that the end-user’s bandwidth speed, Internet service provider, proximity to peering exchange points, etc. are beyond the control of the development team. But there are other variables that affect response times. Compression reduces response times by reducing the size of the HTTP response.
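In Apache this is usually handled by mod_deflate; a minimal sketch, assuming the module is available:

```apache
<IfModule mod_deflate.c>
# Compress text-based responses; images, PDFs, and Flash are already compressed
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
</IfModule>
```

Compressing HTML, scripts, and stylesheets typically cuts their transfer size by around 70%; already-compressed formats such as images should be left alone.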

5: Put Stylesheets at the Top

Front-end engineers that care about performance want a page to load progressively; that is, we want the browser to display whatever content it has as soon as possible. This is especially important for pages with a lot of content and for users on slower Internet connections. The importance of giving users visual feedback, such as progress indicators, has been well researched and documented. In our case the HTML page is the progress indicator! When the browser loads the page progressively the header, the navigation bar, the logo at the top, etc. all serve as visual feedback for the user who is waiting for the page. This improves the overall user experience.

6: Put Scripts at the Bottom

Rule 5 described how stylesheets near the bottom of the page prohibit progressive rendering, and how moving them to the document HEAD eliminates the problem. Scripts (external JavaScript files) pose a similar problem, but the solution is just the opposite: it’s better to move scripts from the top to as low in the page as possible. One reason is to enable progressive rendering, but another is to achieve greater download parallelization.

7: Avoid CSS Expressions

The problem with expressions is that they are evaluated more frequently than most people expect. Not only are they evaluated when the page is rendered and resized, but also when the page is scrolled and even when the user moves the mouse over the page. Adding a counter to the CSS expression allows us to keep track of when and how often a CSS expression is evaluated. Moving the mouse around the page can easily generate more than 10,000 evaluations.

8: Make JavaScript and CSS External

Using external files in the real world generally produces faster pages because the JavaScript and CSS files are cached by the browser. JavaScript and CSS that are inlined in HTML documents get downloaded every time the HTML document is requested. Inlining reduces the number of HTTP requests that are needed, but increases the size of the HTML document. On the other hand, if the JavaScript and CSS are in external files cached by the browser, the size of the HTML document is reduced without increasing the number of HTTP requests.

9: Reduce DNS Lookups

The Domain Name System (DNS) maps hostnames to IP addresses, just as phonebooks map people’s names to their phone numbers. When you type http://www.yahoo.com into your browser, a DNS resolver contacted by the browser returns that server’s IP address. DNS has a cost: it typically takes 20-120 milliseconds to look up the IP address for a given hostname, and the browser can’t download anything from that hostname until the lookup is completed.

10: Minify JavaScript

Minification is the practice of removing unnecessary characters from code to reduce its size, thereby improving load times. When code is minified all comments are removed, as well as unneeded white space characters (space, newline, and tab). In the case of JavaScript, this improves response time performance because the size of the downloaded file is reduced.

11: Avoid Redirects

Redirects are accomplished using the 301 and 302 status codes. Here’s an example of the HTTP headers in a 301 response:

      HTTP/1.1 301 Moved Permanently

      Location: http://example.com/newuri

      Content-Type: text/html

The browser automatically takes the user to the URL specified in the Location field. All the information necessary for a redirect is in the headers. The body of the response is typically empty. Despite their names, neither a 301 nor a 302 response is cached in practice unless additional headers, such as Expires or Cache-Control, indicate it should be. The meta refresh tag and JavaScript are other ways to direct users to a different URL, but if you must do a redirect, the preferred technique is to use the standard 3xx HTTP status codes, primarily to ensure the back button works correctly.

12: Remove Duplicate Scripts

It hurts performance to include the same JavaScript file twice in one page. This isn’t as unusual as you might think. A review of the ten top U.S. web sites shows that two of them contain a duplicated script. Two main factors increase the odds of a script being duplicated in a single web page: team size and number of scripts. When it does happen, duplicate scripts hurt performance by creating unnecessary HTTP requests and wasted JavaScript execution.

13: Configure ETags

The problem with ETags is that they typically are constructed using attributes that make them unique to a specific server hosting a site. ETags won’t match when a browser gets the original component from one server and later tries to validate that component on a different server, a situation that is all too common on Web sites that use a cluster of servers to handle requests. By default, both Apache and IIS embed data in the ETag that dramatically reduces the odds of the validity test succeeding on web sites with multiple servers.
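If you can’t keep ETags consistent across a cluster, one common fix is to simplify or remove them so caches fall back to Last-Modified validation. A sketch for Apache (the mod_headers line is only needed if you remove ETags entirely):

```apache
# Drop the server-specific INode component so ETags match across a cluster
FileETag MTime Size

# ...or remove ETags altogether and rely on Last-Modified:
# FileETag None
# Header unset ETag
```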

14: Make Ajax Cacheable

One of the cited benefits of Ajax is that it provides instantaneous feedback to the user because it requests information asynchronously from the backend web server. However, using Ajax is no guarantee that the user won’t be twiddling his thumbs waiting for those asynchronous JavaScript and XML responses to return. In many applications, whether or not the user is kept waiting depends on how Ajax is used. For example, in a web-based email client the user will be kept waiting for the results of an Ajax request to find all the email messages that match their search criteria. It’s important to remember that “asynchronous” does not imply “instantaneous”.
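Ajax responses are cached under the same rules as any other component. As an illustrative Apache sketch, assuming your Ajax endpoints return a distinct content type such as application/json and that mod_expires is enabled:

```apache
<IfModule mod_expires.c>
ExpiresActive On
# Illustrative lifetime; embed a timestamp or version in the Ajax URL
# so the client fetches fresh data as soon as it actually changes
ExpiresByType application/json "access plus 1 hour"
</IfModule>
```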

More details about the 14 rules can be found here: http://developer.yahoo.com/performance/rules.html

It is also possible to try all these rules in practice at http://stevesouders.com/examples/rules.php

A great web-based tool for analyzing site load time is available at http://tools.pingdom.com/fpt/


Learn .htaccess in 10 minutes

mod_rewrite

A list of htaccess code snippets and examples.
Any web designer MUST know them.

Any htaccess rewrite example should begin with:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /

This lets Google crawl the page, lets me access the whole site without a password, and lets my client access the page WITH a password. It also allows for XHTML and CSS validation! (w3.org):

AuthName "SiteName Administration"
AuthUserFile /home/sitename.com/.htpasswd
AuthType basic
Require valid-user
Order deny,allow
Deny from all
Allow from 24.205.23.222
Allow from w3.org htmlhelp.com
Allow from googlebot.com
Satisfy Any

Make any file act as a certain filetype (regardless of name or extension):

# Makes image.gif, blah.html, index.cgi all act as php
ForceType application/x-httpd-php

Redirect non-https requests to the https server, fixing the double-login problem and ensuring that htpasswd authorization can only be entered over HTTPS:

SSLOptions +StrictRequire
SSLRequire %{HTTP_HOST} eq "google.com"
ErrorDocument 403 https://google.com

SEO Friendly redirects for bad/old links and moved links

For single moved file:

Redirect 301 /d/file.html http://www.htaccesselite.com/r/file.html

For multiple files, like everything under /blog:

RedirectMatch 301 /blog(.*) http://www.askapache.com/$1

For a different domain name:

Redirect 301 / http://www.newdomain.com

Require the www:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Require the www without hardcoding:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$ [NC]
RewriteCond %{HTTP_HOST} !^www\.[a-z-]+\.[a-z]{2,6} [NC]
RewriteCond %{HTTP_HOST} ([a-z-]+\.[a-z]{2,6})$ [NC]
RewriteRule ^(.*)$ http://www.%1/$1 [R=301,L]

Require no subdomain:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{HTTP_HOST} \.([a-z-]+\.[a-z]{2,6})$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

Require no subdomain (alternate pattern):

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} \.([^\.]+\.[^\.0-9]+)$
RewriteCond %{REQUEST_URI} !^/robots\.txt$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

Redirect everyone to different site except 1 IP address (useful for web-development):

ErrorDocument 403 http://www.someothersite.com
Order deny,allow
Deny from all
Allow from

Add a “en-US” language tag and “text/html; UTF-8” headers without meta tags:

AddDefaultCharset UTF-8
# Or AddType 'text/html; charset=UTF-8' html
DefaultLanguage en-US

Using the Files Directive:

# example filename
<Files "mypage.html">
AddDefaultCharset UTF-8
DefaultLanguage en-US
</Files>

Using the FilesMatch Directive (preferred):

# example pattern
<FilesMatch "\.(htm|html|css|js)$">
AddDefaultCharset UTF-8
DefaultLanguage en-US
</FilesMatch>

Securing directories: remove the ability to execute scripts:

AddHandler cgi-script .php .pl .py .jsp .asp .htm .shtml .sh .cgi
Options -ExecCGI

Only allow GET and PUT request methods to your server:

Options -ExecCGI -Indexes -All +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_METHOD} !^(GET|PUT)
RewriteRule .* - [F]

Process all gif files through a cgi script:

Action image/gif /cgi-bin/filter.cgi

Process request/file depending on the request method:

Script PUT /cgi-bin/upload.cgi

Force files to download instead of being displayed in the browser:

AddType application/octet-stream .avi
AddType application/octet-stream .mpg

Dramatically speed up your site by implementing caching! (The file-type patterns below are examples; adjust them to your site.)

# Cache static media for a month
<FilesMatch "\.(flv|gif|jpg|jpeg|png|ico|swf)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>

# Cache scripts and stylesheets for a week
<FilesMatch "\.(js|css|pdf|txt)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>

# Cache HTML for 12 hours
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=43200"
</FilesMatch>

Prevent image/file hotlinking and bandwidth stealing:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?askapache\.com/.*$ [NC]
RewriteRule \.(gif|jpg|swf|flv|png)$ http://www.askapache.com/evil-hotlinker.gif [R=302,L]


Serve custom error pages:

ErrorDocument 400 /cgi-bin/error.php
ErrorDocument 401 /cgi-bin/error.php
ErrorDocument 403 /cgi-bin/error.php
ErrorDocument 404 /cgi-bin/error.php
ErrorDocument 405 /cgi-bin/error.php
ErrorDocument 406 /cgi-bin/error.php
ErrorDocument 409 /cgi-bin/error.php
ErrorDocument 413 /cgi-bin/error.php
ErrorDocument 414 /cgi-bin/error.php
ErrorDocument 500 /cgi-bin/error.php
ErrorDocument 501 /cgi-bin/error.php

Authentication Magic

Require password for 1 file:

# example filename
<Files "admin.php">
AuthName "Prompt"
AuthType Basic
AuthUserFile /home/askapache.com/.htpasswd
Require valid-user
</Files>

Protect multiple files:

# example pattern
<FilesMatch "\.(inc|conf)$">
AuthName "Development"
AuthUserFile /.htpasswd
AuthType basic
Require valid-user
</FilesMatch>

Example uses of the Allow Directive (address values are illustrative):

# A (partial) domain-name
Allow from example.org

# Full IP address
Allow from 10.1.2.3

# More than 1 full IP address
Allow from 10.1.2.3 10.1.2.4

# Partial IP addresses
# first 1 to 3 bytes of IP, for subnet restriction.
Allow from 10.1
Allow from 10 172.20 192.168.2

# network/netmask pair
Allow from 10.1.0.0/255.255.0.0

# network/nnn CIDR specification
Allow from 10.1.0.0/16

# IPv6 addresses and subnets
Allow from 2001:db8::a00:20ff:fea7:ccea
Allow from 2001:db8::a00:20ff:fea7:ccea/10

Using visitor dependent environment variables:

SetEnvIf User-Agent ^KnockKnock/2\.0 let_me_in
Order Deny,Allow
Deny from all
Allow from env=let_me_in

Block access to files during certain hours of the day:

Options +FollowSymLinks
RewriteEngine On
RewriteBase /
# If the hour is 16 (4 PM) then deny all access
RewriteCond %{TIME_HOUR} ^16$
RewriteRule ^.*$ - [F,L]

GOal Oriented Design is GOOD


GOAL oriented design

Once upon a time, when we had just begun working in web design, our only goal was to impress the client. Today we consider that view totally wrong, except in the one case where a web site has a single user: the client.

Any web site should be built in response to certain needs. The web is a medium of information, and content is king. No matter how good the design is, if the site does not fulfill the need for information, it will be a failure.

Goals are best visualized when you write scenarios for real user experiences. A common mistake is focusing on the technologies and forgetting how easily a real user will find the information he needs, and what his overall impression will be after using the site. The point is to make the user happy and to reward him with something he needs in return for the time he spends.

Our advice is to think of a web site as a two-way dialog medium. Imagine yourself chatting with the site and getting pleasant responses; you will be delighted by the feeling that it works for you. Here we would like to introduce the concept of “system intelligence”: it provides the greatest opportunity to differentiate excellent web products from the mainstream. Think ahead about which dialogs should be anticipated or avoided, what errors can possibly occur, and how the site can respond in a way that increases the user’s satisfaction. A bare “404 – File not found” response, for example, is actually very rude! You could design the system to pull up a list of similar pages to look through; this is much more user-friendly.

A good, intelligent system always predicts the user’s next step and acts accordingly. Even if such a system takes more resources to design, there are always win-win approaches that serve both your users and your goals. Here are some examples of win-win solutions:

Advertising messages on resting pages.
Let’s say there is a download page: the user is happy with the download and meanwhile can pay some attention to advertising.

Inline advertising.
It’s easy to imagine a newspaper with advertising boxes between the lines. The user can easily ignore them or sneak a look.

Advertising by entertainment.
Probably the most pleasant way of advertising: it is always interesting and funny to watch a cartoon message that invites you to click and find out the rest of the story.

The cat-and-mouse approach.
You want to sell the cat, but you offer the mouse for free, and the user is left with the impression that he still wants more. The user discovers something new or interesting, you advertise your services, and as a consequence you sell more. A trivial win-win, isn’t it?

Advertising via newsletters.
It is a perfect way to deliver the right information to the right users, as soon as it becomes available, without their having to go and get it. It also collects users’ contact details, which provide a route for direct communication and targeted actions. Besides, it is a very cost-effective win-win solution.

Even this post has some goals, one of which is to encourage you to find win-win solutions and always use the GOOD philosophy.