Benchmarking your Site Performance with sitespeed.io

There’s no excuse for having a slow website in 2014.

Web pages are getting bigger all the time, with larger images and lots of clever elements. But just because people have super-fast connections at home or work, it doesn’t mean they have those luxuries when they are using a mobile device on a 3G connection. We should all be advocates of prioritising performance.


There are a lot of blog posts and videos out there advocating best practices for site speed and performance, so I won’t go into too much detail here. What is frustrating, though, is that many of the web performance tools people recommend only let you look at a few web pages at a time, which makes it difficult or time-consuming to run a large-scale benchmarking analysis, let alone a full-site analysis. sitespeed.io is an open source web performance tool that can provide a lot of information about a website and how well it performs against current best practices.

With sitespeed.io you can analyse multiple websites and web pages at the same time. Not only that, you can control the crawl depth, exclude certain pages, or simply load in a .txt file of URLs to report on. sitespeed.io also uses the Navigation Timing API to collect metrics.

The reports are returned in HTML and XML formats and give a lot of insight into your site’s performance.

Let’s look at a few.

The Summary Report returns a high-level scorecard based on the classic YSlow rules plus some newer ones, all grounded in performance best practices.


You can then choose a more detailed summary from this page which statistically analyses all of the optimization, content, and performance metrics.


If you are a keen UX’er then you will love the web performance metrics collected via the W3C Navigation Timing API, which is available in the latest browsers.


The Pages report gives you an analysis of all the pages crawled.


You can drill down further into each individual page to get a report on the different assets they contain, and any performance issues.


Finally you have an overall Asset Report which helps you to see all the different elements used on a site, which are performing poorly and how frequently they are used.

You can also choose to take screenshots of the different pages you have crawled.

How to Install sitespeed.io on Windows

You will need to be comfortable using command line tools.

First of all, the tool’s files need to be on your PATH; if you don’t know how to do that, read up on editing environment variables in Windows.
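As a sketch, in Git Bash you can prepend the tool’s bin directory to your PATH for the current session. The install location below ($HOME/sitespeed.io) is just an assumption — use wherever you unpacked it:

```shell
# Hypothetical install location - adjust to wherever you unpacked the tool.
SITESPEED_HOME="$HOME/sitespeed.io"

# Prepend its bin directory to PATH for the current shell session.
export PATH="$SITESPEED_HOME/bin:$PATH"

# Confirm the directory is now on the PATH.
case ":$PATH:" in
  *":$SITESPEED_HOME/bin:"*) echo "PATH updated" ;;
  *) echo "PATH update failed" ;;
esac
```

To make the change permanent, add the export line to your ~/.bashrc.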

1. Install Git Bash

2. Install Java 1.6 or higher (most people will already have this and can skip it)

3. Install the latest version of PhantomJS and then verify it on the command line with phantomjs --version
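A quick sanity check for that step — this sketch assumes nothing about where PhantomJS is installed, it just reports whether the command resolves on your PATH:

```shell
# Print the PhantomJS version if the binary is on the PATH,
# otherwise print a reminder to install it.
if command -v phantomjs >/dev/null 2>&1; then
  phantomjs --version
else
  echo "phantomjs not found - install it and add it to your PATH"
fi
```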


4. Download or clone sitespeed.io from GitHub

If you want to collect Navigation Timing Data you should also install:

5. Install the Internet Explorer Driver and ChromeDriver

If you are a Mac or Linux user, check the documentation.

How to Use


This will run all the reports for the URL you specify, at a crawl depth of 1:

$ ./bin/sitespeed.io -u http://www.example.com

To change the crawl depth to 2:

$ ./bin/sitespeed.io -u http://www.example.com -d 2

To limit the number of pages to crawl:

$ ./bin/sitespeed.io -u http://www.example.com -j 10

To run a report based on a text file of URLs:

$ ./bin/sitespeed.io -f yoururls.txt

To take screenshots:

$ ./bin/sitespeed.io -u http://www.example.com -k true

To exclude URLs that contain a keyword:

$ ./bin/sitespeed.io -u http://www.example.com -s /keyword/

To crawl only URLs that contain a keyword:

$ ./bin/sitespeed.io -u http://www.example.com -q /keyword/
Full List of Options

usage: ./bin/sitespeed.io options

-h Help
-u The start URL of the crawl: http[s]://host[:port][/path/]. Use this or use the -f file option.
-f The path to a plain text file with one URL on each row.
-d The crawl depth, default is 1 (one page and all links pointing to the same domain on that page) [optional]
-q Crawl only URLs that contain this keyword in the path [optional]
-s Skip URLs that contain this keyword in the path [optional]
-p The number of processes that will analyze pages, default is 5 [optional]
-m The memory heap size for the Java applications, default is 1024 Mb [optional]
-n Give your test a name, it will be added to all HTML pages [optional]
-o The output format, always output as HTML and you can also output a CSV file for the detailed site summary page (csv) [optional]
-r The result base directory, default is sitespeed-result [optional]
-x The proxy host & protocol [optional]
-t The proxy type, default is http [optional]
-a The full User Agent string, default is Chrome for MacOSX. You can also set the value as iphone or ipad (will automagically change the viewport) [optional]
-v The view port, the page viewport size WidthxHeight, like 400×300, default is 1280×800 [optional]
-y The compiled YSlow file, default is dependencies/yslow-3.1.5-sitespeed.js [optional]
-l Which ruleset to use, default is the latest version for desktop [optional]
-g The columns shown on the detailed page summary table, see the documentation for more info [optional]
-b The boxes shown on the site summary page, see the documentation for more info [optional]
-j The max number of pages to test [optional]
-k Take screenshots for each page (using the configured view port). Default is false. (true|false) [optional]
-c Choose which browser to use to collect timing data. You can set multiple browsers in a comma separated list (firefox|chrome|ie) [optional]
-z The number of times you should test each URL when fetching timing metrics. Default is three times [optional]
-V Show the version of sitespeed.io

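Putting a few of those flags together: the sketch below just composes and prints a hypothetical invocation (a two-level crawl, capped at 20 pages, with screenshots, a test name and a custom results directory) rather than running it, since that would require sitespeed.io and its dependencies to be installed. The URL, test name and result directory are placeholders:

```shell
# Compose a hypothetical sitespeed.io invocation from individual flags.
URL="http://www.example.com"   # placeholder - use your own start URL
CMD="./bin/sitespeed.io -u $URL -d 2 -j 20 -k true -n my-audit -r audit-results"

# Print the command you would run from the sitespeed.io directory.
echo "$CMD"
```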
Benchmark your site against others

Benchmarks are important for understanding how your efforts compare with other organisations. They let you know whether you are doing better than others and help identify where improvements are needed. Without them, you are left wondering whether your results are actually good, even if they have improved on previous outcomes within your organisation.

It’s all well and good seeing what you need to do versus best practices, but sometimes getting buy-in for further improvements from senior management can be tough. Occasionally it’s easier to show them how well you are performing against a competitor in the same industry.

To run a benchmark report against different websites, simply load a list of sites into a .txt file and run:

$ ./bin/sitespeed.io -i urls.txt

This will return a slightly different report with a benchmark summary. In this instance I set a crawl limit of the top 10 pages for each of the four sites shown below.
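As a sketch, here is how you might build that urls.txt with placeholder competitor sites (swap in the real domains you want to benchmark; the final run is commented out because it needs sitespeed.io installed):

```shell
# Write one site per line into a plain text file.
cat > urls.txt <<'EOF'
http://www.example.com
http://www.example.org
http://www.example.net
EOF

# Sanity check: count how many sites will be benchmarked.
wc -l < urls.txt

# Then run the benchmark report:
# ./bin/sitespeed.io -i urls.txt
```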


From here you can now drill down to see which pages are the best/worst performing on each site.

As you can see it’s a pretty expansive tool and one we regularly use as part of our website audits with our clients.

