Speed: An intro to website optimisation

This post began life as an internal training document for a number of our team at Bronco. We decided it might be of benefit to others, and so have published a slightly edited version here.

The primary aim of the document was to make our SEO team aware of the areas our developers consider when optimising a website for speed. It provides an overview of common issues and solutions in language that can be understood without too much technical knowledge, with the intention that any complex analysis be referred back to a developer.

So here it is…

Why?

Why a website should be fast will be answered differently depending on your position within the company. An SEO may be mostly concerned with the impact on rankings, as site speed and bounce rate factor into Google’s algorithm.

For the web team the concern is that a slow-loading website affects conversions. It’s been reported that Amazon at one time saw a 1% increase in conversions for every 100ms (milliseconds) decrease in loading time.

While average connection speeds increase year on year, this doesn’t mean that designers, developers or clients can ignore site speed. With a vast number of people still on slow connections, especially on mobile, a slow website can be detrimental to the user experience, to conversion and therefore to profit.

If you notice a website running slowly on an office PC, it’s almost guaranteed that mobile users on a 3G connection won’t hang around for it to load.

It’s also important to note that if a new website is slow, this is likely only to get worse: future developments add extra weight, often without old or unneeded elements being removed until a full redesign.

Tools

There are a few tools that we commonly use to assess the speed of a website. These are as follows:

- Google PageSpeed Insights
- Pingdom Tools
- WebPageTest
- Google Chrome Developer Tools

Each of these offers a slightly different insight into the speed of a website.

Google PageSpeed Insights

This tool offers a score out of 100, testing against a number of different speed and user-experience issues. It returns a score for each of these, as well as differentiating between mobile and desktop.

For many clients a 100/100 score is near impossible, and a perfect score does not guarantee a website is fast.

Pingdom Tools

Pingdom also offers a score out of 100, but tests against a different set of criteria, as well as presenting a waterfall chart of the assets loaded within the website. Helpfully, it provides a benchmark of performance against the web as a whole; ideally we want our websites to sit in the faster half.

WebPageTest

Similar to Pingdom Tools but with a more granular breakdown of the speed of the website, providing details such as time to first byte and start render time. It also grades some criteria on an A-F scale and, in many instances, allows you to see a frame-by-frame preview of the website as it loads.

WebPageTest provides the most detailed information regarding site speed but does little in the way of suggesting improvements.

Google Chrome Developer Tools

While offering a range of resources for developers, it also allows you to throttle the connection, loading the website as if from a slower network. The benefit here is understanding how a website renders on a slow connection, and at what point it becomes something a user can interact with.

Interpret, don’t copy

It can be easy to view the information returned by these tools and reproduce the results in a report or in communications directly to a client. Please don’t. While these tools often return valid issues concerning the speed of a website they are automated tools that lack the intelligence needed to deal with shades of grey. Nor can they assess all the possible issues that exist within a website and provide workable solutions.

The results therefore must be interpreted to determine what is feasible within a given website to ensure recommendations are achievable. If you’re unable to interpret the responses these tools output, ask a developer to help.

Benchmarks

It’s impossible to say a website should load in X seconds or be X kilobytes in size. It will always depend on the type of site, its feature set and the server setup.

If you want to understand whether the results suggest the website is too slow or too large, a developer should be able to interpret the data and comment on whether the figures are excessive for the site in question.

In almost all cases a website has the potential to be optimised to load faster.

Common Issues

This section runs through common issues regarding site speed and offers some suggested improvements.

A side note

Somewhere in the world there will be the perfect website run by the perfect web team who have unlimited time and funds to optimise all the different assets found in a website; the BBC News website or Amazon come to mind.

But many web teams, ours included, have only a limited amount of resources (time and money) to dedicate to optimising a website. While we can always make suggestions to clients, there comes a point where the return on investment is too small, or the costs too large, and we must accept there is nothing to be gained from pushing further enhancements.

Let’s start simple…

Requests & Resources

n. the number of individual requests/resources loaded to display a webpage

A webpage with very few resources usually means a bland and uninviting page. One with many resources may be visually interesting but will load more slowly. It’s like Christmas: the more turkey you eat, the fatter you get and the slower you move.

But for a website it’s not only the added kilobytes/calories that slow you down; it’s also the time taken to request and receive that information. Going back to the eating analogy: pour a packet of crisps straight into your mouth and you’ll eat the same amount far quicker than if you eat one crisp at a time, moving your hand from packet to mouth for each one. Every trip is a separate request.

In an ideal world we keep both the number of requests and the size of the resources low. The richer the media (video, animations), the larger the file size and the slower the page loads. So while long-form content pages can be good for SEO, a developer must keep an eye on page load, as this can have an adverse effect on conversion.

While we have a certain amount of control over the internal resources loaded on the page, external resources such as adverts, tracking scripts and live chat can add far more weight than we realise. Worse, we have no ability to optimise these resources. This is why temporary experiment scripts should always be removed once testing is complete.

What can be done

Aside from removing elements from the page and reducing the size of individual assets, it is possible to reduce requests through better coding. It may be that the developers have used images to achieve effects that are possible in CSS, or that images can be combined into sprites (more on this later) to reduce the number of HTTP requests.
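For example, here’s a minimal sketch of a button style that might once have required a background image but can be drawn entirely in CSS (the class name and colours are hypothetical):

```css
/* No image request needed: the gradient, rounded corners and
   shadow are drawn by the browser rather than downloaded. */
.button {
    background: linear-gradient(#e84545, #c02929);
    border-radius: 4px;
    box-shadow: 0 1px 3px rgba(0, 0, 0, 0.3);
    color: #fff;
    padding: 10px 20px;
}
```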

Browser Caching

n. when a browser saves a file locally to negate the need to reload the same file again as a user navigates from page to page

In some cases the server and browser are set up to do this automatically. Sometimes it’s necessary for a developer to add additional code to handle non-specified file types, or to extend the default expiry where certain files are expected to be updated less regularly.
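On an Apache server this is often done with the mod_expires module. As a minimal sketch, assuming the module is enabled, a .htaccess file might extend expiry times like so (the lifetimes are hypothetical examples):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Images rarely change, so let the browser keep them longer
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    # CSS and JavaScript change more often, so expire sooner
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```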

In almost all cases it is impossible to remove every warning about caching, due to external scripts such as Google Analytics being included on the website. Such resources specify a very short expiry, or none at all, so offer little benefit from caching; nor would they work effectively if they were cached for long.

cURL and APIs

While we’re mentioning external assets: with server-side languages it’s possible to gather data from external resources and display it on your website. This might be via an API, or alternatively using cURL, which retrieves the text output of a specified URL.

When done via a server-side script like PHP it’s not immediately obvious that this kind of information is being retrieved, yet calling on external resources can really slow down a website, as it must first gather the data from the external resource. In all but the most extreme cases this data could be collected and cached, either in a text file or a database. The benefit is that the website makes fewer calls to external resources and instead accesses local data, which is quicker 99% of the time.
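As a minimal sketch in PHP, assuming a hypothetical external API and a one-hour cache lifetime, that might look like this:

```php
<?php
// Serve a locally cached copy of an external resource when it's
// fresh enough; otherwise fetch it again with cURL.
$url       = 'https://api.example.com/rates'; // hypothetical endpoint
$cacheFile = __DIR__ . '/cache/rates.json';
$maxAge    = 3600; // re-fetch after one hour

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    $data = file_get_contents($cacheFile); // local data: no external round trip
} else {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5); // don't let a slow API stall the page
    $data = curl_exec($ch);
    curl_close($ch);

    if ($data !== false) {
        file_put_contents($cacheFile, $data); // refresh the cache
    } elseif (file_exists($cacheFile)) {
        $data = file_get_contents($cacheFile); // fall back to a stale copy
    }
}
```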

Images

Images are a common problem on the web and one of the biggest contributors to slowing down a website. There are various ways an image can affect a website…

Images not compressed

n. making the file size smaller with hopefully an imperceptible reduction in quality

No matter what software you use, an image can usually be compressed into a smaller file through more intelligent compression algorithms. And no matter the workflow, Google PageSpeed Insights will say it can compress an image further still.

At Bronco we tend to ignore these warnings, knowing that we have compressed the images as best we can without adding excessive overhead to the development process. The percentage improvements are normally small enough that we’re happy with the file sizes of our images.

To suggest further compression to a client, the percentage improvements need to be high enough to make it worthwhile, or we must prove that the decrease in file size can be achieved with software/processes that are readily available.

Images too large

n. ensuring images exist at, or close to, the size rendered on screen

Often Google PageSpeed Insights will return an issue with images because an image is larger than it appears on screen. An image may exist at 800px wide on the server but be shown at 100px wide on screen. If the image existed at the size it is displayed, it would have a far smaller file size.

However, in responsive design it is not always feasible to have an image saved at the size shown on screen, as the image may need to resize to fill its container on various sized screens, and so should be large enough to fill this container at its largest size.

Responsive images, with <picture> and srcset, allow multiple versions of an image to be referenced so that a smaller file is delivered where possible, but this still does not provide an image for every screen size, and so Google PageSpeed Insights will still return a warning; unless you fudge the website so the right sized image is presented at the exact screen sizes Google tests at.
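As a minimal sketch, with hypothetical file names and breakpoints, responsive image markup looks something like this:

```html
<!-- The browser picks the smallest file that satisfies the layout -->
<img src="product-800.jpg"
     srcset="product-400.jpg 400w,
             product-800.jpg 800w,
             product-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Product photo">
```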

Suggesting a move to <picture> and/or srcset requires a client with a development team that is familiar with emerging web technologies, and that also has in place a dynamic server script for resizing and caching images. As with many new technologies, a lack of understanding can lead to an implementation that is worse than what came before.

Image is in the wrong format

n. using the right file format for the job to ensure the smallest file size and best quality

There is a simple rule of thumb for images…

Never save a photo as a PNG for use on the web unless you must have transparency. The file size for a PNG is just too large with no added benefit in quality.

SVG images are used primarily for flat artwork such as icons and logos, so that these appear crisp on high-resolution screens. If such images are saved as JPG or PNG they often appear blurred.

Image could be a sprite

n. combining similar images into a single file to reduce HTTP requests

Each image requires an individual HTTP request (unless the browser has cached it), and more requests affect the speed of a site. If some images are common throughout a large part of a website, are suitable for inclusion in a sprite and can be displayed using CSS, then creating an image sprite should be recommended.
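As a minimal sketch, with a hypothetical sprite file and pixel offsets, the CSS side looks like this:

```css
/* One HTTP request serves every icon in the sprite */
.icon {
    background: url("sprite.png") no-repeat;
    display: inline-block;
    width: 16px;
    height: 16px;
}
/* Each icon is just a different offset into the same image */
.icon-search { background-position: 0 0; }
.icon-basket { background-position: -16px 0; }
.icon-user   { background-position: -32px 0; }
```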

SVGs can also be created as sprites, and the opportunities to use these are more widespread as they do not require the image to be displayed using CSS.

Minify & Concatenate JavaScript

n. compressing and merging JavaScript files

This is like image compression but for your JavaScript files. Rather than many files downloaded at a large combined size, you get one single file at a reduced size, saving on HTTP requests and loading time.

While this can be done manually, the resulting file contains non-human-readable code, making future editing near impossible. To suggest this as an option to clients requires a development team that has, or is willing to adopt, an automated process that creates the minified and concatenated version of the file whilst retaining the original for future editing.

One such workflow is to use Grunt or Gulp (task runners) to run this process automatically as a developer edits and saves a file.
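As a minimal sketch of that workflow, assuming the gulp-concat and gulp-uglify plugins and hypothetical paths, a Gulp task might look like this:

```javascript
var gulp   = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

// Merge the readable source files into one, minify it, and keep
// the originals in src/ for future editing.
gulp.task('scripts', function () {
    return gulp.src('src/js/*.js')
        .pipe(concat('site.min.js'))
        .pipe(uglify())
        .pipe(gulp.dest('dist/js'));
});

// Re-run the task whenever a developer saves a file
gulp.task('watch', function () {
    gulp.watch('src/js/*.js', ['scripts']);
});
```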

mod_pagespeed

n. an Apache module installed directly on the server

At Bronco we have mod_pagespeed installed on a number of servers, which automates a number of optimisations, including those above for JavaScript files. While this takes much of the work out of making these enhancements, it does not always deliver as small a file size on its own as can be achieved by first minifying a file and then also using the module. The same is true of gzip (another common server compression module).

I would not recommend suggesting mod_pagespeed to a client who does not have an experienced Systems Administrator to help them.

Minify CSS

n. removing comments and spaces in a CSS file to reduce file size

This is similar to the above, but the code remains human readable. However, the lack of formatting makes it incredibly difficult to edit on large websites, and so it should again only be done as part of an automated process.
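To illustrate, here is the same (hypothetical) rule before and after minification:

```css
/* As the developer writes it: */
.button {
    color: #fff;
    background-color: #c00;
}

/* As the browser receives it: */
.button{color:#fff;background-color:#c00}
```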

For leading web teams it’s become common practice to use CSS pre-processors such as Sass or LESS, which can output minified code as part of compiling to browser-readable CSS.

JavaScript Frameworks

n. a structure to make writing JavaScript easier

A vast number of websites include some form of JavaScript framework, the most widely adopted being jQuery. With few exceptions there should never be two JavaScript frameworks in one site; these files are too large and largely perform similar tasks.

The inclusion of two frameworks normally occurs when a CMS requires one and a developer/team is more familiar with another. Either that, or the development team is too lazy or inexperienced to understand what they’re doing and installs frameworks as dependencies of random plugins they’ve discovered via Google.

One example would be seeing jQuery and Prototype in the same site. If two or more frameworks are found in a website, and a number of plugins depend on them, it may not be a simple change to move to a single framework; it may require a more extensive rebuild of some parts of the website.

JavaScript Plugins

n. JavaScript files that allow a developer to achieve a specific effect or add an interactive feature

Numerous JavaScript plugins are available, meaning it’s not always necessary for a developer to author their own effects. Yet many of these plugins are bloated with unused options that exist so the plugin appeals to the widest audience.

Not only may a plugin include options that go unused, but the developer may also choose showy effects that require a lot of code over simpler ones that don’t, picking them for how they look rather than for how they affect the user experience.

By swapping plugins for lighter alternatives or authoring their own code a developer can reduce the amount of code bloat in a website and make the website faster.

When used in conjunction with CMSs, you will often find these plugins loaded on every page “just in case” they are required, when the code is only needed on a single page. Though these files are cached for subsequent visits, this can make the loading time of an entrance page greater than necessary.
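In WordPress, for example, a plugin’s script can be enqueued only on the page that needs it. A minimal sketch, with a hypothetical script and page slug:

```php
<?php
// Only load the gallery slider on the gallery page, not site-wide
function load_gallery_script_where_needed() {
    if (is_page('gallery')) {
        wp_enqueue_script(
            'gallery-slider',
            get_template_directory_uri() . '/js/gallery-slider.js',
            array('jquery'), // depends on jQuery
            '1.0',
            true             // load in the footer
        );
    }
}
add_action('wp_enqueue_scripts', 'load_gallery_script_where_needed');
```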

Render Blocking

n. a resource that stops the browser displaying content on the page until after the resource has been delivered from the server

A browser reads the code of a webpage in a linear fashion, gathering additional assets as it comes across them. CSS and JavaScript files are known as render blocking because the browser will not continue through the rest of the page until these files have been fetched (this improves with HTTP/2).

There are three solutions to this…

Async/defer JavaScript files

Either of these two attributes can be added to a <script> tag that references an external JavaScript file. Both delay the execution of these files so the browser can continue to render the page. However, depending on how the website is built, running JavaScript after the rest of the page can cause errors.
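In markup this is a one-word change per file (the file names here are hypothetical):

```html
<!-- defer: download in parallel, execute in order after parsing -->
<script src="/js/site.min.js" defer></script>
<!-- async: download in parallel, execute as soon as it arrives -->
<script src="/js/analytics.js" async></script>
```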

Add JavaScript files to the footer

This is when you move the <script> tags from their traditional location within the <head> to just above the closing </body> tag. This effectively does the same job as the defer attribute.

Inline critical CSS

If you inline all the CSS in the page rather than loading an external file, it is possible to avoid render blocking. While Google PageSpeed Insights delivers a warning suggesting you do this, we recommend not suggesting it to a client. At Bronco we’ve yet to find a solution we are comfortable with that maintains ease of editing, keeps the benefits of the browser cache and deals with different entrance pages. As such we can’t suggest something that we do not use ourselves.

Content Delivery Networks (CDNs)

n. a system of distributed servers or domains that delivers resources more quickly, either through geographical proximity or by increasing simultaneous server connections

Content Delivery Networks (CDNs) offer a number of benefits. In one use case they provide the ability to host a website in multiple locations worldwide, so that the geographical distance between the server and the user is reduced, delivering a quicker load time.

Another is to provide additional domains from which to load assets, which increases the number of simultaneous HTTP requests that can be made (this use case, as well as some of the solutions above, is negated by HTTP/2).

And finally, a CDN extends the benefits of the browser cache for commonly used frameworks/plugins. jQuery, for example, can be delivered from a Google CDN, meaning that if a visitor has previously visited another website using the same CDN file they will not need to download the file again, speeding up the loading of the second website.
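A common pattern, sketched below, is to reference the Google-hosted copy of jQuery and fall back to a local file (hypothetical path) if the CDN is unreachable:

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
<script>
    // If the CDN failed, window.jQuery won't exist; load our own copy
    window.jQuery || document.write('<script src="/js/jquery-1.12.4.min.js"><\/script>');
</script>
```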

Server Response Time & Database Queries

n. the time taken between a browser requesting a website and the server returning the first byte of data

Pretty much every other optimisation suggested in this document is for nothing if the server response time is too high. Google PageSpeed Insights will often return a warning when a server takes longer than 0.5 seconds to begin returning data. In this case the ideal solution is either to upgrade the hardware or to move servers.

Database Queries

The time to first byte/server response time can also be affected by the complexity of the server-side code. While un-optimised PHP code (other server-side languages are available) can have an effect on loading time, the bigger culprit is usually complex or multiple database queries.

Seen most readily in CMSs such as WordPress and Magento, which often attempt to load in the entire database on each page, it’s also possible for bespoke systems to be created, or to grow over time, so that they run slowly. It might be that complex database queries are added, that multiple queries exist where one would be sufficient, or that the underlying database structure is no longer optimised for the job at hand. If the database, or the queries made to it, are slowing the website down, this only gets worse as traffic to the website increases.
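A classic culprit is running one query per item instead of a single combined query. A minimal sketch, with hypothetical table and column names:

```sql
-- Slow: one query per product to fetch its category (run N times)
--   SELECT name FROM categories WHERE id = ?;

-- Faster: fetch everything in a single query with a JOIN
SELECT p.id, p.title, c.name AS category
FROM products p
JOIN categories c ON c.id = p.category_id
WHERE p.in_stock = 1;
```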

Identifying these issues is difficult using the speed-testing tools mentioned above, as only a large first-byte time indicates a problem, and that can often be solved more easily by hardware improvements.

Optimise Code

n. the process of refactoring code to be as lightweight as required to achieve the same or similar end product

This is simple: the less code used to create a specific effect, the smaller a file becomes and the smaller the site is.

Only a developer will be able to accurately determine what code could be made simpler and whether such improvements would deliver enough of a reduction in code to be worth doing.

However, as a website ages and undergoes changes, it’s easy for CSS and JS files to become bloated with code that is no longer in use, or that is used in a single instance that could be updated to share newer code. While CSS linting is one way of finding out what might be unnecessary, these tools often can’t observe an entire website to judge what can and cannot be removed.

Browser Support

n. the ability to reduce the amount of code by removing support for older browsers, and potentially not adopting emerging technologies

The more browsers you support, the more bloated a website becomes. Unless a website has a significant number of users on old browsers, it should not be necessary to provide additional code for any browser older than IE10.

It’s also common for websites to use JavaScript polyfills or CSS vendor prefixes to make emerging technologies available before they are fully standardised and supported. While mostly unavoidable these days, they add to the weight of a website and should also be monitored.
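As a small illustration, a prefixed rule is effectively written once per browser engine (the class name is hypothetical):

```css
/* The same transform, repeated for older engines: extra bytes
   carried purely for browser support */
.panel {
    -webkit-transform: translateX(50px);
    -ms-transform: translateX(50px);
    transform: translateX(50px);
}
```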

Lazy Loading Images

n. loading images after all other assets

As mentioned, images are a big part of a website and can slow down the loading of the page; if a website is image heavy this can have a big knock-on effect. So, to get the content visible quickly, it’s possible to load images after the rest of the page so that the user can quickly access the written content.

There are a number of techniques, but (at the time of writing) BBC News has adopted an approach where grey placeholder images can be seen before the real image takes its place. Any lazy-loading solution requires JavaScript, so critical imagery (logos, product imagery, calls to action) should never be loaded using this technique.
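As a minimal sketch of one modern approach, using IntersectionObserver and a data-src attribute (a common convention rather than a standard):

```javascript
// Images start life with a placeholder in src and the real
// image in data-src; swap them as each image scrolls into view.
var lazyImages = document.querySelectorAll('img[data-src]');

var observer = new IntersectionObserver(function (entries, obs) {
    entries.forEach(function (entry) {
        if (entry.isIntersecting) {
            var img = entry.target;
            img.src = img.getAttribute('data-src'); // load the real image
            img.removeAttribute('data-src');
            obs.unobserve(img); // each image only needs loading once
        }
    });
});

lazyImages.forEach(function (img) { observer.observe(img); });
```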

In the tools mentioned previously this won’t necessarily deliver a quicker download time or a lighter page weight, but it will give the user a perceived speed improvement, especially on slow connections.

Responsive Web Design

n. the process of delivering a suitable user experience no matter the screen size

Some developers royally screw up websites when making them responsive. As a website moves from desktop, through tablet, to mobile, a single feature such as the navigation can take multiple forms so that it is optimised for each screen size.

When coding this it’s easy to create multiple versions of the nav and show/hide each at the right screen size, but then you end up with extra code in your HTML and CSS. By reusing the same code more intelligently you can reduce the amount of code and improve loading times, as sketched below. There will always be occasions where it’s necessary to duplicate elements to achieve a specific implementation, but this is rare.
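As a minimal sketch, one nav element can be restyled per screen size with a media query rather than duplicated in the markup (the class name and breakpoint are hypothetical):

```css
/* One piece of HTML, two presentations */
.site-nav { display: flex; }

@media (max-width: 600px) {
    .site-nav { flex-direction: column; } /* stack the links on mobile */
}
```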

Conclusion

This is a brief overview of the things that can affect site speed, and it will probably lack detail in some areas or contain glaring omissions (because I’ve forgotten something, for now).

Optimising for speed isn’t as simple as saying X should be done to improve the loading of a website. In almost all cases there are implications to other areas, not least functionality or ease of future maintenance of the website. While a fast site is an important part of web development it does not exist in isolation and cannot be treated as such.

