3 of the Most Common Site Speed Conundrums and How to Fix Them
With Google rolling out their mobile-first index next month, it’s more important than ever to ensure your site loads rapidly, even on 4G and 3G connections. But this is easier said than done: despite powerful free auditing tools such as Google Lighthouse, it can prove difficult to implement even some of the simpler changes without a certain level of development knowledge. Here at Edge45® we implement site speed focused updates for clients regularly. The following are the three most common issues we run into, and how we fix them!
Some screenshots in this article are taken from Google Lighthouse audits. To learn how to run your own Google Lighthouse audits, check out this tutorial.
Server Response Time
By far the most common issue our audits bring up is slow server response time, and it’s usually down to cheap shared web hosting!
If you’re paying under £15 a month for a shared hosting package from any of the most popular web hosting providers in the UK, you’re likely on a server with hundreds, if not thousands, of other sites. This may be hidden behind buzz terms such as ‘cloud hosting’ and unsubstantiated promises such as ‘crazy fast websites!’, but don’t be fooled: the hardware you are on is cheap and the server optimisation will be shoddy!
On the off chance that your web host provides a decent shared hosting service, you’ll never have any guarantees regarding the consistency of a server’s performance. For example, if a news site on your server posts a juicy new scoop about a Love Island contestant, the massive influx of traffic to the article will not only affect the host site’s performance, but your site’s performance too. You’re never more than one step away from disaster when you choose shared hosting.
But say you’re on dedicated hosting or you’re absolutely confident in your shared hosting package and you’re still getting pesky server response time messages from your audit. What could be the problem?
Well, server response time is not only determined by the raw performance of your server hardware; it also depends heavily on how well the back-end code is optimised. For example, your back-end may be making too many database queries, or running an unnecessary loop. Sometimes back-end optimisation issues can be fixed by a simple plugin (if you’re on a popular CMS). For example, Autoptimize for WordPress has been effective for us in the past. In most cases, however, you will have to talk to a back-end developer about your specific situation.
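To give a flavour of the kind of fix a back-end developer might make, the sketch below caches the result of an expensive, repeated lookup so it only runs once per unique input rather than on every request. This is purely illustrative (Python, with a hypothetical function name standing in for a slow database query):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=256)
def related_products(product_id: int) -> list:
    """Hypothetical expensive lookup, standing in for a slow database query."""
    time.sleep(0.05)  # simulate real query time
    return [product_id + 1, product_id + 2, product_id + 3]

start = time.perf_counter()
related_products(42)           # first call pays the full cost
first = time.perf_counter() - start

start = time.perf_counter()
related_products(42)           # repeat call is served from the cache
second = time.perf_counter() - start
```

The second call returns almost instantly because the result is cached, which is exactly the sort of saving that shaves milliseconds off every server response.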
Unoptimised Images
Unoptimised images on your site will completely ruin your performance. Guaranteed. There is so much to the art of image optimisation that Google splits the topic into four separate sections in their Lighthouse performance audit. In the below example, images have not been optimised at all on the site, resulting in an inordinate increase in page load time.
We’ll now go through what each of these sections mean, and how to fix them.
Offscreen images – images that load as soon as you load up a new page but are below the fold (i.e. you need to scroll down to see them) and may be stopping other, more visible, elements from loading.
The best way to fix this is to implement what’s called ‘lazy loading’. All this means is that images are requested from the server only when the user scrolls down to them. Most of the images on the page aren’t loaded as soon as you reach the webpage; they only load once you’ve scrolled down far enough that they are actually needed.
In terms of implementation, if you use a popular CMS you’re in luck. On WordPress for example, the plugin BJ Lazy Load will handle the lazy loading of your images automatically. If you’re on a custom CMS, you‘ll have to get your developer to implement the necessary changes. Google has a great tutorial on this which you can find here.
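Under the hood, lazy-load plugins typically work by swapping each image’s src for a data-src attribute, so the browser skips the initial fetch; a small front-end script then restores the src when the image scrolls into view. A rough sketch of that markup rewrite (Python, illustrative only):

```python
import re

def defer_images(html: str) -> str:
    """Swap src for data-src so the browser skips the initial image fetch.
    A small front-end script would restore src when the image scrolls into view."""
    return re.sub(r'<img\s+src=', '<img data-src=', html)

page = '<p>Intro</p><img src="hero.jpg"><img src="footer.jpg">'
deferred = defer_images(page)
```

Modern browsers also support a native loading="lazy" attribute on img tags, which achieves much the same effect without a plugin.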
Properly size images – images that aren’t properly sized are stored on the server at a higher resolution than they are displayed at on the webpage. This means the browser spends a long time loading your high-res images, only for them never to be shown at their full size.
To properly size images you’ll have to take a look at your website and figure out the maximum resolution each image displays at. You can do this through inspect element. Don’t forget to check all viewport widths: some images may even get larger as you shrink your browser window! Once you’ve figured out the maximum size each image displays at, simply re-upload the necessary images at their new resolution.
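If you have a lot of images to re-size, the job can be scripted rather than done by hand in an image editor. The sketch below uses the Pillow library (a common choice, not one this article prescribes) to cap an image at a maximum display width while keeping its aspect ratio:

```python
from io import BytesIO
from PIL import Image

def resize_to_max_width(data: bytes, max_width: int) -> bytes:
    """Downscale an image so it is no wider than max_width, keeping aspect ratio."""
    img = Image.open(BytesIO(data))
    if img.width > max_width:
        new_height = round(img.height * max_width / img.width)
        img = img.resize((max_width, new_height))
    out = BytesIO()
    img.save(out, format="PNG")
    return out.getvalue()

# Demo with an in-memory 1600x900 image standing in for a real file
buf = BytesIO()
Image.new("RGB", (1600, 900), "white").save(buf, format="PNG")
resized = resize_to_max_width(buf.getvalue(), 800)
```

Run over a whole image directory, a script like this gets every file down to the largest size it actually displays at.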
If you want to take it a step further, you can ask your developer to load different versions of one image dependent on the resolution of the device accessing your website.
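The standard HTML mechanism for this is the srcset attribute, which lets the browser pick the most appropriately sized file for the current viewport. A minimal sketch, with hypothetical file names:

```html
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero image">
```

The browser reads the available widths from srcset and the layout hints from sizes, then downloads only the one file it needs.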
Serve images in next-gen formats – Google has made the decision on everyone’s behalf that JPEG and PNG images are no longer good enough for the web and is pushing the use of next-gen formats. Next-gen formats are simply more modern image formats than the ones we use every day. The main reason Google loves them so much is that their compression is far more effective than traditional formats, meaning file sizes are smaller and they load in browsers quicker. The three main next-gen formats currently in use are JPEG 2000, JPEG XR and WebP.
Implementing next-gen formats is far from simple, however. To convert your current images into WebP, for example, you’ll need to download command-line tools such as Google’s cwebp encoder and run the conversions from the terminal. To make matters worse, no next-gen format works in every browser, meaning you’ll have to serve different image formats conditionally based on the browser the user is using. If this all sounds too difficult, you’d be forgiven for skipping next-gen images, at least until there’s a universally supported format!
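That said, if your images already pass through a script or build step, the conversion itself can be automated. For instance, the Pillow library can write WebP directly, assuming your Pillow build includes WebP support (most standard installs do). A sketch:

```python
from io import BytesIO
from PIL import Image

def to_webp(data: bytes, quality: int = 80) -> bytes:
    """Re-encode an image as WebP at the given quality setting."""
    img = Image.open(BytesIO(data))
    out = BytesIO()
    img.save(out, format="WEBP", quality=quality)
    return out.getvalue()

# Demo: convert an in-memory PNG standing in for a real file
buf = BytesIO()
Image.new("RGB", (640, 480), "blue").save(buf, format="PNG")
webp_bytes = to_webp(buf.getvalue())
```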
If, however, you’re insistent on perfection, we suggest hiring a developer to oversee the conversions and the conditionals.
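As for serving formats conditionally, the usual approach is the HTML picture element: the browser works through the source options in order and falls back to the plain img tag if it supports none of them (file names below are hypothetical):

```html
<picture>
  <source srcset="hero.webp" type="image/webp">
  <source srcset="hero.jp2" type="image/jp2">
  <img src="hero.jpg" alt="Hero image">
</picture>
```

A browser with WebP support downloads only hero.webp; an older browser ignores both source tags and loads the JPEG.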
Optimise images – By ‘optimise images’ Google just means compress images. We’re not really sure why this section isn’t just ‘compress images’.
There are many tools around the web you can use to compress images. Our favourite is imageresize.org, which compresses both PNG and JPEG.
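If you’d rather compress in bulk than upload files to a website one at a time, the same job can be scripted. Below is a sketch using the Pillow library (again, a common choice rather than the tool the article recommends): re-saving a JPEG at a lower quality setting, with Huffman-table optimisation enabled, shrinks the file considerably:

```python
import random
from io import BytesIO
from PIL import Image

def compress_jpeg(data: bytes, quality: int = 60) -> bytes:
    """Re-save a JPEG at a lower quality with optimisation enabled."""
    img = Image.open(BytesIO(data))
    out = BytesIO()
    img.save(out, format="JPEG", quality=quality, optimize=True)
    return out.getvalue()

# Demo: a noisy in-memory image standing in for a real photo
random.seed(0)
img = Image.new("RGB", (320, 240))
img.putdata([(random.randrange(256),) * 3 for _ in range(320 * 240)])
buf = BytesIO()
img.save(buf, format="JPEG", quality=95)
original = buf.getvalue()
compressed = compress_jpeg(original)
```

The quality value is the trade-off dial: lower values mean smaller files but more visible compression artefacts, and somewhere around 60–80 is a common middle ground.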
Render Blocking Resources
A render blocking resource is usually a large script or stylesheet loaded near the top of a webpage. The issue with such scripts or stylesheets is somewhat like the issue of offscreen images: the user is waiting for something to load that they can’t see. What makes render blocking resources worse is the fact they’re loaded so early on a page. Nothing else on the page will be able to load whilst these resources are loading, including above-the-fold content. This goes against the main principle of load-order optimisation: whatever the user can see should be loaded first and foremost.
Load any files that the above fold is dependent on – At the bottom of the head, after the meta information, you should load any scripts or stylesheets the above fold is dependent on. This usually means anything you’ve preloaded at the top of the head. So that preloaded jQuery dependency: load it for real now. The same goes for the CSS for your above-fold elements.
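In markup terms, the pattern described above looks something like this (file names are hypothetical): rel="preload" hints at the top of the head start the downloads early, and the real script and stylesheet tags at the bottom of the head actually apply them:

```html
<head>
  <meta charset="utf-8">
  <!-- Top of head: start fetching critical files early -->
  <link rel="preload" href="/js/jquery.min.js" as="script">
  <link rel="preload" href="/css/above-fold.css" as="style">

  <!-- ...the rest of your meta information... -->

  <!-- Bottom of head: now load the preloaded files for real -->
  <link rel="stylesheet" href="/css/above-fold.css">
  <script src="/js/jquery.min.js"></script>
</head>
```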
Load inline styles/load stylesheets asynchronously – In our opinion it’s best to only load CSS just before the point it’s needed when we’re talking about anything below the fold. Many people defer CSS right to the bottom of the page, which is fine too in the eyes of Google, but to your user it results in a jumpy mess of a page load, something that definitely doesn’t give them confidence in the legitimacy of your site. This means you are left with three options. Either load your CSS inline within a style attribute (you can read more about this here), load your CSS inline using a style tag (you can read more about this here), or you can load a new CSS file from within a script at the bottom of the body. We’ve detailed each of these implementations below. It’s important to remember that at this point, we’re only talking about CSS affecting elements below the fold.
Firstly, let’s talk about implementing CSS in line with a style attribute.
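A minimal example of the style-attribute approach, with hypothetical values, looks like this:

```html
<section style="margin-top: 2rem; color: #333; font-size: 1.1rem;">
  Below-the-fold content styled directly on the element itself.
</section>
```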
Although it is faster on small webpages than traditional external CSS, this practice is generally looked down upon as messy and unscalable. If you implement CSS this way on a larger website, you will soon end up with a lot of duplicate CSS, which will cancel out that slightly faster browser processing time. Honestly, we’d be very surprised to see an effective CSS implementation using only this method.
The second method involves loading your CSS inline using a style tag and is a far neater and more scalable way of accomplishing the above.
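A sketch of the style-tag approach, placed just before the elements it styles (class names hypothetical):

```html
<style>
  .testimonials { margin-top: 2rem; }
  .testimonials blockquote { font-style: italic; color: #555; }
</style>
<section class="testimonials">
  <blockquote>Below-the-fold content styled by the tag above.</blockquote>
</section>
```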
This method allows the use of classic CSS structure, meaning you can distinguish between classes and IDs, removing the need for duplicate CSS. You can use this tag as many times as you want within the body to load CSS (and don’t let any HTML know-it-all tell you otherwise). As we’ve already said, this method makes many improvements in neatness and scalability over the last, but it can still get unmanageable if you’re not using some sort of framework structure. There are also certain best practices in terms of implementation to future-proof your site against things like the scoped attribute, but that’s a whole other article. If you’re not in the 1% of websites built on a modern, dynamic framework, this next option will be the one for you.
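The third option, loading a new CSS file from within a script at the bottom of the body, can be sketched like this (the stylesheet path is hypothetical):

```html
<body>
  <!-- ...all of your page content... -->
  <script>
    // Create and append the stylesheet link once the content above has
    // been parsed, so the CSS download never blocks rendering.
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = '/css/below-fold.css';
    document.body.appendChild(link);
  </script>
</body>
```

Because the script runs after the markup above it, the browser paints the page first and fetches the below-the-fold styles afterwards.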
So, there you have it: three of the most common site speed conundrums and how to fix them. Now, every site is different; it may even be the case that the CMS you use limits or downright blocks the implementation of these methods. But never fear: there are always alternative ways to improve your site speed, no matter what your predicament. If you want full site speed optimisation, or maybe just an audit to point you in the right direction, feel free to contact us and get the ball rolling.