I had been digging into performance issues at work for a few weeks, and I thought it could be interesting to see how the same techniques could apply to my site. I was already quite happy with how it performed (being just a small static site), but I still managed to find a few areas to improve.
Before I go into what I’ve done to make my site faster, I’ll briefly describe what I had before:
I do not use any front-end frameworks, and the only client-side JS I have is Algolia search and Prism.js syntax highlighting.
I do not have any code splitting, so all the CSS is loaded in one file, compressed with CSSO.
I use web fonts: four files, in woff2 format only (I’m ok with everyone else getting the fallback font), and I use font-display with them.
I rarely use images, and when I do, I pass them through ImageOptim (and now Squoosh as well).
And that’s basically it.
What to Improve
I was happy with how everything was, but I still saw a few places where things could be done better: the initial load of the pages, especially how the fonts were loaded, and the subsequent navigation between pages.
First Page Loads
I would consider my site on the lighter side, with the fonts being the heaviest part. During the last rewrite of my site I seriously considered dropping them for the default ones, but after some prototypes and experiments, I found that with the current minimalist design, dropping such a significant part of the whole picture hurt it far too much. And I have really grown used to the font I use right now.
Even though I had subsetted the fonts and used woff2, I could still sometimes see a FOUT, especially on the headers, so looking into how I could reduce it was a priority.
Navigation Cost, Latency and Browser Extensions
Then there was one thing I noticed when going through my site’s pages and measuring their performance in Chrome DevTools.
Even though my internet connection is quite fast, going from page to page still felt slow. After looking at what happens, I noticed that browser extensions can affect the performance of web pages quite significantly. In my main browser profile, for example, they could add up to 1–1.5 extra seconds before the page finished loading!
Browser extensions are initialized for every page you visit, and parsing and evaluating their JS can add a lot to the page’s load time.
Basically, on mobile we have latency and slow JS, and on desktop, even with a decent internet connection, we can have all the extra work done by browser extensions. So I tried to see how I could manage this as well: it is not much fun when your highly optimized static site gets slowed down so significantly by outside factors.
I’m using an “instant” preloading of pages when people hover over their links. This is a fairly common pattern these days, and with localStorage I can make sure those requests won’t repeat and can still be useful later even if the user doesn’t visit the hovered page right away. As with clicks, I’m handling this via a bubbling mouseover event. I do not add any fallback for touch interactions: I don’t want people on mobile connections to download extra data, and the gains of this method are lower there anyway.
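A minimal sketch of this idea, assuming a single delegated mouseover listener; `prefetchPath` and the `page:` cache key are my own placeholder names, not the article’s actual code:

```javascript
// Decide whether a hovered link should be prefetched: same-origin
// only, and never the same path twice.
function prefetchPath(href, currentOrigin, seen) {
  const url = new URL(href, currentOrigin);
  if (url.origin !== currentOrigin || seen.has(url.pathname)) return null;
  seen.add(url.pathname);
  return url.pathname;
}

if (typeof document !== 'undefined') {
  const seen = new Set();
  // One bubbling listener on the document instead of one per link;
  // this also covers links added to the page later.
  document.addEventListener('mouseover', (event) => {
    const link = event.target.closest('a[href]');
    if (!link) return;
    const path = prefetchPath(link.href, location.origin, seen);
    if (!path || localStorage.getItem('page:' + path)) return;
    fetch(path)
      .then((res) => res.text())
      .then((html) => localStorage.setItem('page:' + path, html))
      .catch(() => seen.delete(path)); // allow a retry if the fetch fails
  });
}
```

Delegating one listener to the document keeps the setup trivial and avoids rebinding handlers after the page content is swapped.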
As I’m storing everything in localStorage and do not make extra requests whenever I already have the content I need, I need some way to invalidate the cache. I decided to have a versions.json (per language) with short sha256 sums of each page’s content. My site is not that large, and for the English version this file weighs just 1.8kb (even less gzipped), so I can lazyload this JSON on the initial load, making it easy to compare the current versions with those in the cache.
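The comparison could look something like the sketch below, assuming versions.json maps page paths to their short sha256 sums (the shape of the file and the function names here are my assumptions):

```javascript
// A cached page is fresh only when its stored hash matches the one
// currently listed in versions.json; unknown pages count as stale.
function isCacheFresh(versions, path, cachedVersion) {
  return Boolean(cachedVersion) && versions[path] === cachedVersion;
}

// Hypothetical page loader: serve from localStorage when fresh,
// otherwise fetch and update both the markup and its version.
async function getPage(path) {
  const versions = await fetch('/versions.json').then((r) => r.json());
  const cachedHtml = localStorage.getItem('page:' + path);
  const cachedVersion = localStorage.getItem('version:' + path);
  if (cachedHtml && isCacheFresh(versions, path, cachedVersion)) {
    return cachedHtml; // no network request needed
  }
  const html = await fetch(path).then((r) => r.text());
  localStorage.setItem('page:' + path, html);
  localStorage.setItem('version:' + path, versions[path] || '');
  return html;
}
```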
To make sure this kind of dynamic navigation doesn’t have too big an impact on accessibility, I’m using an aria-live="polite" attribute on my page. In my testing this is enough to get a better result in VoiceOver than without it, but more testing of everything is needed for sure, and I wouldn’t recommend using this method on your sites without thorough testing, as otherwise you may break the experience for those who rely on accessibility tools. If you happen to notice any accessibility problems on my site, I would be happy to hear about them and look into how to fix them, of course.
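As an illustration of what the live-region setup might look like when the content is swapped in (this `renderPage` helper is my own sketch, not the site’s actual code):

```javascript
// Swap in new page content and announce it to assistive technology.
function renderPage(container, html) {
  // aria-live="polite" tells screen readers to announce the new
  // content once they finish the current utterance, without interrupting.
  container.setAttribute('aria-live', 'polite');
  container.innerHTML = html;
  // Move focus to the swapped-in content so keyboard and screen
  // reader users are not left on a link that no longer exists.
  container.setAttribute('tabindex', '-1');
  container.focus();
}
```

Managing focus alongside the live region tends to matter as much as the attribute itself, which is one reason this approach needs real testing with assistive tools before shipping.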
One of the libraries that implements the PJAX approach does this without the need for an extra .json file, which means you can try it with much less setup than my .json files require.
Overall, I’m pretty happy with the result and can feel the difference, even if PageSpeed and similar tests show almost no difference (~100 desktop/~92 mobile, with web analytics being the only part lowering the score right now, oh well).
There are a bunch of projects that implement similar approaches, but I tried to do everything from scratch in order to understand how it all works, and what the potential problems and use cases are for the technologies behind it.
Well, that was a lot of information! I’m happy both with how the initial load of my site works and with how much snappier the subsequent navigation now feels. The JS for the navigation ended up a bit larger than I anticipated: around 220 lines of code, and it is far from optimal either, as there are a lot of edge cases I didn’t cover. But, for now, I’m very happy with what I have.
I have also learned a lot of new things and a lot of nuances of building a navigation system like this. Even if I choose one of the existing open source solutions in the future, I’ll know where to look when deciding which one to pick, or what to adjust for a better experience.
As for the current implementation, if I make another pass over everything, there are a bunch of things I would like to experiment with:
Implementing offline support and more caching via service workers. I feel like what I did already could help me a lot with this, but from my first glances at service workers, they seem a bit too cumbersome for my purpose right now.
Implementing such a thing from scratch is something I highly recommend: whenever you have a task, try to do a straightforward version yourself first. You’ll learn new things and will see whether your prototype is good enough, or whether there are enough edge cases to justify grabbing an existing solution.
However, try to set some limits on how much time you’ll spend on it; the less the better, as you wouldn’t want to throw away a lot of work if you decide to switch. You won’t always have much time for experiments, but even then I still urge you to try them at least sometimes, as they can lead to a better understanding of the field, and sometimes to much quicker solutions for the tasks at hand. All of that can be fun as well, and doing experiments can motivate you to work more efficiently, so try it! And it is totally ok if this approach doesn’t work for you: everyone has their own way of working, and there is no perfect one.
And as a last note: hey, now that my site should be faster than ever, why not give it a go and see what I have written in my blog? There are a lot of articles about CSS and various weird experiments, as well as a lot of other stuff!