With growing support for CSS transitions and keyframe animations it’s become more common to see websites adopt a variety of effects, such as fades or motion, on page load to create a more visually dynamic website.
If you look at the direction of the web currently, you’d think that, as an industry, the decision has been made that the web can no longer evolve whilst still accommodating <noscript> users. The 0.2% of UK <noscript> web users is, it seems, too small a percentage to invest the time and money in providing even the most basic experience to.
I know everyone loves a website that fades in on load but if it means a blank page when JS is disabled is it really worth it?
— Kean Richmond (@keanrichmond) 8 November 2018
For web applications using React et al the conversations required are far more complex. However, for websites that are built more simply, it’s a genuine question whether the benefits of these visual flourishes outweigh the disadvantages, especially when the greatest of these is a completely blank website.
Disclaimer: Without understanding the requirements, limitations and a whole host of other factors, I can’t say whether a website’s or agency’s decision not to provide a fallback for
<noscript> users is justified. Only that this decision, or the lack of one, will critically affect some users.
This problem has been swirling around my head for a while, but I never seemed to nail down a possible solution… until now.
We’re then using a small piece of JavaScript to add a class to our content, so the content is only hidden (and then animated in) when JavaScript is actually running.
So far so good. But throw in a defer attribute and we’re back to getting the flash of text we need to avoid.
Note: Using a throttled connection, and adding a number of images into your test page, is the best way to test if you’re getting this flash.
So our class switcher needs moving, in this instance to the top of our <head>, to ensure it runs first and without delay. An external render-blocking file could be used, but at a few lines of code, inline seems like the way to go.
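As a sketch of what that inline switcher might look like (the class names here are illustrative, not taken from the original demo):

```html
<head>
  <style>
    /* Content is visible by default, so <noscript> users are unaffected.
       It's only hidden once JavaScript has added the .js class. */
    .js .fade-in { opacity: 0; }
  </style>
  <script>
    // Inline and at the top of the <head>, so it runs before any
    // content renders and there's no flash of hidden-then-shown text.
    document.documentElement.classList.add('js');
  </script>
</head>
```

Because the script runs before the browser paints anything, the hiding rule is already in effect by the time the content arrives.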
This works great.
Note: There are two points in the loading sequence at which we can start our animation: when all assets of the website have loaded, or when the DOM tree has loaded (and before external assets, including images, have). Which of these you choose will depend on whether having external assets still loading at the point you start your animation is going to cause problems.
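A sketch of the two options (the event names are standard browser events; the class name is my own illustrative choice):

```js
// Option 1: wait for everything, including images and other assets.
window.addEventListener('load', function () {
  document.documentElement.classList.add('loaded');
});

// Option 2: fire as soon as the DOM tree has been parsed,
// before images and other external assets have finished loading.
document.addEventListener('DOMContentLoaded', function () {
  document.documentElement.classList.add('loaded');
});
```

You would pick one of the two, not both; option 2 starts the animation sooner but risks animating in content whose images haven’t arrived yet.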
I believe we have a solution for <noscript> users. But JavaScript can fail in other ways too.
Initially placing this at the bottom of the website makes sense as a way to collate similar code in one place. But as we’re trying to find a solution that is fault tolerant there’s always a risk that the code could break at any point between our rendered element and the bottom of the file. While these are the kinds of errors usually caught and fixed in development, let’s run with this.
Rather than have our function that adds a .loading class to our content live in an external file, we have created a short function in the
<head> of our file, which we call directly after our element has rendered, all but guaranteeing it runs. We now have a solution that covers
<noscript> users, external JS file failures and server-side errors that may cut our page in half.
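A rough sketch of that pattern (the function name, element id and markup are illustrative; the .loading class is the one the article describes):

```html
<head>
  <script>
    // Defined inline so it exists even if an external JS file
    // fails to load entirely.
    function reveal(id) {
      document.getElementById(id).classList.add('loading');
    }
  </script>
</head>
<body>
  <div id="hero" class="fade-in">…</div>
  <!-- Called immediately after the element, so a server-side error
       further down the page can't prevent it from running. -->
  <script>reveal('hero');</script>
</body>
```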
Remember that I mentioned some browsers will not support CSS transitions or keyframe animations? In our examples we’re using keyframe animations to make our elements appear, and when a browser does not support these we’re left once again with a blank page.
Yes, we’re really playing some hard core hide and seek with this element over its lifetime.
The impact of working this way is that we can no longer use
animation-delay, as this would create a noticeable flash of the content appearing. But with keyframe animations we can repeat the initial hidden state; in this example that’s at 10%, which works out at 0.15 seconds.
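As a sketch, assuming a 1.5 second animation (so that 10% keyframe lands at 0.15 seconds) and illustrative class names:

```css
.js .fade-in {
  opacity: 0;
  animation: fade-in 1.5s ease-out forwards;
}

@keyframes fade-in {
  /* Repeat the hidden state at 10% (0.15s of a 1.5s animation)
     instead of using animation-delay, avoiding the flash of
     content that a delayed animation would cause. */
  0%, 10% { opacity: 0; }
  100% { opacity: 1; }
}
```

A browser that doesn’t support keyframe animations will ignore the @keyframes block, which is exactly why the `opacity: 0` starting state has to be applied with care, as discussed above.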
The code you’ll have seen above is a very simple example, written and tested in no more than a couple of hours, and won’t necessarily work in all use cases. It certainly doesn’t take into account user scrolling, though the Intersection Observer API may offer a suitable solution that integrates with the above code rather easily.
What I’m getting at is this is just an experiment, not yet used in a rich and fully functioning website, and certainly not some ground-breaking solution. So far it certainly seems like a workable and better solution than ignoring progressive enhancement altogether. But its simplicity has me thinking there’s either a gremlin I’ve yet to spot, or I’m observing a problem that isn’t as widespread as it appears to me.
Those who dig deeper into how browsers load and render pages may take issue with this game of hide and seek, or know of instances where the content may still flash into view despite our efforts to avoid it. But for now it seems like a pretty solid solution, and relatively simple to implement at small scales.
If you do have anything to add to the conversation, such as a better or more tested solution, leave a comment below.