It's unbiased toward either the beginning or ending value.
If you want to overstate a change, you'll use the smaller value as the denominator. If you want to understate a change, you'll use the larger. If you use the mean of the two, you're giving a value that's not biased by either of the two extremes.
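A quick sketch of the three denominator choices, with illustrative numbers (100 → 150) that are not from the thread:

```javascript
// Percentage change from 100 to 150, using three different denominators.
function pctChange(delta, denominator) {
  return 100 * delta / denominator;
}

var oldVal = 100;
var newVal = 150;
var delta = newVal - oldVal; // 50

var vsOld  = pctChange(delta, oldVal);                // 50   -- smaller denominator, overstates
var vsNew  = pctChange(delta, newVal);                // ~33.3 -- larger denominator, understates
var vsMean = pctChange(delta, (oldVal + newVal) / 2); // 40   -- mean denominator, unbiased
```

The mean-denominator version is symmetric: it gives the same magnitude whether the value moves from 100 to 150 or from 150 back to 100.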
When talking about a starting value and a new value, I find the percentage change formula on the page you referenced to be the most logical. When considering a change, your frame of reference is usually the original value.
I am not sure what this accomplishes, other than making the stats look better in their monitoring tool. They pretty much seem to admit this:
"Conclusion: the loading, parsing and executing of the async third party scripts are quite often very slow and this messes up our End User performance stats in New Relic."
Delaying the dynamic insertion of scripts until after the onload event doesn't change anything. All the external crap will still be loaded. It might even make the external assets load slower.
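For context, the deferral pattern being critiqued looks roughly like this (a sketch; the function name and the explicit `doc`/`win` parameters are mine, not from the post):

```javascript
// Sketch of the deferral pattern under discussion. The third-party script
// is still downloaded and executed in full; only the timing of its
// insertion changes, which is why it stops counting toward the onload
// mark that a tool like New Relic measures.
function injectScriptAfterLoad(src, doc, win) {
  function inject() {
    var s = doc.createElement('script');
    s.src = src;
    s.async = true;
    doc.head.appendChild(s);
  }
  if (doc.readyState === 'complete') {
    inject(); // load event already fired, insert immediately
  } else {
    win.addEventListener('load', inject);
  }
}
```

In a page you'd call it as `injectScriptAfterLoad('https://example.com/widget.js', document, window)` — the URL here is a placeholder.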
Another solution would be to choose a monitoring tool that distinguishes between DOMContentLoaded (or the equivalent in non-Gecko browsers) and the regular onload event.
It helps the end user experience if you want to have some javascript run after the DOM is loaded but not wait until all the 3rd party scripts have loaded and run. Maybe the javascript runs some sort of animation.
Without this code, the animation would wait until Facebook, Google+ and Twitter are all loaded, but with this code, the animation would start right away.
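A minimal sketch of that idea — `startAnimation` is a hypothetical stand-in for whatever should run early, and the non-browser fallback is only there so the helper is self-contained:

```javascript
// Run a callback at DOMContentLoaded (DOM parsed) instead of window.onload
// (everything, including third-party scripts, finished loading).
function runWhenDomReady(callback) {
  if (typeof document === 'undefined') {
    callback(); // not in a browser; run immediately
  } else if (document.readyState !== 'loading') {
    callback(); // DOM is already parsed
  } else {
    document.addEventListener('DOMContentLoaded', callback);
  }
}

runWhenDomReady(function startAnimation() {
  // kicks off before the Facebook, Google+ and Twitter widgets have loaded
});
```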
If a script is supposed to run when the DOM is ready but external scripts/stylesheets/etc are still being loaded, it should not attach itself to the load event. The load event is supposed to run when the page has fully loaded. By the way, IE9 now supports DOMContentLoaded and there are hacks to simulate it in earlier IE versions.
I'm a little perplexed by the reactions to this research. Speeding up the execution of javascript tied to the onload event might not be critical for every site but understanding how these pieces fit together is always helpful.
> The question is: does this 1.14 seconds better reflect the user experience? In our opinion, yes. At that 1.14 second mark, all content has finished loading and is displayed on screen. The only stuff that comes after that are the aforementioned social sharing buttons and the Twitter Tweet Box (and on a blog post page, the Disqus comments would also come in late).
> There was a problem loading Disqus. For more information, please visit status.disqus.com.
I wasn't able to comment because I'm on a slow connection, and the rest of the content hadn't finished loading by the time I had finished reading the article.
Being WPO consultants we of course made a big effort to make CDN Planet fast. Our goal: the average page load time must be <2 seconds for 95% of all page views.
Isn't that a bit high? cdnplanet is actually impressively nippy for me; I'd just expect them to aim for 1.5 sec.
1.5 seconds for 95% of traffic is very ambitious, considering the users are geographically dispersed. Say someone from across the globe (RTT = 300ms) visits the site: with one round trip for the TCP handshake and one for the first request/response, it would take them a minimum of 600ms to even start rendering the page...
Seems nothing is good enough anymore! I give up. But seriously, all this does is serve to make you feel better when looking at metrics and barely puts a dent in real user experience. You're still blocking, you've still got the same load times. The only difference is that you're delaying the inevitable, and doing so doesn't always help user experience a bit. Waiting is waiting whether you wait now or later.
The difference is that earlier, if there was a load spike in monitoring, we didn't know if it was because of 3rd party scripts or some problem with our server/CDN. The former we can live with, but the latter is very bad.
Now after deferring the 3rd party scripts, we know for sure what the user experience is with regards to the main content of the site.
Ok. Fair enough. I never thought of it that way. I stand corrected. I'm tempted to argue that some third party scripts may be necessary for presenting content, and delaying them can mess up the user experience. But we all know not to do that, so that point is moot; still, there are exceptions, however few, to every rule.
All that aside, though, I'd say I just changed my mind to agree with you.
...this technique begins downloads immediately, but still avoids blocking window.onload (just ran a brief test myself to verify this behavior). Unfortunately, it requires the third-party provider to adopt LightningJS. Here's to hoping that more providers take notice :)
> "The only difference is that you're delaying the inevitable and doing so doesn't always help user experience a bit."
"Always" seems like an unnecessarily high bar. If any meaningful fraction of your users would be better served by getting readable content in 2 seconds and having to wait 10-90 seconds for third-party features, I would say this is definitely a worthwhile optimization. Even if it leaves some speed-reading twitterer waiting the same 10-90 additional seconds for those features to load and render.
This is a pet peeve of mine. Percentage changes should be measured using the original value as the denominator, not the new value.