I feel like the 1MB limit is excessively generous, especially for text-only pages. But maybe that's what makes it so damning when pages fail to adhere to it. I know at least one website I maintain fails it spectacularly (though in my defense it's entirely because of that website being chock-full of photos, and full-res ones at that; pages without those are well under that 1MB mark), while other sites I've built consist entirely of pages within a fraction of that limit.
It'd be interesting to impose a stricter limitation to the 1MB Club: one where all pages on a given site are within that limit. This would disqualify Craigslist, for example (the listing search pages blow that limit out of the water, and the listings themselves sometimes do, too).
I also wonder how many sites 1mb.club would have to show on one page before it, too, ends up disqualifying itself. Might be worthwhile to start thinking about site categories sooner rather than later if everyone and their mother starts spamming that GitHub issues page with sites (like I'm doing right now).
Gives you an emulator, an 8-bit AVR Assembler, and an IDE for just over 500k transferred. Almost all of it JavaScript.
Using math.js is by far the heaviest part, but at least your assembler already knows things like solarMass and planckConstant :-). CodeMirror comes in second heaviest, but for user-experience-per-byte you're always going to be streets ahead of a GIF.
At the risk of over-complicating things, perhaps there could be limits per resource type. 10MB of images might be reasonable (e.g. for a photojournal), but only 128KB of JS, and 128KB for everything else. Something along those lines.
Yeah, I was surprised they included pictures in the limit at all -- I mean, sometimes you need those pictures, and having them load slower matters less as long as you don't need them to navigate the page.
If you're able to calculate the space in the document flow for those images, I'm fine with the lazy loading. I hate it when the page text has been rendered long enough that I start reading, but then lazily loaded items rearrange the flow and I lose my place.
I imagine it should be pretty easy with JavaScript to set up dummy images and replace them... probably also doable in pure CSS: just make them a block element or something, with a set size.
You don't have to imagine anything. Normally the lazy-loaded image is seeded with a transparent SVG that has the same dimensions. It's a solved problem.
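For illustration, a minimal sketch of that placeholder trick (the file name, dimensions, and alt text are made up, and the script that swaps data-src into src when the image scrolls into view is assumed, not shown):

    <!-- transparent SVG placeholder with the same 800x600 dimensions as the real photo,
         so the browser reserves the space and the text never reflows -->
    <img src="data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='800' height='600'%3E%3C/svg%3E"
         data-src="photo.jpg" width="800" height="600" alt="Photo">

    <!-- alternatively, plain width/height attributes plus native lazy loading
         already reserve the space in current browsers -->
    <img src="photo.jpg" width="800" height="600" loading="lazy" alt="Photo">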
It's over one hundred thousand lines of JavaScript code, minified and gzipped into a 300KB bundle, which should fully load in about 300ms on a decent computer.
It was about the same for me, and I used a simple stopwatch: stopped the watch as soon as I saw anything on the page. I'm on a fairly fast network, too.
Even the page where I used images came in at juuuust about the limit -- to the point where you'd have to make a ruling on whether the gzipped size counts: over the wire it was technically <1MB, but just above after decompressing. The site does specifically say "downloaded", haha.
Personally, I stuck to the 40K best-practice recommendation for a basic web page as a target, which was in place when I started; I've since raised that to 140K, including webfonts. Notably, this is without images.
(This is a flexible target, depending on the complexity of a page. E.g. for a "bloated" page, a fully styled video display for competition winners showing 200+ entries and 280+ individual videos in categorized views is about 250K, including a few images, two and a half font families and the Vimeo Player SDK, but excluding the load of any external video streams. However, with compression we still manage the 140K mark.)
Then reality hits: the client insists on a full-width photographic hero image as if it's still 2014. The usual controversies about a full-size intro video (autoplay, of course), the highly intrusive chat asset we must have installed, etc., etc. And we easily blow the 1MB limit.
Something about all pages combined being under the limit, instead of every page individually being under the limit, changes the exact meaning to something I cannot agree with. That was the meaning I replied to below; I wrote this after realizing you might have meant "every" when you wrote "all".
Let's say you write a daily blog. A single A4 page of text contains on average 3,000 characters; your posts average slightly above that, at 4,000 characters.
How long until the text content alone is above the 1MB limit?
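(Back-of-the-envelope, assuming roughly one byte per character: 1,000,000 / 4,000 ≈ 250 posts, so a daily blog crosses it in well under a year.)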
I doubt you can find a single text blog on a specific topic that wouldn't be improved by limiting its total text to 1MB.
Being more verbose is generally just poor writing. Now, using a separate website per topic seems like a silly limitation, but the more topics being discussed, the less relevant the discussion becomes.
I think an onload limit is much more useful than a file-size limit.
A 700kB JavaScript page can take up to 10 sec. to render on older mobile devices.
And a 500kB image can contain many megapixels, which will slow down non-GPU browsers.
Personally I always go for a max 2 sec. limit on all devices.
I think it should be relative to content. As in versus actual text on the screen. This metric can also be applied "per-page" and "per-site", with less ambiguity for SPAs; every new load brings in more bytes, but also more text, thereby contributing to the ratio.
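(To illustrate the ratio: a 1MB page that renders 5,000 characters of visible text works out to ~200 transferred bytes per character of content, while a 50KB page with the same text is 10 bytes per character.)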
Even better: weigh fixed site-wide assets (i.e. CSS, JS) against the features actually used, so that loading an entire framework only to use a small % of its features is penalised.