Am I the only one who (still) does not feel comfortable seeing JavaScript intertwined with the so-called "vanilla" web as what amounts to a hard dependency, rather than the progressive enhancement we were taught should be the approach for serious public websites?
The page https://plainvanillaweb.com/pages/sites.html uses custom components for all code examples and embedded content. Without JavaScript, it merely shows "enable scripting to view ../XYZ.html" in place of all code examples and demos. Better than having no fallback at all, I suppose, yet still not "navigable".
The fact that it does not even bother to build these custom components on any native element with a similar purpose—like, say, a link that users could follow to see the text document (*), or a plain old iframe (**)—is grim.
Web components are indeed useful for prototyping and personal testing, but have they really crossed the threshold where it is safe to serve them in the wild without potentially harming some users?
(*) I know, view-source: links and embeds are sadly blocked by browsers nowadays. Makes very little sense to me. Someone likely managed to exploit it for some nasty purposes, so now we are "protected", I suppose.
(**) See? In the olden days even iframes were supposed to carry a link fallback inside, for user agents that did not support them.
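To illustrate what such a fallback could look like, here's a rough sketch (the tag name and the example file are hypothetical, not taken from the site): a code-viewer component built on top of a plain link, so the link still works when the script never runs.

```html
<!-- Hypothetical markup: the light DOM already contains a working link,
     so users without JavaScript can still reach the source file. -->
<code-viewer>
  <a href="examples/counter.html">View the source of counter.html</a>
</code-viewer>

<script>
  // Sketch of a custom element that enhances the link when JS is available,
  // fetching the file and rendering its source inline; otherwise the link remains.
  customElements.define('code-viewer', class extends HTMLElement {
    async connectedCallback() {
      const link = this.querySelector('a');
      if (!link) return;
      try {
        const response = await fetch(link.href);
        const source = await response.text();
        const pre = document.createElement('pre');
        pre.textContent = source;   // plain text, no HTML injection
        link.replaceWith(pre);      // enhance: show the code inline
      } catch {
        // On any failure, keep the original link as the fallback.
      }
    }
  });
</script>
```

Without JavaScript the anchor stays in place and remains navigable; with it, the component swaps the link for the inline source.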
It depends on the context. What you say makes a lot of sense for web sites, but expecting a web app to use JS strictly as a "progressive enhancement" is IMO unreasonable.
Now, this particular website is indeed a proper website, and so it shouldn't need JS to do its thing. But it's also a website that advocates for a certain way of developing web apps (even if they don't use such terminology themselves), and as such, is essentially a demo for the same.
This is quite a fair point, yet I'd argue that most so-called web apps could (and should) still use basic old-school HTML forms as the underlying technology, and progressively enhance from that baseline up to the "app-y" look and feel we know and ~~hate~~ __love__.
Obviously, there are limits beyond which an application built with bog-standard HTML forms becomes too cumbersome or makes no practical sense, as you say, but I think that threshold is far higher than what the current web-app landscape suggests. I think it sits somewhere around video editing software, or perhaps real-time multi-user collaborative spaces. For the rest, following the three old steps
1. Make it work — just HTML.
2. Make it nice — add some CSS.
3. Make UX slick — add JS.
still makes sense to me. In the context of plainvanillaweb.com, that would mean just moving the content from non-semantic attributes of the custom components into their semantic initial content, such as adapt
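To make those three steps concrete, here's a minimal sketch (the form, the /subscribe endpoint, and the element id are all made up for illustration): a form that posts normally without JavaScript and is upgraded to an in-page submission when scripting is available.

```html
<!-- Step 1: plain HTML form; works with no CSS and no JS. -->
<form id="signup" method="post" action="/subscribe">
  <label>Email <input type="email" name="email" required></label>
  <button>Subscribe</button>
</form>

<!-- Step 2: some CSS to make it nice (omitted here). -->

<script>
  // Step 3: optional JS makes the UX slicker by submitting in place;
  // if this script never runs, the form still posts the old-fashioned way.
  document.getElementById('signup').addEventListener('submit', async (event) => {
    event.preventDefault();
    const form = event.target;
    const response = await fetch(form.action, {
      method: form.method,
      body: new FormData(form),
    });
    form.replaceWith(document.createTextNode(
      response.ok ? 'Thanks, you are subscribed.' : 'Something went wrong, please retry.'
    ));
  });
</script>
```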
You're not the only one. I block most subresources by default and was disappointed to see empty figures peppered throughout their articles. I'm learning to not automatically equate advocacy of "vanilla" with advocacy of robustness.