Why I'm turning JavaScript off by default (tommorris.org)
45 points by Isofarro on Dec 10, 2013 | 72 comments


The first thing that would fail is the ability to login to his own site, which uses Mozilla Persona and requires JavaScript.

To be fair, I'm very much in favour of reducing the amount that the browser does and to send less stuff to the browser.

I'm in favour of Tom's whitelisting approach and I used to believe this was the best way to consume the web. But that was back when most sites still had an idea of gracefully degrading and continuing to function.

Too much stuff breaks now, and I find that a blanket policy no longer works. Instead I use a combination of Disconnect, hosts files (http://someonewhocares.org/hosts/zero/), and private/incognito browsing to block per-site, to stop bad stuff getting through, and to prevent sites abusing my use of their site (aggressive tracking and marketing).

But... by and large I now leave core JavaScript enabled, as too much breaks.

Some things will break anyway (Omniture code that captures an onclick and breaks the anchor by default unless their JS is enabled). I feel that my setup is a pretty good median point, though, between blocking junk and keeping the web usable.


> The first thing that would fail is the ability to login to his own site, which uses Mozilla Persona and requires JavaScript.

That's why I'd whitelist my own site. And the only person who that would fail for is me because I'm the only person who can post to my own site. :)


Yeah, sure... what about the Twitter button on articles, or the jQuery library included in your pages?


The jQuery is so I can progressively enhance the post form. Which you can't see unless you are me.

And the Twitter button doesn't use JavaScript. Very much intentionally.

Also, if you think I'm being hypocritical by using JavaScript a tiny amount on my site while also decrying its overuse, then you've failed at reading comprehension.


So, you admit that you are loading a huge JavaScript library even if your visitors don't need it.

And you tell others that their browser loads lots of unneeded/unwanted scripts? Are you serious?


No, I'm both a lazy and fallible human being who hasn't gotten around to fixing it. I'm not exempting myself from my own critique.


Out of curiosity, how is Persona for this use-case?


Works great. I chose it because (a) I couldn't be arsed to write a username/password setup and (b) using things like Facebook/Twitter requires an API key and I can't be bothered.

Optimizing for laziness, Persona sort of won out.


That's exactly what I was hoping to hear. I often build one-off web applications (server-side rendered, not JS-filled stuff unless it's needed, heh) and I hate handling the user/pass setup and flow. Cool to know that Persona will work well for that sort of thing, cheers!


(Many/most) Publishers want to be able to monetize and create complicated sales funnels and all that sort of stuff.

(Many/most) Web users just want to "read a goddam text document".

This tension is unlikely to be resolved.

Of course, some useful things on the web truly are 'apps'. Text isn't, and probably never will be.


Adblock solved a lot of previous "tensions". ;)


Agreed!


This tension is unlikely to be resolved.

For users, there are technical measures that are/could be implemented in the browser. Whitelists are inconvenient, but their inconvenience is to be weighed against the scripting inconvenience. Also sandboxing.

With an easy and friendly interface, a growing number of people would turn to disabling JS for all but their favourite sites. Attracting users (and thus making them whitelist you) could be an incentive to develop accessible, welcoming front pages.

That would push commercial sites to be more like apps and would disincentivize personal or non-profit pages from being heavy.

So I believe there are ways to resolve the tension; it depends on how inconvenient and unsafe web scripting becomes.


Cross domain requests are the real problem. Usually, allowing "example.com" to run javascript fetched from "example.com" is not a problem. It's when "example.com" fetches javascript from 10 other domains that the web starts to behave horribly.

RequestPolicy for Firefox blocks all cross-domain requests, even fetching images and CSS from different domains. It has a blacklist/whitelist system very similar to NoScript's, so you have to allow, for example, twitter.com to fetch resources from twimg.com etc., but you only have to allow it once.

[edit] I use both NoScript and RequestPolicy, but I consider RequestPolicy to be a much more powerful and useful addon.


Thank you for mentioning RequestPolicy; I didn't know about the extension. Blocking/allowing access "from x to y" seems very useful.


Everything about this article reads like it was written in 2011. "elements that aren’t div exist primarily as a nostalgic throwback to a gentler era." Um, have you looked at the HTML5 spec lately? Or used Bootstrap? Or any other significant CSS framework developed in the last 2 years? It's div/span class adornments for structure that's on its way out.

And this, incidentally, is why NoScript is a terrible idea: the web is evolving. As new standards are proposed and targeted, JavaScript polyfills to support new baseline HTML/CSS features are a necessity for sustainable future-and-past-proof development in the modern Web ecosystem. Turning off JavaScript everywhere because you don't like experimentation and want to pretend the world froze in your golden years is narrow-minded and bullheaded.


  JavaScript polyfills to support new baseline HTML/CSS
  features are a necessity for sustainable future-and-past-proof
  development in the modern Web ecosystem.
They're very important, but your site should still be usable without them.

  Turning off JavaScript everywhere because you don't like
  experimentation and want to pretend the world froze in your
  golden years is narrow-minded and bullheaded.
More like the other way round: abandoning best practices so that you can play with a shiny new framework is what is bullheaded.

If you want to play, do it on a side project or in your spare time. Don't do it in production code unless you can truly justify it.

Every public facing site that I write works reliably in IE8 without JS. From that base line the experience only gets better until it's every bit as good as the site that abandons IE or no-script agents.

Now, guess whose site is search engine friendly from the outset; guess whose is more user friendly because it more closely aligns to expected browser behaviour; guess whose has a lower load time because the server can deliver a pre-rendered HTML page to the browser; guess which one doesn't just die because ad-block was installed?
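A toy sketch of that baseline (the URL and IDs are made up): the form posts and the server renders the result, so it works with JS off; the script, if it runs at all, only upgrades the submit.

```html
<!-- Works in IE8 with JS off: a plain form POST the server handles. -->
<form id="comment-form" action="/comments" method="post">
  <textarea name="body"></textarea>
  <button type="submit">Post</button>
</form>
<script>
  // Enhancement only: if JS is available, submit via XHR instead of a
  // full page reload. If this script never runs, nothing is lost.
  var form = document.getElementById('comment-form');
  if (form && window.XMLHttpRequest && window.FormData && form.addEventListener) {
    form.addEventListener('submit', function (e) {
      e.preventDefault();
      var xhr = new XMLHttpRequest();
      xhr.open('POST', form.action);
      xhr.send(new FormData(form));
    }, false);
  }
</script>
```

The feature checks (`window.XMLHttpRequest`, `window.FormData`, `addEventListener`) mean older browsers simply fall through to the normal form POST.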


> JavaScript polyfills to support new baseline HTML/CSS features are a necessity for sustainable future-and-past-proof development in the modern Web ecosystem

I'd wager that most people who disable Javascript are using a modern browser with a good blocklist/whitelist system, e.g. Firefox with NoScript.

For instance, if your website needs Flexbox, your CSS will work even on Firefox 25 with NoScript. Flexbox won't work on IE8 with JS off, but that's honestly OK. Progressive enhancement will still allow the visitor to read the content and interact with the page, the only issue is that the elements won't be positioned as intended by the designer. If you're willing to invest more time to support even IE8 with JS off, you could use Sass to more or less replicate Flexbox with floats etc.
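A minimal sketch of that fallback approach (class names are made up): old browsers drop the `display: flex` declaration they don't understand, while flex-capable browsers ignore `float` on flex items, so both rule sets can coexist in one stylesheet.

```css
/* Fallback first: floats work everywhere, including IE8. */
.nav { overflow: hidden; }   /* contain the floated children */
.nav .item { float: left; }

/* Browsers that understand flexbox use it instead; per the spec,
   float has no effect on flex items, so the fallback is inert. */
.nav { display: flex; }
```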

Usually, until a CSS/HTML feature is supported by at least two major browsers, it's not a good idea to use it in production.


But there IS a distinction between web site and web app. Blogs, news, forums and the like should not be web apps; they shouldn't use client-side rendering and fancy widgets. The Spotify web player, however, IS a web app, and it would suck tremendously if it wasn't built as such. Can we stop with the black-and-white reasoning already? Yes, many things that claim to be web apps today shouldn't be, but that doesn't mean there aren't legitimate uses for them.


The Spotify desktop app is awesome. There is no real need for a web app.


Elements of the Spotify desktop app (though not all of it) are built using web technologies. They use the Chromium Embedded Framework: http://en.wikipedia.org/wiki/Chromium_Embedded_Framework#App...


Except that web apps offer a lot of conveniences for both end users and developers. You never need to update your software on an end user's machine (or write complicated auto-update code), and you can be guaranteed that it will work on any random computer you come across without needing to install anything.


Well, I can only install certain apps on my work computer, and Spotify is not on the CON, but I can use the web app (and it's fine here for listening to music while working). I'd argue that there are valid use cases for a web app.


In my experience, the Spotify desktop app has spotty[1]-at-best linux support, but the webapp seems to do fine.

[1] HAH!


I am much of the same mind as the author, and for a very long time my NoScript was configured to disallow all JS by default.

Sadly things have reached the point where this is just not realistic anymore. Too many websites just fail to work altogether, and I was spending way too much time whitelisting sites. And honestly, who has the time to check up on what each and every one of those is doing with their crappy JavaScript?

Reluctantly I've been forced to enable "Temporarily allow top-level sites by default". This way most sites work somewhat, except when a bunch of their JavaScript is delivered from their CDN - then it all breaks again. At that point I'm usually happy to just bounce from the site, but it is annoying.

In combination with a cookie-eater plugin, I have temporarily found a balance between keeping the web usable (FSV) and eating my own time micro-managing stuff like cookies, permissions and JS, but it's an uneasy truce.


> Sadly things have reached the point where this is just not realistic anymore. Too many websites just fail to work altogether, and I was spending way too much time whitelisting sites.

I have a very simple solution to this: website breaks and has no legitimate reason for using scripts? Goodbye, website. I don't do web development, so the worst-case scenario is the website breaks and I get my ass back to work sooner.

I know one person is too little to make a difference, but I don't care. I'll stick to it for as long as I can, and continue to deprive the owners of such badly designed websites of any revenue from my side.

It's the same with any media, really. I sift through a book and find it's badly printed and barely readable? I don't buy it. For instance, it's stuff like this that gets on my nerves:

> Thanks to frameworks like Angular and Backbone, you can build applications that contain no data in the HTML document at all. Hypertext without any actual hypertext.

If there is no hypertext in your application, why the fuck are you using a hypertext protocol in the first place? Just as I wouldn't buy an electric car that required me to purchase gasoline ("you know, it's a car, so you have to buy gasoline with it", even though the motor doesn't run on gasoline!), I simply don't use an application that runs on top of HTTP but has no, um, HT.

Edit: I do think there's a lot of strength in the crowd here. 90% of the websites that break with disabled JS break because of poor/lazy programmers or because of script abuse trying to siphon as much data as possible. If users demanded a minimum level of quality from the WWW and refused anything below it, designers and website owners would eventually have to cope.

Unfortunately, the Internet literacy of most Internet users is basically nonexistent, so there is little to build upon when asking for quality standards.


  I have a very simple solution to this: website breaks and has no legitimate reason for using scripts? Good bye website.
This works quite well for me as well. I found this was the case back when I was using Linux before Firefox was popular, and it's still the case when I use my Windows phone now.

I think there is still hope.

The proliferation of mobile devices now means that cross-browser testing is nigh-on impossible, so this makes progressive enhancement more important now than ever.

The improvements to CSS have made it possible to build a lot of cool functionality straight into the website.


This is why I'm keen on some kind of web of trust for this sort of stuff.


Well, white-listing probably sorts out a big bunch of problematic sites.

My problem isn't pages like Facebook or Google, which I use every day (even if Facebook is kind of overloaded), but rather the pages I don't know :\


OT: Do you have an email address? My work net filtering is blocking this blog for a bizarre but somewhat troubling reason.


Yep. tom [at] tommorris [dot] org

I've been filtered in the past because of LGBT related content on my site. A friend told me their school filtered it for the stated reason of "advocacy of immoral lifestyles". Which is hilarious.


Isn't it very ironic to read, buried within comments about the perils of JavaScript, about some more serious threat to the web...?


Why I'm turning "run attachment in emails" off by default.

NoScript is simply the sane alternative: treat JS as untrusted software trying to run on your device. It's a bit inconvenient, but that's the trade you have to make if you want to retain control. The other alternative is that web browser developers work towards a more feature-rich stylesheet language, say CSS, so JS's Turing-completeness is used only when it's needed.


Angular and Backbone applications aside, doesn’t this just boil down to bad development and lack of understanding?

I’m a firm believer that the web should be usable with JavaScript disabled, JavaScript is there to add a layer of interactivity to your web page, not core function. All those people that keep coming up to me quoting ‘but everyone has JavaScript enabled’ miss the point - the web needs to be accessible and JavaScript isn't the answer.


>I’m a firm believer that the web should be usable with JavaScript disabled, JavaScript is there to add a layer of interactivity to your web page, not core function. All those people that keep coming up to me quoting ‘but everyone has JavaScript enabled’ miss the point - the web needs to be accessible and JavaScript isn't the answer.

Accessible to who? Because most real life screen readers and such perform quite well in actual tests with JS pages.

>JavaScript is there to add a layer of interactivity to your web page, not core function.

Again, says who? Why should I have to write my web code with twice as much effort, once with all the nice things I can do with JS and another for the 1% of users without Javascript?


I'm almost in the same camp as the author. I don't particularly enjoy a lot of the fancy front-end gimmickry that goes on in websites, but at the same time, I do appreciate some of the UX improvements it affords us. Client-side validation makes my life easier, UI elements like sliders and trees are very useful, and so on. What I would like is a middle ground - a NoScript implementation that could switch off JS based on particular client-side events: turn off everything that listens to onScroll and onHashChange, but leave things like input element focus/blur and form submit events on, for example.
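A crude sketch of what such per-event blocking could look like as a user script - purely illustrative monkey-patching, not an actual NoScript feature:

```javascript
// Swallow listeners for "annoying" event types before the page can
// register them; leave form/input events alone.
const blockedEvents = new Set(['scroll', 'hashchange', 'wheel']);

const originalAdd = EventTarget.prototype.addEventListener;
EventTarget.prototype.addEventListener = function (type, listener, options) {
  if (blockedEvents.has(type)) return; // silently drop the listener
  return originalAdd.call(this, type, listener, options);
};

// Demo (EventTarget and Event are also globals in modern Node):
const target = new EventTarget();
let sawScroll = false;
let sawSubmit = false;
target.addEventListener('scroll', () => { sawScroll = true; });
target.addEventListener('submit', () => { sawSubmit = true; });
target.dispatchEvent(new Event('scroll'));
target.dispatchEvent(new Event('submit'));
console.log(sawScroll, sawSubmit); // false true
```

In a browser this would have to run before any page script, e.g. as a content script injected at document-start.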

But, as a relatively experienced web developer, I realise that would break pretty much every website ever. Shame really.


I was filling in a form yesterday and the client side validation blocked tab completion. Yay.


In case anyone feels like complaining about Persona depending on JavaScript: yes, it is true that it depends on JavaScript at present, but that JavaScript is only a crutch—a polyfill for something which is intended to undergo standardisation and formalisation as core browser functionality. Assuming that happens, Persona-based sites will thus in the end not depend on JavaScript.


I don't think his point was that client-side scripting is bad, just that there is way too much of it and the way some people use it change fundamental functionality.


"that JavaScript is only a crutch—a polyfill for something which is intended to undergo standardisation and formalisation as core browser functionality."

That's excellent. However, if the poster requires behaviour that's built into the browser, he may as well require behaviour that's supplied by the site.


From the article: "The only person who is significantly injured by turning off JavaScript on my site is me, because it’s needed for the login system and posting UI".

To me this is sadder than the tenor of the article, as JS should not be needed to do this at all; I do not need that on my website.

I also disagree with making everything uniform in usage, as it brings boredom to the field; for me the most exciting times in a field are before the uniformity has kicked in. Of course, my opinion is personal.

I only use NoScript to save my ass from scams and nasty people trying to encrypt your stuff and extort you for it, and it does its job well for that.

On my website I mainly use scripting to provide luxury shortcuts for people who are logged in; it works without all that for the outside world.


There may be stuff I add to my site that needs scripting. And I may take the scripting out for the login.

The posting UI just uses it for geolocation, which you can't do without JS.


Nice of you to reply to my opinion; this tells me to take you seriously even though we disagree on things.

I do wonder why one needs geolocation for an open website (where one has not logged in), and are there really no server-side geolocation systems that you can use?


I use geolocation to tag my posts with my location. I'm trying to build my own Foursquare using OpenStreetMap data.


Tl;dr: client side scripting is not the problem; client side scripting abuse is the problem.


Chiming in...

Content sites should use progressive enhancement. Sites that are applications (web apps) are probably a better UX if they are client-side/single page apps.

I work in TV at the moment, building TV UIs as single-page JavaScript apps. This just would not be possible server-side, even if the server ran on the head-end.

Choose the right tool for the job, and most of these complaints fall away.

Of course, white listing JavaScript is 100% your prerogative. It's your browser and web experience and it's great that you're free to choose.


How would the author handle new standards like WebSockets and Server-Sent Events? Those standards require some kind of client scripting.

As a previous user has stated, the use of javascript is not the problem. It's the abuse of it.


Not use them unless absolutely necessary.

And not use websites that have them unless absolutely necessary.


Yawn. Seems like a pretty boring and uninteresting web.


JS in browsers is a blessing and a curse.

Running stuff on clients instead of servers minimizes server load, but all the junk that bad web devs had loaded onto their servers now gets loaded onto the clients.


The entire post reads like an old man complaining about kids on his lawn.

The justification is just... dumb. First, because the debate is over. People want to build web apps, and they want to consume web apps. You can certainly stay on the sidelines, complain and yearn for the 90s and those glorious days of GeoCities and 'proper' web pages, but the world has moved on.

Second, there is this arbitrary expectation of what the web should be. Who says the people of 1995 got the web completely right?! It's like we should have just ended innovation back in the mid 90s because the pinnacle of perfection has been reached.

>As Jeremy points out, a web app seems to be nothing but a web site that requires JavaScript.

No, it's not. A web application is an application that runs in the browser (I don't want to debate web page vs. web app). We had web applications before client-side JavaScript became big, except everything was rendered on the server and required reloads for most state changes. It was largely a shitty experience: shitty for the developers who had to write it, and shitty for the users who had to use it. But still, it conveyed huge benefits over corresponding installed applications. Your quintessential web app back in the day was webmail.

The original vision of HTML as only hyper-linked document markup probably lasted all of a year before we began abusing it to build things outside of the original scope.

>Perhaps we could go a step further and share the whitelists and blacklists.

Definitely within the ethos of the web.


  The justification is just .. dumb.  
Arrogant, unnecessary opening. Your reasons had better be good.

  First because the debate is over.
Clearly it isn't.

  People want to build web apps, and they want to consume
  web apps.
"People" don't want to build anything; it's devs who want to do that. People want to do all kinds of things on the web. Often, they want to spend as little time on your site as possible so they can get back to whatever they were doing before. They want consistency, predictability, familiarity. They want their browser to work the same on your site as it does any other site they visit.

We had the same debate with Flash. All designers loved it and couldn't help but do "cool" stuff with it. The general public mostly just complained that it was annoying and didn't work as expected.

  You can certainly stay on the side-lines, complain and
  yearn for the 90s and those glorious days of geocities
  and 'proper' web pages, but the world has moved on.
Hmm, were you blowing a raspberry as you wrote that?

...I see the next point is more hand waving...

  >As Jeremy points out, a web app seems to be nothing but a
  web site that requires JavaScript.
  No, it's not. A web application is an application that runs
  in the browser
And this is why you don't get it. There is no binary here, it's a continuum. The kind of web app you're talking about is the kind of site that would require JS and the kind of site that the author would whitelist. Incidentally, it's also the kind of site I write for a living.

Those aren't the sites the author is talking about.


>"People" don't want to build anything; it's devs who want to do that.

+1 for pedantry. Apparently devs aren't people, they are "people" (or is it the other way around).

>People want to do all kinds of things on the web.

That's right. They want to do all kinds of things on it. They want to hang out on Facebook, check Gmail, keep track of their Twitter feed, write documents in Google Docs, and upload to Flickr. All those "web sites" really stretch the original vision of what HTML was supposed to be, and what the author is arguing for. They are all JavaScript-heavy not because devs love JavaScript (it's a shitty language), but because it's the only way to provide a proper experience for end users.

>We had the same debate with Flash.

We didn't have the same 'debate', because there's no debate. End users aren't complaining about the current state of the web. Devs aren't complaining either. There's no movement to push the web in the other direction. Users, developers and all major internet corporations are pretty much aligned. Most of the focus is on things like privacy, not on reducing the use of JavaScript.

>There is no binary here, it's a continuum

Actually, I purposely used "application" because I wanted to reinforce the point that some things that run in the browser are applications - not pages, not marked-up documents: applications. Furthermore, those web applications may not necessarily require JavaScript - that's how they are written today, not how they were written yesterday. If you want to debate the difference between a 'web site' and a 'web app', I'd say there's a continuum there, but I focused on pure "web applications".

>The kind of web app you're talking about is the kind of site that would require JS and the kind of site that the author would whitelist.

Are you sure about that?


Yes. Try browsing the web without JS enabled for a while and see how many regular sites (not apps) just fail to work. And I don't mean a JS enhancement fails to work; I mean the whole site fails to work, or even load.


You are against pointless UI decisions spurred on by javascript. That doesn't mean all UI decisions are pointless. Instead of advocating getting rid of javascript, perhaps it is more sensible to advocate using javascript reasonably. I don't think the all or nothing mentality applies here.


Make something compelling. Build it non-crappily. Congratulations, you'll get on the whitelist.

Fill my browser with pseudo-popups, craptastic infinite scrolling and other awful things and you won't.


Seems somewhat like a double standard. You can't use JavaScript, until you can.

"Compelling." "Built non-crappily."


You can't drive a car. Until you can.


I used to be a NoScript user myself, but eventually I got tired of the constant whitelisting. Nowadays I just run a combination of AdBlock Plus + Ghostery with everything blocked by default. It cleans out the vast majority of gunk while still keeping things functional.


"JavaScript is becoming the new conduit for awfulness." - did not you hear? It's called "Open Web". It means you have no choice but to use a single hastily-designed script language. Or second-class-citizen transpilers.


This smells a bit like linkbait to me (and amusingly similar to some of the items in the 4chan parody not so long ago.)

I think this would be better as a blacklist. I definitely don't find as much abuse as is claimed, but it does happen occasionally.

I don't want to return to a world where I have to refresh my email window to receive new email, or a world where collaborative google docs aren't possible or the whole plethora of awesome stuff javascript has enabled - you can't get that useful stuff without it being abusable, that power can be used for good or bad.

I think the delays in loading sites are also overstated - yes, on poorly designed sites, but e.g. my home page, which is Angular-based, loads very quickly, and one of the benefits of doing things on the client side is that you can cache more and only transmit the data the client-side app needs, dynamically.

Blacklisting, not whitelisting, solves this problem, my friend.


"Linkbait". I wrote a post on my own fucking blog on Friday. I was quite drunk at the time after a long week of working on web stuff.

Then someone else posted it on Hacker News.

Any chance I could write stuff on my personal blog in the future without being accused of clickbaiting or linkbaiting or whatever?


Actually - "This smells a bit like linkbait to me"

I don't think that quite qualifies as personally accusing you of intentionally posting linkbait to HN.

There are plenty of articles submitted here by people who aren't the author that are in fact intentionally linkbait. I don't know where else you advertised this post, etc. so you not having posted it doesn't mean it wasn't linkbait (I believe you that it wasn't, fine.)

So even though it was you venting after a[n inferred] tough week, the submitter might have intended it as clickbait for karma purposes. So I didn't actually necessarily direct that comment at you.

Also keep in mind if you write on a public blog, there's always the possibility that people will express an opinion you find disagreeable somewhere about it, probably less politely elsewhere (especially in the flamebait-attracting area of programming languages.)

If I wrote a blog post, entirely for myself entitled 'why I despise Windows 7 and wish we could go back to the wonder days of ME', I wouldn't be surprised if it resulted in people suggesting it was linkbait if it was later posted to a news site (other than it was getting attention, of course :)

Having said all that I should apologise, I didn't mean it as a personal attack (though I do, respectfully, disagree with your article), as usual text is a dreadful medium for expressing these things.


I tried this once as well, and I might have stuck with it if Chrome had a quick "add to JS whitelist" button. Instead you have to fiddle around in the settings menu, which just takes way too long.


...and frankly this whole "user experience" thing has got completely out of hand. Just what is wrong with sending a stack of punch cards in by post for processing? I just love getting the box of listing paper back with a printed core dump to highlight those little slips in the code or JCL.


You interpreted my drunken rant about the poor user experience of bad client-side scripting as an argument against user experience? Okaaay.


You missed the point of my comment. My point was that you can't turn off an important (nay crucial) element of modern web applications simply because you don't like how some web sites go beyond enhancement into the black pit of defacement.

It has often been pointed out that technologies can be applied for good and bad. Simply ignoring them and hoping that the world will join you within your circle of wagons just will not work. You can't turn the clock back - you can (perhaps) demonstrate the benefit of a more appropriate implementation.

Maybe my contribution was a bit too opaque.


I can turn it off. And I have.

If the positives of JS use outweigh the negatives, congrats, you get on the whitelist. If you annoy the living shit out of me with ten jazillion share buttons, I'll exercise my right to not let you run arbitrary code written in a Turing complete language on my computer.

The world seems to have joined in with Adblock. Keep up the parade of annoying bloat and people will start doing the same with JS.


JavaScript (as it is used in practice) often results in worse "user experience": just as the author writes.


The more general problem is the danger of Turing-completeness: if you let your DSL become Turing-complete, you will have to use the Turing-completeness. HTML not being Turing-complete was a feature.

(I realise this ship sailed in 1995.)


Maybe I should go and look for a typewriter.


> It’s a cargo cult: people do it because everyone else is doing it.

And that's one of the core problems of our society. The seed is planted by the commercial interests we cultivate in our culture (the rat race for more money than your peers), and it grows because we don't choose cooperation (as in "open-source software") over competition.



