We need whitelists. I'm using uBlock Origin to block all third-party requests and manually whitelist the requests vital to the site, such as those that load the CSS from a CDN. Many sites use their own CDN, with a domain name ending in .net instead of .com, or something like whateverwebsite-cdn.com.
I believe a convenient web browsing experience is possible with about 5% of third-party requests granted. We need whitelists, not blacklists.
Perhaps, but do you necessarily want to whitelist the entire site and all the accompanying JavaScript, or just the top-level domain and the few components needed for it to work correctly? It depends on the granularity you desire. That's where uMatrix would come in handy.
Indeed. I'd imagine I could have it "trained" (to cover the 20 or so sites that make up the majority of my browsing) in just a couple of hours and it would be time well spent.
I'd even donate or pay for an extension that did this.
I always wondered what the web would feel like, in terms of experience, without some manner of filtering. The last time I rode bareback on the internet, without ad blockers or even rudimentary hosts-file blocking, was at an airport kiosk, and even then it felt weird, because The New York Times was still only learning about fingerprinting and grabbing what is effectively the MAC address of any machine running Flash.
Be _very_ careful about replacing your hosts file with an enormous one found on the internet, especially one served over HTTP. Such a thing is ripe for phishing injection.
It takes just one malicious entry among 10k that doesn't loop back to your own machine for me to present you with a legit-looking and "secure" capitalone.com home page.
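Concretely, a poisoned blocklist only needs one entry that points somewhere other than loopback. A hypothetical fragment (the attacker IP here is an invented documentation address, 203.0.113.66):

```
# ...9,998 legitimate-looking blocking entries...
0.0.0.0       ads.example.net
0.0.0.0       tracker.example.org
# ...and one line that quietly hijacks a bank's real domain:
203.0.113.66  capitalone.com
```

Since the hosts file overrides DNS entirely, the browser would connect to the attacker's server while showing the real domain in the address bar.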
Hmm, nice. The Malware Domain List should prove to be handy. I just wrote a script to transform it into unbound format so I can block them via my firewall's DNS.
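Such a transformation is only a few lines. Here is a minimal sketch, assuming a plain-text input of one domain per line with `#` comments; the choice of `always_refuse` as the local-zone type is my assumption, not necessarily what the poster's script emits (`redirect` plus a `local-data` A record to 127.0.0.1 is another common option):

```python
#!/usr/bin/env python3
"""Sketch: convert a plain domain blocklist into unbound local-zone
directives. Input format (one domain per line, '#' comments) is an
assumption, not the original poster's actual script."""
import sys


def to_unbound(lines):
    """Yield one unbound directive per non-empty, non-comment line."""
    out = []
    for line in lines:
        # Strip inline comments and surrounding whitespace.
        domain = line.split("#", 1)[0].strip().lower()
        if not domain:
            continue
        # always_refuse: unbound answers REFUSED for the whole zone.
        out.append(f'local-zone: "{domain}" always_refuse')
    return out


if __name__ == "__main__":
    for directive in to_unbound(sys.stdin):
        print(directive)
```

Pipe the list through it and include the output file from `unbound.conf` with an `include:` directive.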