Hacker News
FilterLists – directory of filter lists for ads, trackers, and annoyances (filterlists.com)
45 points by temp on Feb 20, 2016 | hide | past | favorite | 11 comments


We need whitelists. I'm using uBlock to block all third-party requests and manually whitelist the requests vital to a site, such as those that load its CSS from a CDN. Many sites use their own CDN, with a domain ending in .net instead of .com, or something like whateverwebsite-cdn.com.

I believe a convenient web browsing experience is possible with about 5% of third-party requests granted. We need whitelists, not blacklists.
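One way to express this default-deny posture is with uBlock Origin's dynamic filtering rules (the "My rules" pane): block all third-party requests globally, then noop (whitelist) the handful of CDN hosts a given site actually needs. The hostnames below are placeholders:

```
* * 3p block
whateverwebsite.com whateverwebsite-cdn.com * noop
whateverwebsite.com whateverwebsite.net * noop
```

With rules like these, only the third-party requests you explicitly noop get through; everything else is blocked by default.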


A plugin that allowed you to whitelist as you go would probably be pretty usable after a few days of training.


Perhaps, but do you necessarily want to whitelist the entire site and all the accompanying JavaScript, or just the top-level domain and the few components needed for it to work correctly? It depends on the granularity you desire. That's where uMatrix comes in handy.


RequestPolicy Continued https://requestpolicycontinued.github.io/ does this, and comes with a subscription usability whitelist for CDNs and such.


Indeed. I'd imagine I could have it "trained" (to cover the 20 or so sites that make up the majority of my browsing) in just a couple of hours and it would be time well spent.

I'd even donate or pay for an extension that did this.


It's like http://someonewhocares.org/hosts/ on steroids.

I always wondered what the web would feel like, in terms of experience, without some manner of filtering. The last time I rode bareback on the internet, without ad blockers or even rudimentary hosts-file blocking, was at an airport kiosk stand, which even then felt weird, because the New York Times was still only learning about fingerprinting and grabbing what is effectively the MAC address of any machine using Flash.


Be _very_ careful about replacing your hosts file with an enormous one found on the internet, especially one served over HTTP. Such a thing is ripe for phishing injection.

It takes just one malicious entry among 10k that doesn't loop back to your own machine for me to present you with a legit-looking and "secure" capitolone.com home page.
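Before installing such a list, it's worth mechanically checking that every entry actually loops back. A minimal sketch (hosts.downloaded is a hypothetical filename for the fetched list):

```shell
# Print any non-comment entry whose address is not a loopback/blackhole
# address -- such a line could redirect a domain to an attacker's server.
awk 'NF && $1 !~ /^#/ && $1 != "127.0.0.1" && $1 != "0.0.0.0" && $1 != "::1" \
     {print "suspicious: " $0}' hosts.downloaded
```

Any output at all means the list should not be trusted as-is.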


The solution is to filter the download so that only loopback entries survive, e.g.:

curl --silent http://someonewhocares.org/hosts/hosts | grep '^127\.0\.0\.1' | sudo tee /etc/hosts > /dev/null

(The dots are escaped so the pattern matches literally, and sudo tee is used because a plain shell redirect to /etc/hosts runs without root.)


Hmm, nice. The Malware Domain List should prove handy. I just wrote a script to transform it into Unbound format so I can block those domains via my firewall's DNS.
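Not the parent's actual script, but a minimal sketch of such a transform, assuming the usual "127.0.0.1 domain" hosts format and Unbound's always_nxdomain local-zone type:

```shell
# Convert hosts-style block entries into Unbound local-zone directives,
# skipping the localhost entry itself.
grep '^127\.0\.0\.1[[:space:]]' hosts \
  | awk '$2 != "localhost" {print "local-zone: \"" $2 "\" always_nxdomain"}' \
  > blocklist.conf
# In unbound.conf, load it with:  include: "/etc/unbound/blocklist.conf"
```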


Gist please?






