Hacker News

There's no need to sync every time a new phishing URL is added - only every time a URL is visited by a client.

The delta can be derived just from the version number of the client's URL database, and should total about 1 MB for a whole day's worth of updates. So ~1 MB for the first URL visited in a day, and considerably less afterwards. Compared to the average webpage size, that's nothing.

Really, the only thing that changes is that instead of sending a URL hash, you send the URL DB version, and the reply is the list of changes since that version.
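To make the idea concrete, here's a minimal sketch of the version-based delta exchange being proposed. All names (`DeltaServer`, `DeltaClient`, etc.) are hypothetical, not any real Safe Browsing API; it assumes the blocklist is append-only and stores one set of URL hashes per version bump.

```python
class DeltaServer:
    """Hypothetical provider of the phishing-URL DB (illustrative only)."""

    def __init__(self):
        self.version = 0
        self.changes = []  # changes[i] = hashes added at version i+1

    def add_urls(self, url_hashes):
        self.changes.append(set(url_hashes))
        self.version += 1

    def delta_since(self, client_version):
        # Up to date: reply with no changes, just a confirmation.
        if client_version >= self.version:
            return self.version, set()
        # Otherwise, ship every hash added after the client's version.
        delta = set().union(*self.changes[client_version:])
        return self.version, delta


class DeltaClient:
    def __init__(self):
        self.version = 0
        self.blocklist = set()

    def check_url(self, url_hash, server):
        # Instead of sending the URL hash, send our DB version,
        # apply the returned delta, then check locally.
        self.version, delta = server.delta_since(self.version)
        self.blocklist |= delta
        return url_hash in self.blocklist
```

Note that the server never learns which URL the client visited; it only sees the client's DB version, which is the privacy win over per-URL hash lookups.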



> Really, only thing that changes is instead of sending a URL hash, you send the URL DB version, and the reply is the list of changes since that version.

Or no changes at all, just a simple confirmation that the list is up to date. Yes, this is a way better idea.

Although it's always going to be less efficient. For instance, I'm not sure how it would scale into the future. Checking URLs server-side is optimal: the cost per lookup is roughly constant, proportional to the URL size. With DB deltas, each lookup's cost depends on both the URL size and the DB update frequency, i.e. as the malicious-URL rate increases over time, individual URL lookups will incur greater network cost. This is probably not a big deal for the client, but it would make a significant difference for the provider of the deltas. Or maybe network caching would dissolve it again? There would be a lot of duplicate deltas flying around every minute: basically a content-distribution problem, but with a high-frequency twist.
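The scaling argument above can be put in back-of-envelope numbers. The figures below are illustrative assumptions, not measurements: 32 bytes per URL hash on the wire, and an update rate chosen so a day of deltas is about 1 MB, matching the estimate earlier in the thread.

```python
HASH_BYTES = 32  # assumed wire size of one URL hash


def hash_lookup_cost(n_lookups):
    # Server-side checking: one hash per visited URL,
    # constant cost per lookup, independent of the DB size.
    return n_lookups * HASH_BYTES


def delta_sync_cost(new_urls_per_day):
    # Delta syncing: over a day, a client downloads every hash added
    # that day exactly once, regardless of how often it syncs.
    # Daily cost scales with the malicious-URL rate, not with how
    # many URLs the client visits.
    return new_urls_per_day * HASH_BYTES
```

At ~32,768 new URLs a day this gives roughly 1 MiB of delta traffic per client per day, while a client visiting 1,000 URLs would send only ~32 KB of hashes under the server-side scheme, which is the provider-side cost asymmetry the paragraph above worries about.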



