
The weird thing is how selective HN is about duplicate articles. I've submitted things only to be redirected to an original submission from up to 3 years earlier. Yet a PG essay can be resubmitted, same URL and title, at least several times without issue (as the HN search link shows).

I'd love to know how HN goes about deciding which content is OK to resubmit over and over again.



It keeps track of submitted URLs in memory. That means if the machine gets restarted, old URLs can be submitted again.
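
A minimal sketch of why that matters (hypothetical names, not HN's actual source): if the dedup table is just an in-process dict, a restart wipes it and previously seen URLs become submittable again.

    # Hypothetical in-memory duplicate tracker; the table lives in
    # process memory, so a restart wipes it and previously seen
    # URLs become submittable again.
    seen_urls = {}

    def submit(url, story_id):
        if url in seen_urls:
            return seen_urls[url]   # redirect to the existing story
        seen_urls[url] = story_id   # remembered only until the process dies
        return story_id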

It's also just a simple URL-based check: adding a bogus query parameter or a fragment to a URL is enough to get around it.
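
Continuing the same hypothetical sketch, an exact string comparison treats trivially different URLs as brand-new submissions:

    # The check is an exact match on the URL string, so a bogus
    # query parameter or fragment slips right past it.
    submit("http://example.com/post", 1)        # original story
    submit("http://example.com/post?x=1", 2)    # bogus parameter: not flagged
    submit("http://example.com/post#frag", 3)   # fragment: not flagged either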

I don't know the specifics of the algorithm, but it may also add sites to its cache when someone visits one of the old comments pages, which could make it look more sporadic and picky.
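
If that guess is right, the cache would get re-primed lazily, something like this speculative hook on the comments page (again, not HN's real code):

    def view_comments(story_id, url):
        # Speculative: rendering an old story's comments page puts
        # its URL back in the dedup table, so which old links are
        # blocked depends on what people happen to browse.
        seen_urls.setdefault(url, story_id)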



