Less than a year old. HN really needs to alert you to dupes when you submit... or does it? I know it doesn't allow you to post a dupe for a cool-off period after it has been posted.
There is a dupe checker, but only for exact matches (and rather than warning you, it automatically lodges an upvote of that link on your behalf). The OP of this thread appended a # to the end of the URL to bypass the filter.
Thank you, although not everyone agrees. Many of the duplication notifications get downvoted, even if they are sometimes re-upvoted afterwards. Here are two recent ones:
Then HN needs to implement the URL normalization and equivalence rules from RFC 3986. These are fresh in my mind because I extended urllib in Python to support them last week.
Working out whether two URLs are equivalent isn't an easy problem, but the RFC goes some way toward solving it, and it picks up easy cases like a trailing #.
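A minimal sketch of that kind of normalization, using only the standard library's urllib.parse (this is an illustration of the RFC 3986 rules, not the commenter's actual urllib extension):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Apply a few RFC 3986 normalization rules so that
    trivially-different URLs compare equal."""
    parts = urlsplit(url)
    # Scheme and host are case-insensitive (RFC 3986 section 6.2.2.1).
    scheme = parts.scheme.lower()
    host = parts.hostname or ""
    # Drop the default port for the scheme (section 6.2.3).
    default = {"http": 80, "https": 443}.get(scheme)
    port = parts.port
    netloc = host if port in (None, default) else "%s:%d" % (host, port)
    # An empty path is equivalent to "/" for http(s) (section 6.2.3).
    path = parts.path or "/"
    # urlunsplit drops an empty fragment, which catches the
    # trailing-"#" trick used to bypass the dupe filter.
    return urlunsplit((scheme, netloc, path, parts.query, parts.fragment))

# The trailing "#" no longer makes the URLs look different:
print(normalize("http://example.com/story#") ==
      normalize("http://example.com/story"))  # True
```

The same function also equates case variants like `HTTP://Example.COM:80/story` with `http://example.com/story`, which simple string comparison misses.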
Thanks to hashbangs (or really just the prevalence of Ajax-only sites), it's now a lot harder to tell whether two pages with #something in the URL are effectively the same.
No cool-off period (although that is a good idea). I just submitted a good read and it had already been posted 616 days ago, so no one will see it.