I am not sure if we disagree or if you are fundamentally misunderstanding the proposed solution.
Nothing prevents using both methods simultaneously. Your approach is to teach people how to act. My approach is to punish people who break the rules. They are not mutually exclusive techniques.
A) Unless I completely misunderstood your article, you are suggesting the admins put out links to things they do not approve of to see who screws up and upvotes them. In other words, you are suggesting that admins basically model what not to do by doing it themselves. This is precisely one of my points: Don't do yourself what you don't want others to do. Lead by example.
B) There is a time and place for "punishment" but it should be a last resort, not a first line of defense. It fosters an uncivilized environment and is therefore counterproductive to solving the issues people here most strongly express concerns about.
So I can't say I agree with your assertion that they are not mutually exclusive techniques. They mostly are in my experience.
> Unless I completely misunderstood your article, you are suggesting the admins put out links to things they do not approve of to see who screws up and upvotes them.
Great! We now know that you did not completely understand my article. :)
The whole point of implicit honeypots is to leverage the fact that guideline-violating articles (politics, religion, and so on) are already making it to the front page. The admins can then flag these articles as honeypots, rather than having to spam their own site.
I have to say, I tend to partially agree with both of you: admins shouldn't be submitting articles that won't add to the discussion, so I would drop the first part of your proposal.
For submissions that have already made it onto the site and are flagged as honeypots, the votes and flags they attract would be used to punish users.
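To make the mechanism concrete, here is a minimal sketch of how such honeypot-based scoring might work. The "h-ratio" metric, the threshold, and the data layout are all assumptions for illustration; nothing here describes an actual implementation.

```python
# Hypothetical sketch of implicit-honeypot scoring. Admins flag already-
# submitted guideline-violating articles as honeypots; a user's "h-ratio"
# (assumed metric) is the fraction of their upvotes that landed on them.

def h_ratio(upvoted_ids, honeypot_ids):
    """Fraction of a user's upvotes that went to flagged honeypots."""
    if not upvoted_ids:
        return 0.0
    hits = sum(1 for item in upvoted_ids if item in honeypot_ids)
    return hits / len(upvoted_ids)

def should_ignore(upvoted_ids, honeypot_ids, threshold=0.5):
    """Silently ignore (hell-ban) users whose h-ratio exceeds the threshold."""
    return h_ratio(upvoted_ids, honeypot_ids) > threshold
```

The threshold of 0.5 is arbitrary; in practice one would presumably tune it so that only persistent offenders, not occasional mis-voters, cross it.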
As a new user, I can say that what attracted me to the site is its quality and the number of links that lead me to the edges of my understanding of a given topic... I feel pushed to learn more.
At the same time, I would like to contribute and also feel compelled to express my opinion at times, but don't really feel free to do so unless it is an area where I feel pretty confident that I either know what I am talking about or have something to say that wasn't already said.

I've noticed that just saying "wow that's cool" is frowned upon. I've also seen several threads where comments are downvoted to invisibility and I can't figure out why. Sometimes later they are upvoted again, sometimes not. But I feel like I am learning what is and is not acceptable... and as I increase my knowledge on topics that I came here to learn about, I hope to have more insights to offer back to the community (right now I can't say that I do).
I guess my points here are:
A) It's already a great site with quality much higher than a lot of other message boards.
B) It's hard enough to figure out what is ok and not ok to comment on.
C) To keep the community vibrant, presumably there should be some tolerance for and encouragement of growth in posters' ability to contribute.
This talk of "punishing" is discouraging. I suppose if there are already enough people here to understand what the community is supposed to be about, and if that group is self-sustaining, then there is no need to worry about attracting new users and exclusionary tactics are not a problem.
Quality is what you (or we) make of it. I've read thought-provoking comments on topics that are probably a bit off the reservation... and seen interesting segues inside threads that take me places I wouldn't expect.
Another approach might be to seed the front page with articles that are good examples of what the community is striving to focus on. Maybe put a green sprout next to it or something. Add one more voting mechanism for people at whatever karma threshold: a vote for "exemplary" status. I suspect that not every regular upvote would translate into an "exemplary" upvote... the front page would reflect the interests of the community, and if it was bare of exemplary articles, I have no doubt users would soon vote some quality articles onto it. My own preference in dealing with people is to give them an easily accessible mechanism to exceed your expectations instead of finding ways to punish them for not.
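The "exemplary" vote described above could be sketched roughly as follows. The karma threshold, the minimum vote count for the "green sprout" marker, and the dictionary layout are hypothetical values chosen only to illustrate the proposal.

```python
# Illustrative sketch of a separate "exemplary" vote type, distinct from
# ordinary upvotes. The threshold of 500 karma is an assumed cutoff.

EXEMPLARY_KARMA_THRESHOLD = 500

def cast_exemplary_vote(voter_karma, article_votes):
    """Record an exemplary vote, but only from users above the karma threshold."""
    if voter_karma >= EXEMPLARY_KARMA_THRESHOLD:
        article_votes["exemplary"] = article_votes.get("exemplary", 0) + 1
        return True
    return False

def is_exemplary(article_votes, min_votes=3):
    """An article earns its 'green sprout' marker after enough exemplary votes."""
    return article_votes.get("exemplary", 0) >= min_votes
```

Because the exemplary count is tracked separately from regular upvotes, an article could be popular without being exemplary, matching the intuition that not every upvote would translate into an "exemplary" vote.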
The chilling effect very seriously concerns me. Assumptions of guilt do enormous harm to trust and undermine genuine civility. People need to feel it's reasonably safe to open their mouths, and they need to feel they don't have to walk on eggshells or be perfect, that there is some room for being human, making mistakes, and so on. Robust discussion cannot thrive without some tolerance for friction. Finding ways to lubricate the process is good. This proposed approach is not lubrication.
I guess the message that this sort of moderation (whether done by human or by algorithm) sends is "don't come here unless you already fully understand and appreciate the ethos and mission of this site, and don't post unless your contribution is going to be something of the highest quality possible, according to the standards of the site".
Which is fine as far as it goes, but basically when you boil that down it's "don't screw up, or else."
That isn't what attracted me here. What attracted me here was reading interesting links and thought-provoking discussion, and thinking "man, I need to up my game so I can participate meaningfully".
If the goal is to have a members-only kind of retreat from the mundane, then I suppose the notion of creating an underclass of posters who don't even know they are being ignored makes sense. But in that case, why not take it a step further and just require applications and screen out members in the first place?
If the goal is to grow the site and generate more traffic, then I would submit that encouraging people to emulate quality contributors is a better approach. Why not flip this algorithm on its head? Instead of hell-banning those who score poorly, add in a karma boost for those who score optimally... and an indicator on articles that meet the site's criteria for quality.
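Flipping the algorithm as suggested might look something like this. The quality score, boost amount, and cutoff are all hypothetical; the point of the sketch is only that low scorers are left alone rather than penalized.

```python
# Hedged sketch of the "flipped" algorithm: reward high scorers with a
# karma boost instead of hell-banning low scorers. All values are assumed.

def adjust_karma(karma, quality_score, boost=10, reward_cutoff=0.8):
    """Add a karma boost when a user's quality score is high; never punish."""
    if quality_score >= reward_cutoff:
        return karma + boost
    return karma  # low scorers are simply left alone, not penalized
```

This is pure positive reinforcement: the only possible outcome of being scored is a reward, which fits the "gold star" dynamic described below.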
People don't like to do as they're told, but they sure like to do what got somebody else a gold star.
> What attracted me here was reading interesting links and thought-provoking discussion, and thinking "man, I need to up my game so I can participate meaningfully".
I participate to up my game. This approach tends to kill that possibility (or at least contribute to slowly killing it).
> If the goal is to grow the site and generate more traffic,
As I understand it, the actual business goal of the site is to help YC screen applicants: Your user-name is a required part of your application to YC and (if no one else) PG will go check your comments. Since start-up founders tend to be young and therefore probably a bit socially wet behind the ears, it seems to me that being too controlling about the site in that regard is potentially a bad business decision.
They can be performed simultaneously, but it is a fairly common belief in psychology (largely from Skinner's work) that positive reinforcement is more effective than punishment at altering behavior.
Sure. I totally get that and think it's a great area for future exploration.
A couple of points though:
1) Technically, you are not giving any reinforcement, because the agent in question (i.e., the user) does not perceive any change in the environment.
2) The potential pageviews that HN can drive may offer sufficient positive reinforcement for people to keep violating the site's guidelines by creating link-bait articles.
3) Honeypots are merely a fail-safe to prevent degradation. In a healthy community, one would expect that very few people would ever actually have a sufficiently low h-ratio as to be ignored.