I propose a solution with the following characteristics:
- New users can easily post valuable comments.
- Trolls will not have the opportunity to annoy the general user base until they have put forth significant manual effort.
- The system is difficult to game.
To implement the solution, Scoop will need to assign one of three statuses to each user.
- Probationary
- Normal
- Trusted
All new users start as probationary. When a probationary user posts, her comment will be visible to herself and a random subset of trusted users. Once four trusted users rate the comment, the system will decide whether the comment is good enough to post. All users may optionally see, but not rate, probationary comments.
After the user develops a history of contributing valuable content, she will graduate from probation and become a Normal user. If she develops a history of posting worthless content, her account will be closed. For the sake of concreteness, I suggest 5 hidden comments = closed account, and 20 posted comments = Normal user status. Naturally, these numbers are open for debate.
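The probation bookkeeping above can be sketched in a few lines. The status names and thresholds (5 hidden comments closes the account, 20 posted comments earns Normal status) come straight from this proposal; the function and argument names are illustrative, not part of Scoop's actual code.

```python
HIDDEN_LIMIT = 5   # hidden comments before a probationary account is closed
POSTED_GOAL = 20   # approved comments before graduating to Normal

def next_status(status, posted, hidden):
    """Return a user's new status given their comment history so far."""
    if status == "probationary":
        if hidden >= HIDDEN_LIMIT:
            return "closed"
        if posted >= POSTED_GOAL:
            return "normal"
    # Normal, Trusted, and closed accounts are unaffected by these counters.
    return status
```

Promotion to Trusted is deliberately left out here, since the proposal leaves that judgment (a "history of posting particularly good material") unspecified.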
Normal users can post without any pre-approval.
The system will grant Trusted status to users with a history of posting particularly good material.
Quick ratings for probationary users
Most comments never receive four ratings, so how can we ensure that probationary users will ever be able to post anything?
I suggest that all probationary comments appear at the top of the comment list, highlighted with a request to rate. Any trusted user who is able to see the comment will know to give it a rating. I also recommend that a trusted user never be shown more than one probationary comment per story.
Picking trusted users
If we allow any trusted user to rate a comment, trolls could acquire four accounts and game the system. Instead, when a probationary user posts a comment, Scoop will select a random subset of trusted users to see and rate the comment.
When a new comment is submitted, Scoop should pick a random number x such that 0 <= x < N. When a trusted user opens a story containing a probationary comment, the system will compute the MD5 hash of the trusted user's userid (salted with a random per-story value). If the hash modulo N equals x, the user will see the probationary comment.
The formula uses a random number x, which is fixed the moment the comment is submitted; a modulus N, which the site administrators set and keep static over a long period; userids, which are static; and a salt value, which remains static for each story. Trolls cannot reload the page hoping to be picked to rate their own comment. To game the system, a troll (or group of trolls) would need approximately 4*N trusted accounts, and any trolls who do game the system can be quickly identified and their many hard-earned trusted accounts revoked.
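A minimal sketch of the selection rule described above, assuming the salt is simply prepended to the userid before hashing (the post specifies only MD5 of the userid with a per-story salt, mod N). The value of N and all names here are illustrative.

```python
import hashlib

N = 10  # admin-chosen modulus: roughly 1/N of trusted users see each probationary comment

def sees_comment(userid: str, story_salt: str, x: int) -> bool:
    """True if this trusted user is in the random subset chosen to rate the comment."""
    digest = hashlib.md5((story_salt + userid).encode()).hexdigest()
    return int(digest, 16) % N == x
```

Because x, N, the salt, and the userid are all fixed once the comment is submitted, calling this repeatedly always gives the same answer; reloading the page changes nothing.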
Identifying and eliminating trolls
While a single site administrator can identify and eliminate most trolls on a case-by-case basis, I would like to expand on Arkaein's proposal to let the community rid itself of trolls. Granted, this is not absolutely necessary but I thought it was an interesting idea.
Rather than burdening the site administrator with ferreting out trolls, Arkaein proposes that the users put alleged trolls on trial. Any trusted user who spots a troll can "charge" that user with violating site policy. The prosecutor (the user pointing the finger) will fill out a form: pick the violation from a drop-down list, paste URLs to the stories, comments, or comment ratings that allegedly violate site policy, and write a description of the alleged violation.
The accused will lose the privilege of posting and rating until they agree to a trial (they may post a response to the allegations in the meantime). If they do not agree to a trial, the system will automatically start one after 72 hours.
Once the trial begins, Scoop will pick a random number y such that 0 <= y < M, where M is one or two orders of magnitude larger than N. The system will then pick trusted users in a fashion similar to the probation process. These trusted users will be potential jurors (actual jurors will be seated on a first-come, first-served basis).
Each potential juror will have a comment at the top of each page notifying them that they have been selected for jury duty. They will read the allegations (the infringing comments should show up inline) along with the defendant's response. They will then vote:
- Abstain
- Not Guilty - the suspect did nothing wrong
- Not Guilty - I disagree with the suspect, but debate must tolerate dissent
- Guilty - Give the suspect a warning
- Guilty - Lose trusted status (only available if the user is trusted)
- Guilty - Put on probation
- Guilty - Close account
Once a predetermined number of jurors have cast their votes (not counting abstentions), the system will determine the result. Scoop will count the votes for the most severe option, then the second most severe option, and so on, stopping once it has counted half the votes. Whichever option the count stops on becomes the official outcome of the trial.
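The tally rule above can be sketched as follows. I'm reading "half the votes" as "at least half", and ordering the options from most to least severe as listed; both are assumptions, as are the label strings.

```python
# Options ordered from most severe to least severe; abstentions are
# excluded before tallying, per the proposal.
SEVERITY = [
    "close account",
    "probation",
    "lose trusted status",
    "warning",
    "not guilty (dissent tolerated)",
    "not guilty (no wrongdoing)",
]

def verdict(votes):
    """Return the outcome: the option where the severe-to-lenient running
    count first reaches half the non-abstain votes."""
    if not votes:
        return None
    total = len(votes)
    running = 0
    for option in SEVERITY:
        running += votes.count(option)
        if running * 2 >= total:  # reached at least half the votes
            return option
    return None
```

For example, with two votes for "close account" and three for "warning", the count reaches half only at "warning", so the milder verdict wins even though the harshest option got the first votes counted.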
The result of all trials should be available for everyone to see.
Restrictions on bringing charges
A trusted user may only bring charges against one user at a time. If the suspect is acquitted by a large margin, the accuser will lose the right to bring charges.
The system will record whether a particular comment, story, etc. resulted in charges; if it did, the system will not allow charges over that comment a second time. And if a comment is more than 48 hours old, it obviously wasn't bad enough to merit charges, so charges should not be allowed.
I haven't spent a whole lot of time analyzing the trial idea. Perhaps normal users should also be potential jurors (a jury of peers). Perhaps two or three people should have to bring charges over a single comment before a trial can begin. Perhaps the trial isn't even a good idea.
Summary
Under the proposed system, new users can post at any time. Any valuable contribution will be quickly rated up. Any attempt at trolling will quickly disappear.
Trolls will have to contribute 20 valuable comments to the site before they can annoy most users. If a troll loses their account after the first public display of immaturity, the signal-to-noise ratio will be 20/1 = 20.
Furthermore, each probationary troll comment can annoy at most 1/N of the trusted users, and a given trusted user will see only 1 out of every N worthless probationary comments.
To game the system, trolls will need to acquire 4*N trusted accounts. They will lose their hard-earned trusted accounts quickly after abusing them.
The system significantly raises the bar to resume trolling, without significantly hindering new users.