After the Washington Post’s website decided to shut down comments on its Post.Blog, I asked our own budding MediaShift community how you thought blog comments should be moderated here and elsewhere. Should online forums and blogs put up technological roadblocks to spammers and people spewing vitriol? Should they employ humans to check every comment before it’s published?
Your responses were thoughtful and varied, and I can tell this is a subject that some of you have first-hand experience with. But before I get to all your great commentary on comments, let’s get the take from the man inside the maelstrom, Jim Brady, editor of Washingtonpost.com. He emailed me his response to my more general question about how much oversight online forums and blogs require:
I think the answer depends on what kind of community you want to set up. If you’re a blogger, and you don’t want to put any restrictions, obviously they don’t need any oversight. But if you decide that there are certain things you’re not going to tolerate, then they need a fair amount [of oversight]. You sort of make a deal with your readers when you open these forums that the rules are not there for show, and that we will enforce them. There are a lot of people who want to be in communities that are debating the issues of the day and who want those debates to be adversarial but respectful, so you have to be willing to put the required resources toward enforcement. We got caught a little short in that area with Post.Blog.
I also think that the oversight you need is a lot greater right after you launch because all the enforcement is being handled by your own people. But, over time, I think the answer is to harness community members themselves to help keep things within the boundaries you set. That’s a model we want to explore once we get comments back up on Post.Blog. Also, beyond the responsibilities of patrolling forums, I also think site staffers need to be involved in moving the discussions along. In our best forums, Post or post.com folks are active in the comments area.
Vince Veselosky, who blogs at Mindvessel Media, echoed Brady’s point that moderation depends on your community and your goals.
“How you handle comments is going to depend on the type of content you post and the type of community you want to build,” he wrote. “The more open and inclusive the community, the more tolerant it will have to be of trolls and spam, and more importantly, opinions they disagree with. For some sites, moderation may be considered a necessity to control the signal-to-noise ratio of the comments. For other sites, the noise is half the fun. I find comments work best when there is a self-policing capability. Community members know what belongs and what doesn’t in most cases. But again, you have to be careful to avoid the echo chamber effect.”
The community-moderated option is a nice one, though it requires either people willing to volunteer their time or the money to pay them. Or, in the case of a site such as Slashdot and its news discussions, comments are rated by the community itself, with the help of good filtering software.
In the particulars of moderating, your ideas varied from automated tech solutions to human ones. Heidi Nordberg, who blogs at VirusHead, said she believed that human oversight was crucial to block all spam and people who practice drive-by abuse — the old curse-and-run.
“The other issue is that the owner of the domain has to be the decision-maker with regard to what can be published on their site,” she wrote. “They have every right, and a certain obligation, to maintain an appropriate level of discussion. Sometimes the line is blurry, and rather than censoring someone’s comments, I have tried to use the comments as an entry point to further discussion. There have been a couple of cases where I felt I needed to put an end to an escalation.”
Some possible technical solutions include forcing people to register or give a valid email address, though, of course, the determined ones will always get through somehow. There are also obscenity blockers that will reject comments containing certain banned words. Peter Childs wrote that forcing registration might restrict the flow of comments, but might also raise the level of discourse.
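For the technically curious, here is a rough sketch of what that kind of automated gate might look like. Everything in it is illustrative: the word list, the function names and the registration check are my own stand-ins, not any site’s actual system.

```python
import re

# Illustrative placeholder list; a real site would maintain its own.
BANNED_WORDS = {"badword1", "badword2"}

def is_valid_email(address):
    """Loose sanity check; determined spammers will still get through."""
    return re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", address) is not None

def accept_comment(registered, author_email, text):
    """Return True only if the comment clears the basic automated hurdles."""
    if not registered or not is_valid_email(author_email):
        return False
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words.isdisjoint(BANNED_WORDS)
```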
Even my dad, Len Glaser, got in on the discussion, saying he favored a moderate use of technical and human filtering. “Too much human filtering is expensive in human time and cost,” he wrote. (But maybe he was just worried about the human time and cost for his son!) Mike Liveright, who runs the WeMatter.com virtual town hall, had a whole slew of suggested solutions, from a hierarchical threaded structure for comments to ratings and tags, and even feedback notification — so you would know when someone commented on your comment.
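To make those suggestions a little more concrete, here is a bare-bones sketch of a threaded comment structure with ratings, tags and a reply-notification hook. All of the names are hypothetical, offered only to show the shape of the idea, not how WeMatter.com actually works.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Comment:
    author: str
    text: str
    rating: int = 0                        # community score, Slashdot-style
    tags: List[str] = field(default_factory=list)
    replies: List["Comment"] = field(default_factory=list)

    def reply(self, child: "Comment") -> "Comment":
        """Attach a threaded reply and tell the parent author about it."""
        self.replies.append(child)
        notify(self.author, child)         # stand-in for email or on-site alert
        return child

def notify(recipient: str, comment: Comment) -> None:
    # Placeholder for a real notification system.
    print(f"{recipient}: new reply from {comment.author}")

# Usage: a reader replies to a top-level comment and its author is notified.
top = Comment(author="mark", text="How should comments be moderated?")
top.reply(Comment(author="reader", text="Let the community rate them.", rating=2))
```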
Finally, there came a couple of warnings. One blogger, known as SourMonkey, said that filtering public commentary can be seen as censorship in some circles, a charge that has been made about Washingtonpost.com closing its comments.
“I understand the potential necessity for ‘filtering’ public commentary, however we should always be mindful of how censorship restricts cultural, and even ontological, knowledge,” SourMonkey wrote.
But on the other side of the coin, if I don’t prepare for a possible onslaught of harsh comments that break all our rules, I might have to shut down comments as well. That’s the take of Paul Lukasiak, who says that if I can’t respond appropriately to this type of attack, then I might as well forgo comments now.
“Because the time will come when you will be ‘blogstormed’ because of something that PBS did,” Lukasiak wrote, “and you won’t be able to handle it, and (like Jim Brady) you will get all defensive and blame the people who are rightfully, and righteously, angry, rather than blaming yourself for putting yourself in this position.”
In the spirit of open discussion, I’d have to say that Brady is taking much of the blame in this case, and I, too, would take the blame if the controls we put in place don’t work out. But you are all of course allowed to disagree as vehemently as you like in the comments below — as long as you don’t break the rules.
UPDATE (2/6): Jim Brady told me via email that the comments on Post.Blog should be back open in the next two weeks, though he doesn’t have an exact date in mind. He said they’re working on “better technology to automatically block profanity, and some work to tie registration to comments posting. Also, we are working on a better plan for situations where we get flooded with posts…”
Comments
Moderating is certainly challenging. We try to keep things as wide open as possible on our site, because we want people to see their comments go live immediately and keep the conversation moving, rather than have someone proactively moderate every comment. There are filters in place to automatically moderate or junk posts containing certain types of links or words, but this is really used to prevent spam rather than to censor opinions we don't happen to agree with.
If we see something we don't agree with, we'll ask the commenter to explain their position/support their argument. Who knows: they may be right!
I run a website about Cairns and have given up having comments on the blog because of spam.