Hacker News

There is nothing preventing an open standard moderation tag that each user selects how to treat.

With such moderation tagging in place, for example, if I wanted to use Reddit but mistrusted its moderators, I could choose to ignore their per-entry ratings and instead use ratings from a group of users I select, provided the data were openly available. (It's hard to imagine Reddit doing something like this.)

However, it requires that I have access to the core data. Today, superusers or platform admins can prevent "controversial" entries from even being visible on the platform/subplatform/subreddit.
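The idea above can be sketched in a few lines. This is a hypothetical scheme (the `Post`, `tags`, and `visible_posts` names are illustrative, not any existing standard): each post carries open moderation tags from any number of raters, and the client decides which raters to trust rather than the platform deciding for everyone.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    body: str
    # Open moderation tags: rater name -> verdict ("ok", "spam", ...)
    tags: dict = field(default_factory=dict)

def visible_posts(posts, trusted_raters, hide_verdicts=frozenset({"spam", "off-topic"})):
    """Hide a post only if a rater the user trusts tagged it with a hidden verdict."""
    return [
        p for p in posts
        if not any(p.tags.get(r) in hide_verdicts for r in trusted_raters)
    ]

posts = [
    Post("a", "contested take", tags={"mods": "spam", "peer_group": "ok"}),
    Post("b", "ordinary post", tags={"mods": "ok"}),
]
```

A user who trusts the platform's mods would see only post "b"; a user who trusts their own peer group instead would see both. The platform stores and serves all the tags; only the filtering choice moves to the client.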



This seems to misunderstand the purpose of moderation entirely.

Moderation serves many purposes, but only one of them is to save your delicate eyes from seeing things you don't want to see. Perhaps more important is ensuring that your social media service doesn't get a bad reputation, so that people other than disaffected white guys will actually want to use it.

If your site is overrun with porn, then sure, each user could just block those millions of posters with your open standard... but every new user still sees it, and decides "this service isn't built for me" and never comes back.

At the core of community moderation is community. If the community looks like it's full of people doing gray-illegal things, then normal people (and advertisers) will generally shy away... and lawsuits will not be far behind.


This seems easily solvable by making moderation an opt-out default. HN itself has the "showdead" setting, which defaults to off. Anyone who forms their opinion of a community after deliberately turning off moderation is either a moron or trying to intentionally discredit that community.


It doesn't work. Moderation has its name precisely because suppressing content that is created to incite emotional reactions keeps the mood in a community moderate. All free-speech platforms I know of devolve into a hotbed of neo-Nazis and trolls. As a consequence, I am no longer interested in trying out "censorship-resistant" platforms, because I know precisely what will happen a couple of months down the road.

Social media without meaningful moderation (i.e., FB and Twitter before the latest crackdown) was a mistake.


I wasn't talking about a "free speech platform", I just think that moderation should be transparent, and that users should be able to view content deleted/hidden by moderators if they wish.

Many times I've read a discussion about someone being fired for a social media post. Some people will defend them, some people will say the reaction was justified. And I can't make up my own mind, because that post was swiftly deleted by moderators and now everyone's just acting on their own memory of the post.


Which is why defaults are important: by default, your filters hide whatever the mods flag, and you only see that content if you opt out.



