
Ford might manufacture a car driven by a serial drunk driver. Perhaps they need to install breathalyzers in all their cars, by default.

I'm not really sure how much more ripe for abuse Omegle was compared to, say, Discord. Pretty much any video chat service can be abused to send or receive illegal content, and to abuse and manipulate other people. These are risks inherent to anything enabling communication. Short of a panopticon where all communications are manually approved by a human moderator, there's no sure way to prevent abuse (and even then human moderators are fallible).

There ought to be some reasonable attempt to mitigate abuse, like reporting functionality. But beyond that I don't see much more Omegle could reasonably have done.



Again with the pointless and frankly silly comparison to cars. They're categorically unrelated product classes: Ford's cars don't, as a feature of the vehicle, intentionally put you into random head-on collision situations with others, which is essentially what Omegle did by design. A manufacturer of vehicles just isn't a fruitful comparison in any sense.

Same with the equally pointless comparison to Discord. Omegle wasn't merely a video chat service: it made random matches that the user could narrow by identifying their own interests. An adult male user identifying as deeply interested in things only children would be interested in could readily, easily, and obviously weaponize the platform, and Omegle absolutely could have (and should have) used the many obvious means available for profiling and identifying such incongruous users, which, sure, would include human moderators.

There's an enormous ethical difference between doing nothing whatsoever to prevent abuse and perfectly preventing it, and you seem to think they had no obligation to prevent any because they couldn't prevent all. They don't exist any more (thankfully) because lawyers started to point out, correctly, that that isn't how either ethics or tort law works.


Omegle didn't intentionally put people on a collision course with abusers either. If Omegle was intentionally facilitating abuse, as you put it, then so are IRC and effectively every other public communications mechanism, because anyone could be an abuser.

Even preventing 1% of abuse would probably have been beyond the capabilities of this site. You write that they should have flagged adult men listing interest in topics associated with children, but how were they supposed to identify the gender and age of users? People under 18 were prohibited from the site, yet that rule clearly failed. Human moderation can't monitor even a fraction of one percent of the traffic. The "many obvious" ways of preventing abuse were in fact attempted [1]:

> Omegle implemented a "monitored" video chat, to monitor misbehavior and protect people under the age of 18 from potentially harmful content, including nudity or sexual content. However, the monitoring is not very effective, and users can often skirt around bans.

Sure, Omegle "randomly put you into head on collision situations with others", but so does every other public communications platform: IRC, Discord, Xbox Live, pretty much anywhere you can meet random people on the Internet fits into this category.

1. https://en.m.wikipedia.org/wiki/Omegle



