> ...and PornHub did everything feasible to try and remove and report it.
That's demonstrably not true, since PornHub did do more in response to the mainstream media coverage (e.g. finally require some kind of age verification for uploaders and purge unverified content).
The only way they could be contextualized to have done "everything feasible" is within a fundamentally flawed model that they created, but it's precisely that model that needed to be reformed.
I stand by what I actually wrote: that they did everything feasible to remove and report illegal content. More extensive verification of uploaders is not an expanded effort to remove and report illegal content that gets posted, but rather an effort to prevent said content from being uploaded in the first place. And as you pointed out, this comes with tradeoffs in the form of a more onerous sign-up process.
To be even more pedantic, they still haven't done everything they can do: they could shut down their whole site and eliminate 100% of illegal material with 100% confidence. Which, I suspect, is the goal of those propping up the narrative that PornHub was some sort of wild west where child pornography was welcome.
> I stand by what I actually wrote: that they did everything feasible to remove and report illegal content.
I mean they didn't even do that. Wasn't their enforcement team pretty small? The original op-ed put it at ~80 people. In any case, it was inadequate.
> but rather an effort to prevent said content from being uploaded in the first place.
PornHub chose a model that made it impossible for them to deal with their illegal content problem. The distinction between legal and illegal porn is often too subtle for any solution that just looks at the content or relies on "someone else" to do the work for them. That's the core issue. It's like a factory that dumps toxic waste into a river, and then "solves" that problem by building a little filtration plant far downstream of the factory, just upstream of some city, that only filters a fraction of the river water. That solution can't work, for reasons that should be obvious. The only solution with any chance of working is filtering the waste before it goes into the river at all, and that's essentially what PornHub has started to do with its more thorough vetting process.
Businesses don't have the right to make compliance optional or inadequate if it doesn't work for their business model. They have to pick a business model that can be compliant.
For the second time: out of the tens of millions of videos on the platform, critics found illegal content numbering in the dozens. Their enforcement was sufficient to drive illegal content down to literally a one-in-a-million rate of occurrence. Weighted by the number of views per video, it's probably an even smaller fraction than that. The filter was sufficient to make illegal content something that the overwhelming majority of people - almost everyone - will never see when using their platform.
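(For what it's worth, a minimal back-of-the-envelope sketch of that rate, assuming "dozens" means roughly 50 videos and "tens of millions" means roughly 30 million; both figures are illustrative assumptions, not verified counts:)

```python
# Back-of-the-envelope for the "one in a million" claim.
# Both inputs are assumptions for illustration, not verified figures:
# "dozens" taken as ~50 videos, "tens of millions" as ~30 million.
illegal_found = 50
total_videos = 30_000_000

rate = illegal_found / total_videos
print(f"Implied rate: {rate:.1e}")  # ~1.7e-06, i.e. roughly one in a million
```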
By comparison, critics described pornhub as:
> Human beings of all ages, races, genders, and sexualities are being abused while Pornhub pockets profits from selling said abuse and exploitation online. It's nearly impossible to stress strongly enough the fact that these cases are far from anomalies.
> For the second time: out of the tens of millions of videos on the platform, critics found illegal content numbering in the dozens.
Can you say that was all of it? Frankly, I don't see how anyone can have any confidence that PornHub's previous moderation practices were effective. Some of the material they have to remove is too hard to detect without context that isn't present in the content itself. Those practices also put the onus on the wrong party (e.g. forcing someone who had illegal or otherwise improper videos of themselves uploaded to find those videos and play whack-a-mole as they got reuploaded).
> The filter was sufficient to make illegal content something that the overwhelming majority of people - almost everyone - will never see when using their platform.
And now they have an even more effective filter.
And you're moving the goalposts: "the overwhelming majority of people" aren't going to seek out illegal content, so talking about what the "majority sees" ignores the problem.
> Businesses don't have the right to make compliance optional or inadequate if it doesn't work for their business model. They have to pick a business model that can be compliant.
Not to be snarky, but isn't that the SV business model? Facebook 'are not a publisher', because editors cost too much. Uber are 'not an employer', because proper benefits cost too much... etc., ad infinitum.
> Not to be snarky, but isn't that the SV business model? Facebook 'are not a publisher', because editors cost too much. Uber are 'not an employer', because proper benefits cost too much... etc., ad infinitum.
I totally agree with you: it is. Responsibility is a barrier to scaling and other selfish goals, so their "clever hack" is to try to be as irresponsible as they can get away with.