What strikes me about this issue as a whole is what it says about the true state of "AI". This is a perfect job for such technology. How is it that we're already making 'deep fake' videos and audio, but we can't feed a video stream, which is just a sequence of images, to an algorithm that can determine whether it's inappropriate? I recognize that some such tech is being utilized on the front end in this case, and that the problem is non-trivial, but I see this as FB saying 'good enough' and not pushing as hard as they could to improve the tech to the point where it can be trusted to make the decision. I sense that they may be telling themselves they're doing social good by 'creating jobs'. Why must humans be subjected to this torture? What happened to "move fast and break things"? Why not put the algorithms out front, let them have the final say, and let them learn and improve quickly? I suppose just because meat is cheaper than chips.
> I recognize that some such tech is being utilized on the front end in this case, and that the problem is non-trivial, but I see this as FB saying 'good enough' and not pushing as hard as they could to improve the tech to the point where it can be trusted to make the decision. I sense that they may be telling themselves they're doing social good by 'creating jobs'.
This seems like a strange conclusion to me. Why would you land there rather than on the simpler explanation that the technology just isn't good enough yet?
Because hiring so many moderators so fast was a big, expensive move that seemed motivated more by PR, given some of the troubles the company was experiencing. When your motto is 'move fast and break things', and you implement features like live video streaming without much thought to the full ramifications of doing so, it would seem, even if only to me, that the company might not be afraid to take a leap on tech that's 'not quite ready'. Again, I know such a hasty implementation could hurt quality, but given what's at stake (human health), they might get credit for doing the right thing. Obviously, though, in our society they wouldn't get such credit; they'd only get flamed for a bad user experience when the wrong videos get taken down. So the mental health of a few humans who willingly signed up is a small sacrifice for profitability.