Reading this article does, in fact, increase my trust that my Facebook account won't be randomly, irrevocably banned one day a la google.
The trouble is, that's not my primary reason for distrusting facebook; I don't trust that the power they have to shape people's opinions by deciding what to show them won't be abused to make people think things that are good for facebook but bad for society at large.
So while that article does increase my trust in facebook in general, the magnitude of that increase is minuscule, because what it addresses is not the reason for my lack of trust.
But you're right that transparency wouldn't solve that, because it's only the first step. If facebook were to transparently say "we are promoting far-right conspiracy theories because it makes us more money", and provide a database of exactly which things they were boosting, then while perhaps I would "trust" them, I certainly wouldn't "like" them.