This is part of a larger pattern for Facebook and other "large scale" advertisers: it's clear they are breaking laws at some rate, and they're protected from paying any real consequence by the cost of discovering the exact rate of misbehavior. Facebook recently settled a lawsuit over racial discrimination in housing ads (a federal crime)[1]. I suspect this will get added to the pile of illegal services Facebook provides but argues it shouldn't be held responsible for providing.

This is, I think, the real moral hazard you saw back in the 2007 financial crisis: companies can reach a scale where it's very costly to definitively assess who is to blame for crimes, and they can therefore commit any profitable crime up to a certain threshold. It both makes a mockery of the rule of law as a concept (along with many other things in the US legal system) and is an enormous competitive advantage for large companies. I'd include Uber's grey-area stalking[2] and the eBay stalking campaign[3] in this category.

[1] https://www.cnbc.com/2022/06/21/doj-settles-with-facebook-ov...

[2] https://www.theverge.com/2016/7/10/12127638/uber-ergo-invest...

[3] https://www.cbsnews.com/news/feds-charge-ebay-stalking-scand...



The flaw here is in expecting Facebook to be the police.

Here's what happens if you make Facebook do it. They become aware of someone running an illegal ad and ban them. But the advertiser is a criminal and still wants to make money, so they make a new account or find some other way to avoid detection, and the ad ends up back in the system. The advertiser can do this forever because they're making a net profit by continuing to play the cat-and-mouse game.

Here's what happens if you have law enforcement responsible for enforcing the law. Someone runs an illegal ad on Facebook, law enforcement subpoenas Facebook to find out who it was, and the advertiser gets arrested. This provides an actual deterrent, because the penalty goes beyond a ban and exceeds the profits from the criminal activity, and criminals dumb enough not to be deterred go to jail where they can't run any more ads.

Why are we trying to do it the stupid way and expect a non-stupid result? Corporations are not law enforcement. Stop trying to make it otherwise.


I fully agree it's a hard problem, but Facebook is the one telling us they can be the police. If you really feel you can't operate a business without doing crimes, you shouldn't operate the business. It's understandable: some businesses take too much skill to operate without committing crimes, so people don't operate them (there are lots of things that banks won't do that fall into this category). Facebook, by operating in this area, is saying they think they can do it.

> Someone runs an illegal ad on Facebook

The problem, to me, is that everything you said applies downstream too. You can't be sure the person purchasing the ad is trying to commit a crime either (perhaps they were hacked, perhaps they were dumb, etc.). If you are promoting the "this is complex, actually" view, it's complex at every level.

The problem for society is that Facebook, as the company offering the service that occasionally breaks the law, is in a nice position. They get to profit off law-breaking every once in a while, and, as you say, it's hard to see how the service could be offered in a way that perfectly avoids all crime. So you get into this situation where Facebook (and everyone else at that scale) can do some crimes, but not so many that it's a big part of their business. It seems bad.


> can do some crimes

Simple solution: upon their inability to abide by the law, outright outlaw them and shut them down at the second known, provable count.-

Zuckbook is a known evil, making billions off the attention of depressed, vulnerable populations, and causing untold amounts of psychological damage.-

The least they could do is police their system, to the ninety-ninth percentile or better.-

Or be shut down.-

The next incumbent will figure it out, rather fast.-


> If you really feel you can't operate a business without doing crimes, you shouldn't operate the business.

It isn't Facebook doing crimes, they're a company whose customers are doing crimes.

> The problem, to me, is that everything you said applies down stream too. You can't be sure the person purchasing the ad is trying to commit a crime either (perhaps they were hacked, perhaps they were dumb, etc).

Which is why we have investigators and courts, to sort this out. When the police execute the warrant and find the drugs, the person's claim that they were hacked is not going to hold a lot of water. Whereas if they find no drugs but seize the computer and find malware on it, then they can investigate the malware network and find the actual perpetrators.

This is law enforcement's job. To figure out who actually did the crime and charge them with it. Facebook can't do that and shouldn't be expected to.

> So you get into this situation where Facebook (and everyone else at that scale) can do some crimes, but not so many that it's a big part of their business. It seems bad.

Why is it bad? Why is it even expected to be bad? It's true of every business whatsoever. A major hardware store that sells duct tape will have as customers some number of kidnappers who use it to tape the mouths of their victims. The kidnappers will go to a gas station and buy gas. These companies are thereby profiting in some small way from crime. But so what? Go arrest the kidnappers, the hardware store is irrelevant.


> It isn't Facebook doing crimes, they're a company whose customers are doing crimes.

Aiding and abetting is a thing.


Aiding and abetting requires intent. If I am a bus driver and a bank robber happens to ride on my bus on his way to rob a bank, I am not aiding and abetting, because I have no intent or knowledge of the crime.

> To convict as a principal of aiding and abetting the commission of a crime, a jury must find beyond a reasonable doubt that the defendant knowingly and intentionally aided and abetted the principal(s) in each essential element of the crime

https://www.justice.gov/archives/jm/criminal-resource-manual...


We are talking about a bank robber with the cartoon outfit with big bags with dollar signs on them boarding the bus to and from the bank repeatedly. But he pays for the ticket so I guess it is fine in this proto-dystopian "late stage capitalism".


> We are talking about a bank robber with the cartoon outfit with big bags with dollar signs on them boarding the bus to and from the bank repeatedly.

Okay, let's proceed with your cartoon example.

The bank robber goes to the ticket machine, puts in money, gets a subway ticket, swipes it through the turnstile and rides the subway. If the subway operator posted guards at every entrance they could spot the guy, but they don't, because that would be crazy expensive when they're not the police and their only concern is making sure people pay the fare, which they can do by installing automated floor-to-ceiling turnstiles that block entry unless you pay.

Why is it the subway operator's obligation to investigate this crime, instead of the police? The subway operator could investigate it, the crime is happening in public view, but so could anyone else. Moreover, we don't want random common carriers to be denying service to innocent people based on scant evidence just for CYA purposes. We want penalties to be handed out in court once the prosecution has met their burden of proving the crime beyond a reasonable doubt.


> They get to profit off law breaking every once in a while

Make Facebook liable and required to forfeit money earned from the criminal activity.


This is technically already the situation and you can easily get them to pay up for each violation you prove in court. Good luck!


If somebody uploads illegal content (like CSAM) today, does Facebook simply delete the account and call it a day? I'd hope for a system where the police handle enforcement and Facebook simply reports it.


If this was happening on my platform, I would want to know about it. Facebook is the police of their own concerns. I'm sure they are genuinely trying to identify all these people and report them to law enforcement. What is the miss here?

Any decent journalism on this subject? NPR takes too much money from Facebook to go rooting around in their business, and they pretty much stopped covering Facebook, other than mentions and basic AP press releases. The same is probably true of other media outlets as well.

Nonetheless, I doubt Facebook wants these people and it certainly threatens their business by overrunning it with creepy drug ads, so what we see is the tiny bit that manages to bubble up through the cracks. Everyone wants Facebook to magically seal all the cracks.


> I'm sure they are genuinely trying to identify all these people and report them to law enforcement. What is the miss, here?

The miss is assuming any company would do something out of genuine concern for the law. Their shareholders don't reward them for morally upstanding behavior. They are concerned about money, and the law is only a concern insofar as it impacts their real concern. We've seen this story a million times, people! It is what for-profit companies are. It is how they operate, and that will never change.

What can change is government regulation and enforcement. That is the one and only answer to this problem.


Facebook is the only one with access to all the information to know that a crime has even happened. Practically, there is no way for someone on the outside to be able to detect many of the illegal things that Facebook is facilitating. Unless you want all of big tech's data to be explicitly funneled to law enforcement, this is not a solution, especially when Facebook's incentivized to turn a blind eye and keep collecting checks.


This is obviously not the case, otherwise how is anybody detecting the crimes in order to accuse Facebook of them?

If someone is running an ad to sell drugs or violate some other law, the users will see the ad and can report it to law enforcement. Law enforcement then investigates to find out who placed the ad and goes to arrest them.


I don't believe GP was suggesting Facebook do it; rather, they were just pointing out that the second option is extremely costly and difficult, which is a consequence of the scale and influence of Facebook.


> the second option is extremely costly and difficult, which is a consequence of the scale and influence of Facebook

Is it? Suppose that instead of Facebook we had a federated social media with ten thousand independent operators and ten thousand ad networks. Then there would be ten thousand ad agents (really automated websites) who compete on price and whose job it is to place your ad in the ten thousand ad networks.

Drug dealers would still try to run drug ads, wouldn't they? What would be any different? The internet has the scale that it does whether the service is centralized or not. Centralization/monopolies are the cause of many problems but not every problem.


Under the DSA in the EU they are obliged to do that.


Since ads are targeted, only Facebook can know when an illegal ad is run. Are you suggesting Facebook should be reporting the people running ads to law enforcement?


All FB ads and their associated general targeting are public in the Ad Library; the government could enforce this on its own.
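For anyone who wants to poke at it: the Ad Library has a Graph API endpoint (`ads_archive`). Below is a rough, untested sketch; the version string, parameter names and fields are from memory and may differ by API version, and you need an approved Ad Library access token.

```python
import requests

# Rough sketch of querying Meta's Ad Library API (Graph API "ads_archive").
# Endpoint, parameter, and field names are from memory and may vary by API
# version; an approved Ad Library access token is required.
ACCESS_TOKEN = "YOUR_AD_LIBRARY_TOKEN"  # placeholder

resp = requests.get(
    "https://graph.facebook.com/v18.0/ads_archive",
    params={
        "access_token": ACCESS_TOKEN,
        "search_terms": "example keyword",   # text to search ad creatives for
        "ad_reached_countries": '["US"]',    # JSON-encoded country list
        "ad_type": "ALL",
        "fields": "id,page_name,ad_creative_bodies,ad_delivery_start_time",
        "limit": 25,
    },
    timeout=30,
)
resp.raise_for_status()

# Print the page running each ad and the ad's text, one per result.
for ad in resp.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_creative_bodies"))
```

Worth noting that how much targeting and reach detail is exposed varies by ad category and region, so outside verification still has limits.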


Cool! TIL.


They claim the ads are targeted. Given that I've had ads for:

• Both dick pills and boob surgery

• A lawyer specialising in renouncing a citizenship I don't have for people who have moved to a country I don't live in

• Local news for a city in Florida that I didn't know existed (and I'm not an American resident or citizen)

• A library fun run in a city 903 km away with an entire extra country between me and it

• An announcement, from a government of a country I don't live in, that a breed of dog I've never heard of is to be banned (and I'm not a dog owner)

I think that the claim "Meta knows how to target ads" is itself in the set of scams. The contents of my email's junk folder are significantly less wrong than this.


If they see people committing a crime, pretty much yes, they should report it.

In general, they should not impede investigations, and should not intentionally help criminals.


If everybody is using an ad blocker then it doesn't really matter what ads they run, does it?

If some users don't use an ad blocker then there is obviously someone other than Facebook who knows when an illegal ad is run and can report it to law enforcement.


> law enforcement subpoenas Facebook to find out who it was

See, here's the catch: this requires law enforcement to be a functional organization. Judging by how widespread drug abuse and drug dealers are, I don't think this would be a winning approach.

Also, I wonder how newspapers dealt with this? Did humans vet everything?


It does seem like many of the web-scale FAANGs had no scaling solution. Facebook and Google in particular are egregious beyond their marketing.

e.g. YouTube: how much copyrighted content does Google profit from? And then there are the 100 variations that slip past the 'copyright algo.'

I see stuff on there that as a UK TV license payer I've already paid for, and then have to suffer ads. They could not give a hoot.


The last thing I'd complain about is YouTube not doing enough copyright enforcement. If anything, they do far too much, beyond what is reasonable or necessary! And in a lot of cases, the studios don't care, because they slurp up most of the profits, not Google.


They may produce a lot of false positives, but that doesn't excuse the fact that their system can't scale correctly.


They basically do copyright enforcement in two cases:

1. Automatically, based almost entirely on sound, using ContentID - this is heavily weaponized, gamed, and generally over the top.

2. "Manually" (but really automatically) in response to requests from "verified" (sure) rights owners - this one is also heavily gamed, but seems to make up a much smaller fraction of takedowns.


YouTube is probably a bad example in terms of copyright. They in fact have one of the most restrictive systems for reusing others' work in your own work, since a large portion of the videos that get auto-struck would probably qualify as fair use.


I'm not in this space, but from the user's perspective I thought that YouTube's "demonetization" doesn't mean the ad doesn't show, just that the channel doesn't get any of the revenue from showing the ad.

So they're actually incentivized to have many creators struck, because struck media means ad impressions whose revenue they need not share.

Am I misunderstanding the landscape?


> So they're actually incentivized to have many creators struck, because struck media means ad impressions whose revenue they need not share.

This revenue would go to the copyright claimant. E.g., if Warner Bros. struck you for showing their movie, the ad revenue from the video would go to Warner, not simply to YouTube.


I'm not privy to their deal, but while I'm confident that Warner Bros. has negotiated a revenue-sharing agreement better than the typical YouTube Partner rate of 55%, I also suspect it's not 100%.


But that's the point really isn't it? It's no longer illegal once you have a negotiated contract with the copyright holder.


But YouTube still gets their cut, which is better for them than just deleting the video.


Yes and no. YouTube's moat is its content creators. A greedy algorithm might make them more money in the short run, but it would destroy their moat, as content creators migrate to other platforms.


I would disagree, on the simple fact that there are terabytes of copyrighted content on there, uploaded by people who don't own the content and subject to a fairly weak black box.


Same with providing support. They can't provide proper support at the scale they operate at, so they just neglect it completely and get away with it.

Also, Amazon just completely gives up on quality assurance and lets people ship whatever.


Seems pretty simple to me. They should pay crippling fees until their business shrinks to a point where they can operate it legally. It's unclear what benefit there is to society in an organization like Meta scaling beyond responsible operation. There is a natural self-regulation to hyperscaling: at some point you're too big to exist. I see no reason we should forgive Meta's incompetence. They're not a natural monopoly. Let them fail.


"This is, I think, the real moral hazard you saw back in the 2007 financial crisis: companies can reach a scale where it's very costly to definitively assess who is to blame for crimes and can therefor commit any profitable crime up to a certain threshold. It both makes a mockery of the rule of law as a concept (along with many other things in the US legal system) and is an enormous competitive advantage for large companies. I'd include Uber's grey area stalking[2] and the eBay stalking campaign[3] in this category. "

If a company is too big to be managed properly, it shouldn't exist. We saw that in 2008 with "too big to fail" banks. I also remember the AG back then stating that some companies are too big to prosecute, which is also a big problem. It seems health insurers have also reached the scale where they can screw with people without consequences.


companies can reach a scale where it's very costly to definitively assess who is to blame for crimes and can therefor commit any profitable crime up to a certain threshold

Succinctly put. Just as LLCs function as a legal device for distributing and limiting financial risk (but strangely, not profits), they increasingly perform the same function for other kinds of legal liability. It's the worst of both worlds.


Uber's Greyball was an eye-opener for me.


Music has always been a great way to advertise drugs and other criminal acts, yet the musicians never get picked for arrest.


I’d call that promoting drugs, which is not illegal. Advertising means there is a product for sale through the advertiser.


Meta has an obligation to ensure those they do business with are not criminal enterprises. The "advertising" you mention doesn't have a specific illicit enterprise paying for a call-out in the songs; it's just people expressing themselves.



