TSA Says They're Not Discriminating Against Black Women but Body Scanners May Be (propublica.org)
43 points by NN88 on April 17, 2019 | 82 comments


>"...Afros, braids, twists and other hairstyles popular among black women."

They literally say so in the subtitle. The discrimination exists before people enter the airport, in the form of which hairstyle they choose. That choice is made partly on a cultural basis and partly because of differences in hair composition. But the important thing is that the machine does not differentiate on the basis of race; it's a machine that detects areas of suspect density that could presumably be hiding contraband. I am a white man with long, dense hair, and my hair triggers the machine almost every time. Not because I look like a hippie and Nixon is still running the show, but because I could hide things in my hair.
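To make the density point concrete, here's a minimal sketch of what threshold-based flagging could look like. The real scanner algorithms are proprietary, so everything here is an assumption:

    import numpy as np

    def flag_suspect_regions(scan, baseline, threshold=3.0):
        """Return a boolean mask of regions whose measured density
        deviates from an expected body baseline. Hypothetical sketch
        only; the real millimeter-wave algorithms are not public."""
        deviation = np.abs(scan - baseline)
        return deviation > threshold

    # A dense ponytail or Afro deviates sharply from a baseline built
    # without such hair, so it is flagged exactly like a concealed
    # object would be -- with no reference to race anywhere in the logic.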

The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine.


People are free to choose their hair and hairstyles. The burden of not discriminating is on the government agency, which is funded and mandated (including, according to TFA, $100M for deploying the scanners) to not be discriminatory, including on characteristics related to race.

As u/tj-teej pointed out [0], the users of this tech have the agency and means to improve or mitigate it, including prioritizing the training of algorithms to handle black hairstyles instead of shunting them off as an edge case, which conveniently burdens a demographic that has historically been limited in power and influence.

To think of it another way: in many parts of the U.S., Asians are very much a minority. Asians more commonly have narrower eyes, which can interfere with face detection/recognition [1]. In a near and quite possible future, when face detection becomes a "feature" in security tech as a means to expedite processing -- e.g. if your face is scanned and not found in a "Real ID" database, you're automatically put in a line for more invasive searching -- would Asian-Americans not have a legitimate case that the U.S. gov't has failed to improve its tech to the detriment of Asian-American citizens? Especially when it seems Asian countries have successfully mitigated the issue?

[0] https://news.ycombinator.com/item?id=19684290

[1] https://www.reuters.com/article/us-newzealand-passport-error...


Actually, I knew a guy who used to smuggle drugs into nightclubs in his afro, so targeting people on the basis of whether their hairstyle is physically capable of concealing contraband seems quite reasonable.

If you don't like it ... then you are free to change your hairstyle ...


Then given the sacredness/importance of human life relative to the value of someone's hair, there is no reasonable justification for not requiring every airline passenger to shave their head before entering security, if it reduces the probability of the scanner producing a false negative even by a small margin.

Come to think of it, there's no justification for allowing passengers to keep their clothes on as they pass through security.


That is the endgame. Together with every human spending their life in a small isolated cage. Perfect security.


Perhaps we should submit travelers to regular cavity searches, because I know someone who would smuggle contraband (including phones) into jail through a certain orifice.


I will never discriminate against someone for something they can't reasonably change. But discriminating against them for something they consciously choose is totally fine in my opinion.


Cool story bro.


I think the major difference here is immutable vs mutable characteristics. You can choose your hairstyle, but not what your face looks like.


Eyes are mutable. Double-eyelid surgery is cheap and common enough in South Korea for it to be a graduation present, and as many as 1/3 of young women there have undergone cosmetic surgery [0]. But the overriding point is that Asian countries have developed face detection and recognition for their populations, because the utility and value of the software would be dubious otherwise. For the U.S. not to invest in its own software would be open to legal challenge.

[0] https://www.npr.org/sections/parallels/2018/02/05/581765974/...


Yep, same here. As a white dude with dense, longish hair who travels a lot for work, I've noticed that if I have my hair in a ponytail, about 1 in 3 times that I go through the body scanner they're going to pat down the back of my head to check my ponytail.

I don't agree with your last point though. The discriminating factor is that the machine screening is designed based on the average white male and female appearance. And that means that anyone outside that built-in assumption, whether it's a white dude with a thick ponytail or a black woman with an Afro, gets extra screening.

What the article is trying to highlight is that when we design such systems we shouldn't just be designing them against the average, because that causes the system to discriminate against anyone outside that average.
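A toy simulation (all numbers invented purely for illustration) shows how a threshold calibrated on one group's distribution produces wildly different false alarm rates for another group, with race appearing nowhere in the code:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "density" readings for two groups of travelers.
    group_a = rng.normal(1.0, 0.2, 10_000)  # near the assumed average
    group_b = rng.normal(1.8, 0.3, 10_000)  # e.g. denser hairstyles

    # Threshold tuned so the "average" group rarely alarms...
    threshold = group_a.mean() + 3 * group_a.std()

    print(f"false alarm rate, group A: {(group_a > threshold).mean():.1%}")
    print(f"false alarm rate, group B: {(group_b > threshold).mean():.1%}")
    # Roughly 0.1% vs. 75%: the false alarms fall almost entirely on
    # the group outside the design assumption.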


I am reminded of the issues the Air Force had due to cockpits being designed for the average pilot size. They did a study and found that the majority of pilots were not near the average size and had difficulty reaching for switches and pedals and such. [1]

1. https://www.thestar.com/news/insight/2016/01/16/when-us-air-...


The onus is on you if you make that declaration. You don't know that the reason the machine triggers more frequently on "black hair" is improper testing; you're just asserting that. There are such things as objective differences that have objective implications.


If the situation is as described, then I would agree with the idea that this shouldn't be treated as an "issue".

Discrimination's common definition carries a negative connotation, but in this case the machine legitimately seems to be "discriminating" based on the ability to hide contraband; here "discriminating" is synonymous with "differentiating".


The reason it is an issue is because the TSA is disproportionately singling out people for more invasive searches under the guise of, "we have this shitty technology that can't tell the difference between explosives and black hair or a 'turban', and that's what the procedure tells us to do." It may not be designed to explicitly 'single out black people', but that was the outcome of using this combination of technology and procedure.

To top it all off, we are all paying millions of dollars for this via taxes, for some fancy security theater to tell us whether there is the "ability to hide contraband", which seems utterly insane.


Just because the technology isn't explicitly designed as a 'single out black people' machine doesn't mean it isn't disproportionately affecting them through how it is being used. The fact that this imaging technology can't tell the difference between dense hair and explosives does not make someone's thick hair, or hair covering, "an area of suspect density" that should automatically be flagged for more invasive searches.

It's definitely splitting some really fine hairs to say 'actually, it isn't discriminating against black people, it's discriminating against hair types that are highly correlated with being a black person, and it's totally acceptable that people with those hair types should be considered suspect / subject to further scrutiny', as if that is some sort of edge case that couldn't possibly have been thought about beforehand.

I absolutely don't think we should be defending a tech / procedure that only seems to negatively affect people, costs millions, and doesn't seem to have any positive effects.


>The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine.

Most of the people who ordered, designed, built and tested the machine are white.

They don't have to explicitly discriminate against blacks.

They can just go about their business as usual, designing stuff based on their own norms and cultural habits, and it will come out as naturally discriminating.

If all those whites designing the machine had those "certain hairstyles" in the same ratio as blacks, and were flagged as often when they tested it, they'd have considered this facet of the machine useless and removed it...


So which detection facets, covering areas able to conceal hidden contraband, were removed because they flagged the white testing group?


It could also be a matter of testing diversity. If the machine is tested against such dense hair from the beginning, the algorithms can be adjusted.

Kind of like the Amazon facial recognition [0] that really didn't work well on minorities, likely due to the training models used.

[0] https://medium.com/@bu64dcjrytwitb8/on-recent-research-audit...
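The usual mitigation for that class of problem is a per-group error audit during testing. A minimal sketch of the idea, with hypothetical names and data (this is not Amazon's API, just the shape of the audit that surfaced the disparities in [0]):

    def audit_by_group(model, samples_by_group):
        """Return the error rate per demographic group. `model` is any
        callable classifier; `samples_by_group` maps a group label to
        (input, expected_label) pairs. All placeholders, for illustration."""
        return {
            group: sum(model(x) != y for x, y in samples) / len(samples)
            for group, samples in samples_by_group.items()
        }

    # Shipping only when every group's error rate clears the same bar
    # keeps failures from quietly landing on one demographic.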


> the machine does not differentiate on the basis of race

I’m guessing it was predominantly white men who built this machine and white men who tested it. Reminds me of certain facial-recognition programs being (initially) unable to identify black faces. It may not be racism, but it is discrimination.


Who is it racist against? Surely it's primarily white people who suffer, as they're more likely to be caught by the facial recognition?

Following the simplest case: recognition working better on lighter skin, because lighter skin gives greater contrast, is _not_ racism.

It's not 'unnecessary discrimination' either, as it arises out of the facts of how light works.

It's like saying that products with a weight limit that arises out of the materials used are "fattist".


I don't think you understand how light works.


You better explain it to me then. I'm not great on QFT, so keep it graduate level physics so I can follow along.


> it's a machine that detects areas of suspect density that could presumably be hiding contraband.

And literacy tests for voters simply ensures that our electorate is properly educated. Any side effects are a convenient freebie.


[flagged]


> Ah yes, the good ol' You start out in 1954 by saying, “Nigger, nigger, nigger.”

You haven't got a shred of evidence of malicious intent, so you just toss around accusations of racist conspiracies. That's insane and shameful.

(I happen to think the TSA is misguided and unnecessary, but that's irrelevant to the question of discrimination.)

> If you build discrimination into the system you don't have to explicitly discriminate

There's a problem with people smuggling dangerous items on to aircraft, and the system is trying to address that to prevent people of all races dying.

The system has to find things that are hidden, and that means it has to discern which people have more places to hide items on their bodies.

That design of the system is a natural consequence of the parameters of the problem, not some vast conspiracy to oppress minorities.


You are accidentally correct in saying that there is no conspiracy to oppress minorities. Conspiracy implies that it's hidden. It isn't.

We have mountains of malicious intent established and chronicled to the point where the burden of proof is not on those downstream of that malice to substantiate it. That mountain is conveniently labeled "American policing and jurisprudence relating thereto, 1620-Present".

> There's a problem with people smuggling dangerous items on to aircraft

That problem is a vanishingly small rounding error, and the good-faith effort to fix it would be more air marshals. Instead, we get nonsense like this which, because the scanning devices look scary and dehumanize those who go through them, entrenches a threat in the minds of the populace--because a threat must be serious, otherwise why would we be putting ourselves through all this?

These tools exist to scare people into seeking solutions to the problem they purport to, and yet do not, solve.


While people can disagree over whether this is discrimination or not, everyone agrees that the system employed by the TSA is ineffective, humiliating security theater. Air marshals are better. Dog patrols are great.

Only in areas very close to a war would current methods be necessary, and even then you'd need multiple checks, manually going through every single item in luggage and backpacks. That is only really feasible at low-traffic airports, which airports serving destinations close to wars usually are.


> There's a problem with people smuggling dangerous items on to aircraft

We have a lot of evidence that these machines are minimally effective against most terrorist threats.


How many terrorist incidents has TSA credibly prevented? The answer is zero.


How can you count events that never happened? Especially if they were deterred from even being attempted by the existence of the TSA alone.

Do you think a government agency would just publicly blast all the unsuccessful attempts that were prevented, without fear that future criminals can look at those attempts and do better next time by learning how to avoid making the same mistakes?

Disclaimer: I agree with most of the posters that TSA is mostly just a security theater. But just because I think so, that doesn't make the flawed parent argument any better.


There's a gulf between publishing some basic statistics and blasting the specific details of all unsuccessful terrorist attacks that were thwarted.

Surely those statistics can be verified by, say, the House Committee on Homeland Security without needing to leak the details of specific attempts.


So can we agree that the system (which I'm taking to mean the combination of tech, people and procedure) is doing a really shit job then, because, beyond not doing what it's supposed to do, it's disproportionately singling out certain races?

Perhaps if it's singling out dense hair vs. actual places where people are hiding something, despite any evidence being presented that it's useful to do so, we should... I dunno, not use it? Definitely not spend millions of dollars on it.

I'm not sure what problem it's solving, when the solution involves humans mindlessly following a procedure, that dictates that a machine that happens to single out black hair and 'turbans' decides someone is suspect.


It's not "the machine". It's that people have genuinely variable ability to conceal dangerous items. You have to check the wheelchair of people who come in on a wheelchair, and you don't if they don't have one. That could be seen as unfair to the handicapped, but not checking the wheelchair isn't really an option.


I mean, it is an option.

The machine is one part of a hugely flawed system, and I think it's totally valid to argue that we shouldn't be substituting a machine for, I dunno, actual human judgement.

The machine in this case doesn't seem to be demonstrably solving any problems, but it is disproportionately flagging black people, people with hair coverings ('turbans' in the article), etc. as suspicious, and that flagging is being used as part of a procedure to single those people out for further scrutiny. These systems don't seem to be solving any problems, and I'm not seeing a lot of defensible positions on why it should work that way, besides "eh, that's just how it is".


I don't read most of the comments here as defending the existence of the system, rather they're defending it from the accusation of deliberate malice. The system itself is deeply stupid security theatre, but was it deliberately designed to hassle black women or anyone else with dreadlocks?


Sure, and personally I'm not saying they literally set a bunch of engineers out to build a machine to explicitly target black people. I am saying that they probably didn't put the same level of testing into it as they did into not disproportionately targeting white men in business suits, back when they were pitching investors and courting the senators / people who sign the checks to buy these things.

Or maybe they just didn't think about testing the machines against the diversity of actual people who go through an airport daily vs. who was available in the office at the time to test with.

Maybe they honestly believe that folks with natural black hairstyles, or who wear a 'turban' or ponytail, could be disproportionately hiding something suspicious up there, and we need a machine to tell us to check them out more often.

Hell, maybe they just didn't care at all who got flagged, as long as it didn't inconvenience them too much and they got paid at the end of the day. I mean, they sure as shit don't mind the reality that this really does seem to be a lot of security theater with little benefit, besides the benefit to some people's wallets.

Whether you want to define any of that as deliberate malice, willful ignorance, laziness, the banality of evil, or something else, is entirely up to your own values I suppose.


> was it deliberately designed to hassle black women or anyone else with dreadlocks?

Does it really matter when that's the end result and there is no intention or effort to fix it? Harm is harm; whether the harm was intentional or not doesn't really matter when the harm is ongoing.

If your house is on fire, the first concern is to make it not be on fire; only after that does the question of intent have any relevance. And if somebody willfully attempts to stop the process of making the house not be on fire, they should be considered intentionally harmful and no different from pyromaniacs until proven otherwise. Either way, they should be removed post-haste.


How do you fix it? Program it to ignore dense hair on a black person and only alert if the subject is a white male?


Yes, clearly that is what everyone is saying. :-P

Maybe:

- Don't deploy a system that has this many false alerts (and that is what even the TSA calls them, they are not interested in finding dense hair)

- Stop continuing to use this stuff and spend money on it if we aren't seeing any benefit, and are seeing demonstrable harm.

- Don't replace human judgement with blind procedure that must be followed, when that procedure depends on a known flawed technology that isn't providing any demonstrable benefit.

- Accept that even if you want this kind of tech to exist, it isn't there yet, and you aren't going to "fix it in code" by adding a race variable.

Bottom line is, nobody is forcing us to use these machines / procedures. We aren't stuck with them, forced to figure out some way to patch the code or be doomed, so we don't have to be so reductive about how we fix it. We could just say this was a failed idea and throw it the hell out.


> The article is about the TSA, not Border Patrol. Looking for contraband is not the TSA's job.

You're absolutely right! Do you have a better word to describe the collection of types of items the TSA is empowered to and charged with searching for?


"Prohibited items." Not sure why this was downvoted. It is literally how the TSA refers to them on its site and answers the parent's question:

https://www.tsa.gov/travel/civil-enforcement

Save your downvotes for the TSA.


Explosives? Because the rest is BS -- e.g. toothpaste in quantities larger than 100 ml, etc...



Thank you for divining my true intentions, sagely one. Your argument inherently invalidates any possible explanation for mechanisms that produce differing population outcomes, unless that explanation is racism. Working backwards from a default explanation is a textbook signifier of ideological thinking.


>Looking for contraband is not the TSA's job.

It's certainly part of their job. In states with legal marijuana sales, there are frequently TSA drug-sniffing dogs at checkpoints.


Yeah, how dare they call it racism when it’s built into the system, and conveniently uses highly correlated features instead of race itself! We also don’t discriminate against men, only people called “John”, “David”, or “Jack”.


Counterpoint: if there's a known terrorist named "James Williams" and you flag everyone with that name going through the airport, yes, the flags are going to be HIGHLY correlated with males, but the feature wasn't chosen by gender; it was selected because of the increased risk of danger associated with that feature.


Maybe I missed attempted terrorism by malicious actors hiding explosives in their weaves? Has the TSA taken a break from not detecting 95% of weapons and explosives[0] to detect weapons hidden in hair?

The context to keep in mind is that the TSA has repeatedly demonstrated incompetence at their stated goal.

[0] https://www.theatlantic.com/politics/archive/2015/06/the-tsa...


Let's say the TSA used names more aggressively, for example checking everyone with the same name as anyone with a criminal record.

Now let's also assume African-Americans are disproportionately incarcerated, and therefore names associated with the ethnic group will be disproportionately represented in criminal records. I'm sure we can agree that the system would then be biased towards a race.

The problem is that there isn't a single terrorist that caused this, in fact there's no good reason to believe that the hairstyles mentioned in the article are used to hide contraband.
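A quick simulation of that thought experiment (pool sizes and overlap are invented purely for illustration):

    import random

    random.seed(42)

    # Toy model: both groups draw names uniformly from a pool, but group
    # B's pool overlaps more with the "record" list (standing in for
    # disproportionate incarceration). All numbers are made up.
    record_names = {f"name_{i}" for i in range(100)}

    group_a = [f"name_{random.randrange(1000)}" for _ in range(10_000)]
    group_b = [f"name_{random.randrange(300)}" for _ in range(10_000)]

    def flag_rate(names):
        return sum(n in record_names for n in names) / len(names)

    print(f"flag rate, group A: {flag_rate(group_a):.1%}")  # ~10%
    print(f"flag rate, group B: {flag_rate(group_b):.1%}")  # ~33%
    # "Check anyone whose name matches a record" is facially neutral,
    # but its flag rate tracks group membership, not individual risk.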


"The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine."

That is not a politically acceptable argument.


If the machines triggered on hairstyles commonly worn by white men, what are the odds that they would have made it to production? Conversely, if the development teams had included a significant number of black women with hairstyles like these, what are the odds that this problem would exist?

The TSA wants to externalize its discrimination to "the machine", but the machine is a product of human systems, and reflects and captures the biases that already exist in society.


Exactly. This is similar to how women and children kept getting killed or injured by seatbelts simply because no one in the room thought to test for them since no one in the room looked like them.


A lot of people will say "this is a limitation of Computer Vision", but they are missing something.

If these kinds of technologies treat the hair of a certain race as a P1, then the technology (and people who designed the tech and the program) are participating in discrimination.

I'm not saying this to be 'woke' or to 'cancel' TSA. It's just a fact; and once we can all admit it (a lot of people in this thread don't seem to be able to), then we can make changes to fix it.

Editorial: If an aspect of traveling makes you feel like an 'other' in your own country, that's wrong, and I don't think public funds should be encouraging it. Public buildings have to have wheelchair ramps, and TSA body scanners shouldn't be racialized.


Why does it need fixing per se? (Assume optional plurals)

If the parameter is incorrect, then the parameter should be adjusted correctly. Note that having an availability bias in the dataset is also incorrect, the parameter will reflect the incorrectness.

However, if the parameter is adjusted correctly: what is the problem other than people not wanting to be confronted with uncomfortable truths?


Read the article again. The machine has a problem with certain hairstyles, not hair of certain races.

People of African descent have a propensity towards the problem hairstyles, which skews the results.

If your argument is for 'de-racialization' then you need to convince people to change their hairstyles to make everyone more similar.

You can't blame a machine for treating a 1 as 1 and 0 as 0.


You're right you can't blame the machine. I'm blaming the people who designed it, who defined what the 1 and the 0 are.

If everyone in the country had the "certain hairstyles" we'd have no problem saying "this machine doesn't work".

And frankly I don't think anyone who knows a lot of African-American Women would even consider asking them to change their hair.


You can't blame the people that designed it.

Some hairstyles are just bunched up and the machine can't tell what it is, so it gets flagged for manual inspection.

This is not a problem that is easily solved with technology.

Your only options are:

(1) not checking anyone's hair

(2) manually checking everyone's hair

(3) letting the machine do its job and accepting the fact that it can't scan certain hairstyles.


> If everyone in the country had the "certain hairstyles" we'd have no problem saying "this machine doesn't work".

On the contrary, it would be easier to say that the machine does work because it does flag down people who have suspicious body features regardless of race.


My understanding is that they are not targeting a certain type of hair, rather a certain type of hair has characteristics that the machine cannot distinguish from potential risks.


If all hair had that problem then we'd say the machine doesn't work.

They launched this new tech knowing a "certain type of hair" (specifically black women's) triggers the machine (and triggers degrading searches). I'm saying this machine shouldn't have been launched if those are the consequences; I'm saying that it's discriminatory.


Given that TSA has never stopped a terrorist, and this is all theater, choosing to continue makes it biased theater.


> Given that TSA has never stopped a terrorist

This is a pretty bold conclusion to make. What makes you so sure? There's no way to be certain of this because we can't read people's minds. A potential terrorist may have been thwarted simply because a TSA agent refused to allow a weapon into the secured area.


Has the TSA ever taken credit for stopping something? No. If they could plausibly claim they'd stopped a "major plot", it'd be 3-4 news cycles.

They've caught idiots with handguns and drugs in their luggage, which is fine, but there was no intention of doing anyone harm.

So no. They're as useless as a security blanket.


Since we cannot read people’s minds, we prove intent in the legal process through circumstantial evidence, or through direct evidence of intent such as communications or writings.

The TSA need not publicize their wins to prove effectiveness. Guilt can only be established in court; deterrence (which is what we’re talking about here) requires a much lower standard.


Your argument can also apply to the converse position. Given the absence of evidence of success, can we reasonably accept that such successes do in fact exist? It'd be well within the TSA's interests to publicize their successes, both for the agency's own operations and as a deterrent against would-be terrorists. They wouldn't even have to get into specifics.

Consider the TSA's bad press. There are years of stories detailing pervasive failure rates in detecting smuggled weapons or explosives, misconduct and other scandals, and a number of viral videos showing incidents of groping and other bad behavior. If the TSA had major successes under its belt, they'd offer some rather useful political capital to deploy during the game of hardball that is the federal budget process. And the screeners' union, the AFGE, would be positively thrilled to leverage those successes in support of extending full Title 5 protections and collective bargaining rights, not to mention General Schedule pay, to TSA screeners. Pushing back against the idea that screeners are incompetent would be of immense benefit to both the agency and the union.

Beyond that, a failed plot on US soil would lead to criminal charges against the conspirators. There should be evidence of major TSA successes in court records, and yet, nothing.


It's funny not funny that the same people who like to rage about government agencies being increasingly intrusive and consuming more and more resources while doing nothing valuable have nothing bad to say about the poster child for that.


Just World Fallacy is great when you're unaffected.


> the same people who like to rage about government agencies being increasingly intrusive and consuming more and more resources while doing nothing valuable have nothing bad to say about the poster child for that

Don't they? The diehard libertarians I've been exposed to are the first and loudest to complain about the TSA. Is there some constituency I'm unaware of that's anti-government action but not anti-TSA?


> diehard libertarians

They bark and complain and then get on their knees and lick the boot.


Libertarian koan: Next year, that's gonna be my boot.


The short answer is that libertarians are hostile to collective action. The only politically powerful group that stance aligns with is that of extreme wealth.


There's no way to tell how many terrorists have been dissuaded from making an attempt because of the existence of the TSA.


How conveniently unfalsifiable.


Indeed. I really don't think there's a way to know. That certainly includes counting the number of bomb-carrying would-be terrorists caught red-handed at the TSA checkpoint.


Racism and bias in prediction is a problem. I’m not sure that’s what’s happening here.

The article says false alarm, but when you think about the goal here, a false alarm might just be a lack of ability to get an accurate read through black hairstyles. A false alarm doesn't mean they're accusing someone. It means their tech isn't good enough to do the job and they need a manual check to cover the gap.

Let’s say it is somehow technologically impossible to infer what’s hidden in an Afro. Wouldn’t you expect manual checks to occur? Otherwise what’s the point of doing checks at all? There would be an easy strategy: rock a ‘fro

Unless your argument is that the TSA is nothing but theater because the task is basically impossible against a good, determined actor, but that's a separate issue.


It seems that this may be due to the limitations of machine vision. When the body scanners were being deployed, there was a lot of concern that they could see under clothes and that the TSA officials were basically going to be able to see everyone naked as they walked through the scanner. To deal with that, it was decided to use machine vision to create a stylized "Ken Doll" body and superimpose areas of concern, but not allow the actual image to be seen.

Now, with certain hairstyles, I would guess the algorithms can't reliably distinguish hairstyle from concealed items and just alert. I would guess that a human looking at the imaging would be able to distinguish them, but that would involve the TSA being able to see you naked again.
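A minimal sketch of that privacy design, assuming (the actual ATR interface isn't public) the officer only ever sees a generic figure plus alarm regions:

    from dataclasses import dataclass

    @dataclass
    class Alarm:
        region: str  # e.g. "head", "left ankle"

    def officer_display(alarms):
        """Render only a generic outline with alarm locations, never the
        raw scan -- the "Ken Doll" design described above. Hypothetical
        names; purely illustrative."""
        if not alarms:
            return "generic figure: CLEAR"
        return "generic figure: ALARM at " + ", ".join(a.region for a in alarms)

    # Dense hair the algorithm can't classify becomes an alarm region on
    # the doll; the officer never sees the underlying image that might
    # have resolved the ambiguity.
    print(officer_display([Alarm("head")]))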


Sounds like a prototype being deployed to production. We all know how well that goes.


For all the people claiming that flagging dense hairstyles makes us safer...please name one terrorist who hid contraband in his hair.


Now that they've published the vulnerability, they can take advantage of that information in future plans.


Body scanners and current TSA processes also discriminate against transgender people. The body scanners are obvious (the agent has to push “male” or “female” and someone with “bulk” in their chest and crotch sets it off), but the X-Ray machines also alarm on dilators and lube so my bag is manually searched every time.

Sometimes I get a friendly agent and they send me on with minimal fuss, and sometimes they're jerks and try to find a way to stop me from going through security.


Interesting article about "flying while trans" in the NY times today: https://www.nytimes.com/2019/04/17/opinion/tsa-transgender.h...

Yet another reason to opt out of the body scanners. No one wants a pat down due to a machine deciding that their genitals are an "anomaly"


Yep. I fly multiple times a week. It’s an awful, dehumanizing experience every time but I refuse to allow my activities to be restricted because of that.



