
>"...Afros, braids, twists and other hairstyles popular among black women."

They literally say so in the subtitle. The discrimination exists before people enter the airport, and it comes in the form of what hairstyle to choose. This is partly done on a cultural basis, and partly because of different hair composition. But the important thing is, the machine does not differentiate on the basis of race; it's a machine that detects areas of suspect density that could presumably be hiding contraband. I am a white man with long, dense hair, and my hair triggers the machine almost every time. Not because I look like a hippie and Nixon is still running the show, but because I could hide things in my hair.

The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine.



People are free to choose their hair and hairstyles. The burden of not discriminating is on the government agency, which is funded and mandated (including, according to TFA, $100M for deploying the scanners) to not be discriminatory, including on characteristics related to race.

As u/tj-teej pointed out [0], the operators of this tech have the agency and means to improve it or mitigate its flaws, including prioritizing the training of algorithms to handle black hairstyles instead of shunting them off as an edge case, conveniently burdening a demographic that has historically been limited in power and influence.

To think of it another way: in many parts of the U.S., Asians are very much a minority. Asians more commonly have narrower eyes, which can interfere with face detection/recognition [1]. In a possible near future, when face detection becomes a "feature" in security tech as a means to expedite processing -- e.g. if your face is scanned and not found in a "Real ID" database, you're automatically put in a line for more invasive searching -- would Asian-Americans not have a legitimate case that the U.S. gov't has failed to improve its tech to the detriment of Asian-American citizens? Especially when it seems Asian countries have successfully mitigated the issue?

[0] https://news.ycombinator.com/item?id=19684290

[1] https://www.reuters.com/article/us-newzealand-passport-error...


Actually, I knew a guy who used to smuggle drugs into nightclubs in his afro, so targeting people on the basis of whether their hairstyle is physically capable of concealing contraband seems quite reasonable.

If you don't like it ... then you are free to change your hairstyle ...


Then given the sacredness/importance of human life relative to the value of someone's hair, there is no reasonable justification for not requiring every airline passenger to shave their head before entering security, if it reduces the probability of the scanner producing a false negative even by a small margin.

Come to think of it, there's no justification for allowing passengers to keep their clothes on as they pass through security.


That is the endgame. Together with every human spending their life in a small isolated cage. Perfect security.


Perhaps we should submit travelers to regular cavity searches, because I know someone who would smuggle contraband (including phones) into jail through a certain orifice.


I will never discriminate against someone for something they can't reasonably change. But discriminating against them for something they consciously choose is totally fine in my opinion.


Cool story bro.


I think the major difference here is immutable vs mutable characteristics. You can choose your hairstyle, but not what your face looks like.


Eyes are mutable. Double-eyelid surgery is cheap and common enough in South Korea for it to be a graduation present, and as many as 1/3 of young women have undergone cosmetic surgery [0]. But the overriding point is that Asian countries have developed face detection and recognition for their populations, because the utility and value of the software would be dubious without it. For the U.S. not to invest similarly in its own software would be open to legal challenge.

[0] https://www.npr.org/sections/parallels/2018/02/05/581765974/...


Yep, same here. As a white dude with dense, longish hair who travels a lot for work, I've noticed that if I have my hair in a ponytail, about 1 in 3 times I go through the body scanner they pat down the back of my head to check it.

I don't agree with your last point though. The discriminating factor is that the machine screening is designed around the average white male and female appearance. And that means that anyone outside that built-in assumption, whether it's a white dude with a thick ponytail or a black woman with an Afro, gets extra screening.

What the article is trying to highlight is that when we design such systems, we shouldn't just be designing them for the average, because that causes the system to discriminate against anyone outside that average.


I am reminded of the issues the Air Force had because cockpits were designed for the average pilot size. They did a study and found that the majority of pilots were not near the average size and had difficulty reaching switches, pedals, and such. [1]

1. https://www.thestar.com/news/insight/2016/01/16/when-us-air-...
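
For the curious, here's a minimal simulation sketch of why an "average pilot" barely exists. It assumes k independent, normally distributed measurements and a middle-30% "average" band per dimension (illustrative assumptions on my part, not data from the study):

    import random

    random.seed(0)

    # "Average" on one dimension = the middle ~30% of a standard normal,
    # i.e. |z| <= 0.385. Requiring it on all k independent dimensions
    # shrinks the qualifying share roughly as 0.3^k.
    BAND = 0.385

    def average_on_all(k):
        return all(abs(random.gauss(0, 1)) <= BAND for _ in range(k))

    n = 100_000
    for k in (1, 3, 10):
        hits = sum(average_on_all(k) for _ in range(n))
        print(f"{k} dimensions: {hits / n:.2%} of simulated pilots are 'average' on all")

With 10 dimensions essentially nobody qualifies, which is the point: design for the average and you fit almost no one.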


The onus is on you if you make that declaration. You don't know that the machine triggers more frequently on "black hair" because of improper testing; you're just asserting that. There are such things as objective differences that have objective implications.


If the situation is as described, then I would agree with the idea that this shouldn't be treated as an "issue".

Discrimination's common definition carries a negative connotation, but in this case the machine seems to legitimately be "discriminating" based on the ability to hide contraband; here, "discriminating" is synonymous with "differentiating".


The reason it is an issue is that the TSA is disproportionately singling out people for more invasive searches under the guise of, "we have this shitty technology that can't tell the difference between explosives and black hair or a 'turban', and that's what the procedure tells us to do." The technology may not be explicitly designed to single out black people, but that was the outcome of using this combination of technology and procedure.

To top it all off, we are all paying millions of dollars, via taxes, for some fancy security theater to tell us whether there is the "ability to hide contraband", which seems utterly insane.


Just because the technology isn't explicitly designed as a "single out black people" machine doesn't mean that it isn't disproportionately affecting them through how it is being used. And just because this imaging technology can't tell the difference between dense hair and explosives does not make someone's thick hair, or hair covering, "an area of suspect density" that should automatically be flagged for more invasive searches.

It's definitely splitting some really fine hairs to say "actually, it isn't discriminating against black people, it's discriminating against hair types that are highly correlated with being a black person, and it's totally acceptable that people with those hair types should be considered suspect / subject to further scrutiny", as if that is some sort of edge case that couldn't possibly have been thought about beforehand.

I absolutely don't think we should be defending a tech / procedure that only seems to negatively affect people, costs millions, and doesn't seem to have any positive effects.


>The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine.

Most of the people who ordered, designed, built and tested the machine are white.

They don't have to explicitly discriminate against blacks.

They can just go about their business as usual, designing stuff based on their own norms and cultural habits, and it will come out as naturally discriminating.

If all the whites doing the machine's design had those "certain hairstyles" in the same ratio as blacks, and were flagged as often when they tested it, they'd have considered this facet of the machine useless and removed it...


So which detection facets regarding areas likely able to conceal hidden contraband were removed due to being part of the white people testing group?


It could also be a matter of testing diversity. If the machine is tested against such dense hair from the beginning, the algorithms can be adjusted.

Kind of like Amazon's facial recognition [0], which really didn't work well on minorities, likely due to the training models used.

[0] https://medium.com/@bu64dcjrytwitb8/on-recent-research-audit...
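
To make "tested from the beginning" concrete, here's a minimal sketch of the per-group false-alarm audit that diverse testing implies; the groups and records are hypothetical, not TSA data:

    from collections import defaultdict

    # Hypothetical audit records. Each entry:
    # (demographic group, machine flagged?, prohibited item actually present?)
    scans = [
        ("A", True, False), ("A", True, False), ("A", False, False),
        ("B", False, False), ("B", True, False), ("B", False, False),
    ]

    clean = defaultdict(int)         # innocent passengers per group
    false_alarms = defaultdict(int)  # innocent passengers flagged anyway

    for group, flagged, carrying in scans:
        if not carrying:
            clean[group] += 1
            false_alarms[group] += flagged

    for group in sorted(clean):
        rate = false_alarms[group] / clean[group]
        print(f"group {group}: false-alarm rate {rate:.0%}")

A large gap between the per-group rates is exactly the signal that the training and test population wasn't representative.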


> the machine does not differentiate on the basis of race

I’m guessing it was predominantly white men who built this machine and white men who tested it. Reminds me of certain facial-recognition programs being (initially) unable to identify black faces. It may not be racism, but it is discrimination.


Who is it racist against? Surely it's white people primarily who suffer as they're more likely to be caught by the facial recognition?

Take the simplest case: recognition working better because lighter skin gives greater contrast is _not_ racism.

It's not 'unnecessary discrimination' either, as it arises out of the facts of how light works.

It's like saying that products with a weight limit that arises out of the materials used are "fattist".


I don't think you understand how light works.


You'd better explain it to me then. I'm not great on QFT, so keep it at graduate-level physics so I can follow along.


> it's a machine that detects areas of suspect density that could presumably be hiding contraband.

And literacy tests for voters simply ensures that our electorate is properly educated. Any side effects are a convenient freebie.


[flagged]


> Ah yes, the good ol' You start out in 1954 by saying, “Nigger, nigger, nigger.”

You haven't got a shred of evidence of malicious intent, so you just toss around accusations of racist conspiracies. That's insane and shameful.

(I happen to think the TSA is misguided and unnecessary, but that's irrelevant to the question of discrimination.)

> If you build discrimination into the system you don't have to explicitly discriminate

There's a problem with people smuggling dangerous items on to aircraft, and the system is trying to address that to prevent people of all races dying.

The system has to find things that are hidden, and that means it has to treat people differently based on how many places they have to hide items on their body.

That design of the system is a natural consequence of the parameters of the problem, not some vast conspiracy to oppress minorities.


You are accidentally correct in saying that there is no conspiracy to oppress minorities. Conspiracy implies that it's hidden. It isn't.

We have mountains of malicious intent established and chronicled to the point where the burden of proof is not on those downstream of that malice to substantiate it. That mountain is conveniently labeled "American policing and jurisprudence relating thereto, 1620-Present".

> There's a problem with people smuggling dangerous items on to aircraft

That problem is a vanishingly small rounding error, and the good-faith effort to fix it would be more air marshals. Instead, we get nonsense like this which, because the scanning devices look scary and dehumanize those who go through them, entrenches a threat in the minds of the populace--because a threat must be serious, otherwise why would we be putting ourselves through all this?

These tools exist to scare people into seeking solutions to the problem they purport to, and yet do not, solve.


While people clearly disagree about whether this is discrimination, everyone agrees that the system employed by the TSA is ineffective, humiliating security theater. Air marshals are better. Dog patrols are great.

Only in areas very close to a war zone would the current methods be necessary, and even then you would need multiple checks, manually going through every single item in luggage and backpacks. That is only really feasible at low-traffic airports, which airports serving destinations close to wars usually are.


> There's a problem with people smuggling dangerous items on to aircraft

We have a lot of evidence that these machines are minimally effective against most terrorist threats.


How many terrorist incidents has TSA credibly prevented? The answer is zero.


How can you count events that never happened? Especially ones deterred from even being attempted by the TSA's very existence.

Do you think a government agency would just publicly blast all the unsuccessful attempts that were prevented, without fear that future criminals can look at those attempts and do better next time by learning how to avoid making the same mistakes?

Disclaimer: I agree with most of the posters that the TSA is mostly just security theater. But just because I think so doesn't make the flawed parent argument any better.


There's a gulf between publishing some basic statistics and blasting the specific details of all unsuccessful terrorist attacks that were thwarted.

Surely those statistics can be verified by, say, the House Committee on Homeland Security without needing to leak the details of specific attempts.


So can we agree that the system (which I'm taking to mean the combination of tech, people and procedure) is doing a really shit job then, because, beyond not doing what it's supposed to do, it's disproportionately singling out certain races?

Perhaps if it's singling out dense hair rather than actual places where people are hiding something, without any evidence being presented that it's useful to do so, we should... I dunno, not use it? Definitely not spend millions of dollars on it.

I'm not sure what problem it's solving when the solution involves humans mindlessly following a procedure that lets a machine which happens to single out black hair and "turbans" decide who is suspect.


It's not "the machine". It's that people have genuinely variable ability to conceal dangerous items. You have to check the wheelchair of people who come in on a wheelchair, and you don't if they don't have one. That could be seen as unfair to the handicapped, but not checking the wheelchair isn't really an option.


I mean, it is an option.

The machine is one part of a hugely flawed system, and I think it's totally valid to argue that we shouldn't be substituting a machine for, I dunno, actual human judgement.

The machine in this case doesn't seem to be demonstrably solving any problems, but it is disproportionately flagging black people, people with hair coverings ("turbans" in the article), etc. as suspicious, and that flagging is used as part of a procedure to single those people out for further scrutiny. I'm not seeing a lot of defensible positions on why it should work that way, besides "eh, that's just how it is".


I don't read most of the comments here as defending the existence of the system, rather they're defending it from the accusation of deliberate malice. The system itself is deeply stupid security theatre, but was it deliberately designed to hassle black women or anyone else with dreadlocks?


Sure, and personally I'm not saying they literally set a bunch of engineers to work building a machine to explicitly target black people. I am saying that they probably didn't test as carefully against disproportionately flagging other demographics as they did against flagging the white men in business suits they faced when pitching investors and the senators / check-signers who bought these things.

Or maybe they just didn't think about testing the machines against the diversity of actual people who go through an airport daily vs. who was available in the office at the time to test with.

Maybe they honestly believe that folks with natural black hairstyles, or who wear a 'turban' or ponytail, could be disproportionately hiding something suspicious up there, and we need a machine to tell us to check them out more often.

Hell, maybe they just didn't care at all who got flagged, as long as it didn't inconvenience them too much and they got paid at the end of the day. I mean, they sure as shit don't mind the reality that this really does seem to be a lot of security theater with little benefit, besides the benefits to some people's wallets.

Whether you want to define any of that as deliberate malice, willful ignorance, laziness, the banality of evil, or something else, is entirely up to your own values I suppose.


> was it deliberately designed to hassle black women or anyone else with dreadlocks?

Does it really matter when that's the end result and there is no intention or effort to fix it? Harm is harm; whether the harm was intentional doesn't really matter when the harm is ongoing.

If your house is on fire, the first concern is to make it not be on fire; only after that does the question of intent have any relevance. And if somebody wilfully attempts to stop the process of making the house not be on fire, they should be considered intentionally harmful and no different from pyromaniacs until proven otherwise. Either way they should be removed post-haste.


How do you fix it? Program it to ignore dense hair on a black person and only alert if the subject is a white male?


Yes, clearly that is what everyone is saying. :-P

Maybe:

- Don't deploy a system that has this many false alerts (and that is what even the TSA calls them, they are not interested in finding dense hair)

- Stop continuing to use this stuff and spend money on it if we aren't seeing any benefit, and are seeing demonstrable harm.

- Don't replace human judgement with blind procedure that must be followed, when that procedure depends on a known flawed technology that isn't providing any demonstrable benefit.

- Accept that even if you want this kind of tech to exist, it isn't there yet, and you aren't going to "fix it in code" by adding a race variable.

Bottom line is, nobody is forcing us to use these machines / procedures. We aren't stuck with them, forced to figure out some way to patch the code or be doomed, so we don't have to be so reductive about how we fix it. We could just say this was a failed idea and throw it the hell out.


> The article is about the TSA, not Border Patrol. Looking for contraband is not the TSA's job.

You're absolutely right! Do you have a better word to describe the collection of types of items the TSA is empowered to and charged with searching for?


"Prohibited items." Not sure why this was downvoted. It is literally how the TSA refers to them on its site and answers the parent's question:

https://www.tsa.gov/travel/civil-enforcement

Save your downvotes for the TSA.


Explosives? Because the rest is BS -- e.g. toothpaste in quantities larger than 50 ml, etc...



Thank you for divining my true intentions, sagely one. Your argument inherently invalidates any possible explanation for mechanisms that produce differing population outcomes, unless that explanation is racism. Working backwards from a default explanation is a textbook signifier of ideological thinking.


>Looking for contraband is not the TSA's job.

It's certainly part of their job. In states with legal marijuana sales, there are frequently TSA drug-sniffing dogs at checkpoints.


Yeah, how dare they call it racism when it’s built into the system, and conveniently uses highly correlated features instead of race itself! We also don’t discriminate against men, only people called “John”, “David”, or “Jack”.


Counterpoint: if there's a known terrorist named "James Williams" and you flag everyone with that name going through the airport, yes, the flags are going to be HIGHLY correlated with males, but the feature wasn't chosen by gender; it was selected because of the increased risk of danger associated with that feature.


Maybe I missed attempted terrorism by malicious actors hiding explosives in their weaves? Has the TSA taken a break from not detecting 95% of weapons and explosives [0] to detect weapons hidden in hair?

The context to keep in mind is that the TSA has repeatedly demonstrated incompetence at their stated goal.

[0] https://www.theatlantic.com/politics/archive/2015/06/the-tsa...


Let's say the TSA used names more aggressively, for example checking everyone with the same name as anyone with a criminal record.

Now let's also assume African-Americans are disproportionately incarcerated, and therefore names associated with that ethnic group will be disproportionately represented in criminal records. I'm sure we can agree that the system would then be biased against one race.

The problem is that there isn't a single terrorist that caused this; in fact, there's no good reason to believe that the hairstyles mentioned in the article are used to hide contraband.
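
A minimal sketch with made-up numbers shows the mechanism: the rule never takes race as an input, yet the skew in the records it matches against transfers straight into the search rates:

    # Made-up match rates: the fraction of each group's travelers whose
    # names happen to match a criminal record (race is never an input).
    name_match_rate = {"group_x": 0.30, "group_y": 0.10}

    for group, rate in name_match_rate.items():
        print(f"{group}: flagged for extra screening with probability {rate:.0%}")

    ratio = name_match_rate["group_x"] / name_match_rate["group_y"]
    print(f"group_x travelers bear {ratio:.1f}x the search burden of group_y")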


"The fact that certain hairstyles are more popular within a racial group IS the discriminating factor, not the machine."

That is not a politically acceptable argument.



