Bit of a tangent -- I am not sure how our brains deal with ambiguity vs. sharp boundaries, but we can see a neural network change its tune quite a bit by adjusting this single number (i.e., the temperature parameter of the softmax function).
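For concreteness, here is a minimal sketch of what that knob does (standard temperature-scaled softmax; the toy logits are just made up for illustration):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature knob: T > 1 flattens the distribution
    (more 'ambiguity'), T < 1 sharpens it toward a hard argmax."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
print(softmax(logits, temperature=1.0))  # fairly peaked
print(softmax(logits, temperature=5.0))  # much flatter
print(softmax(logits, temperature=0.5))  # sharper
```

Same logits, three very different "opinions" -- all from changing one scalar.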
This trick has been used for knowledge distillation (KD) as well. Hinton et al. wrote a quite approachable paper on the topic: https://arxiv.org/pdf/1503.02531.pdf
It turns out one _needs_ nuance when doing KD: the teacher's softened outputs carry information about which wrong answers are plausible, and the student learns better from that than from hard labels. I am not sure if nuance and ambiguity as you describe them are the same thing. But, even though it does not _feel_ like it, good generalization over the lifetime of an organism might require dealing with nuance appropriately.
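A minimal sketch of the distillation term from the Hinton et al. paper -- cross-entropy between the teacher's and student's high-temperature distributions (toy logits and the T=4 value are illustrative, not from the paper's experiments):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, max-subtracted for stability."""
    scaled = [z / T for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between teacher soft targets and student
    predictions at temperature T (the distillation term)."""
    p = softmax(teacher_logits, T)  # teacher's softened 'dark knowledge'
    q = softmax(student_logits, T)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher = [6.0, 2.0, -1.0]
student = [4.0, 3.0, 0.0]
print(kd_loss(student, teacher, T=4.0))
```

At high T the teacher's near-zero probabilities on wrong classes become visible to the student -- that is the nuance the method depends on.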
A subject matter expert might have more to say about this. I cannot imagine that this hasn't been researched in humans.