Any physical theory except for classical wave mechanics...
(this is not an objection to the paper, which claims relevance only for a highly technical class of models called "general probabilistic theories," which build in enough assumptions to make the claim true, but rather a warning against interpreting the title apart from the abstract)
They are not all that different until two particles or two strings are involved. In quantum mechanics, "either one string is vibrating or the other" is a state the strings can be in, and a state that can be added to or subtracted from other states. In classical mechanics you can say "half of the energy is in a low-frequency vibration and half is in a high-frequency vibration," which is roughly equivalent (mathematically, not conceptually) to a single particle being in a superposition of a high and a low state. But as for two strings, either one is vibrating or the other: you might not know which one, but one of them is.
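To put that concretely (a sketch in standard ket notation; the A/B labels for the two strings are mine), quantum mechanics allows the joint state

    |\psi\rangle = \tfrac{1}{\sqrt{2}}\,\bigl(|\text{vibrating}\rangle_A |\text{still}\rangle_B + |\text{still}\rangle_A |\text{vibrating}\rangle_B\bigr)

which is a single state of the pair, one that can itself be added to or subtracted from other states. Classically, the best you can write down is a probability assignment p(A vibrating) = p(B vibrating) = 1/2 over configurations in which one definite string is vibrating.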
When waves overlap on a guitar string, they are two different waves occupying the same region of the string. A wave superposition in an atom is a single wave whose position state is more complex than being at a single location.
Superposition is a mathematical conception, while the wave in a physical guitar string is a physical thing.
Superposition is the mathematical description of _our_ uncertainty about the position of a particle.
The physical wave in a guitar string redistributes mass along the string and carries energy. The physical wave does work, while a superposition does nothing, because a superposition is used only for predictions. Once we know the location or other properties of a particle after a measurement, its superposition is meaningless.
They share nothing in the physical world, while they share a lot in the mathematical description of their behavior.
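For what it's worth, the "predictions only" point is just the standard Born rule (nothing beyond textbook quantum mechanics): the superposition \psi fixes the measurement statistics,

    P(x) = |\psi(x)|^2, \qquad \int |\psi(x)|^2 \, dx = 1

and once a position measurement has returned a definite x, the pre-measurement \psi no longer describes the particle.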
However, underappreciated as it is, statistical theories are efficient maps, not territories. Quantum mechanics is a clever way to maximize knowledge and minimize unknowns, but it does not represent reality itself, only a map of our knowledge, and the famous paradoxes then look like artifacts of those leaky abstractions. Theories with actual causal content would have to explain what is going on, i.e. hidden-variable theories, but the search space is huge and the payoff so far small.
The double-slit experiment is still a real mystery, though.
If some readers have cognitive resources to allocate, though, I would much prefer they analyze this extremely overlooked paper, which is the first to make canonical quantization of gravity work with quantum mechanics:
https://arxiv.org/pdf/gr-qc/9706055.pdf
This seems like an asserted distinction with no content.
> statistical theories are efficient maps ... Quantum mechanics ... does not represent reality itself
I don't see how this is true at all. It just seems like a quibble over the difference between something being a description of a phenomenon vs. something making predictions correlated with experiment. What is the difference?
Seems like you are smuggling in an assumption like "we know reality isn't really this way." But you don't.
A paper that spends most of its introduction on opinions about permissible modes of reasoning under social convention is starting out badly. Every point it makes may be necessary, but it would be better to arrive at them than to start out with them.
It is needed given the strong cognitive biases at play here, e.g. https://en.wikipedia.org/wiki/Sunk_cost#Fallacy_effect
What prevents progress towards quantum gravity has more to do with social inertia than with technicalities, tbh.
We see a length contraction, but that aside, I've always thought of this question as super interesting.
From the POV of the photon, for example, the universe is basically flat: a 2D, timeless plane. From this POV the double-slit experimental results seem much clearer; it's more intuitive that the photon can both interfere with itself and appear in just one spot, since it experiences the slits and the screen simultaneously.
From the photon's POV, there's no 'delayed choice quantum eraser'.
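For reference, the "flat, timeless" picture is just the v → c limit of the standard Lorentz formulas (strictly speaking a photon has no rest frame, so this is a limit, not an actual frame):

    L = L_0 \sqrt{1 - v^2/c^2}, \qquad \Delta\tau = \Delta t \,\sqrt{1 - v^2/c^2}

Both lengths along the direction of motion and elapsed proper time go to zero as v → c, so in that limit emission at the slits and absorption at the screen are "simultaneous."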
I believe the main issue with this line of reasoning (that length/time contraction is instrumental to entanglement) is that quantum phenomena like the double slit also work with larger particles that are not travelling anywhere near the speed of light.
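Concretely, interference is governed by the de Broglie wavelength (standard relation; the C60 figures below are from the well-known Arndt et al. 1999 fullerene experiment):

    \lambda = \frac{h}{p} = \frac{h}{m v}

This is nonzero for any massive particle at any speed: C60 molecules at a couple of hundred m/s, with \lambda of a few picometers, still show double-slit fringes, so the effect cannot hinge on relativistic contraction.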