In the old days, something similar to what you're calling "poisoned" was called "tainted" [0].
In those scenarios, tainted variables were ones read from untrusted sources, which could cause unexpected behaviour if made part of SQL strings, shell commands, or HTML pages assembled for users. Taint checking was a way of preventing potentially dangerous variables from being sent to vulnerable places.
In your scenario, poisoned variables function similarly, but with "untrusted" and "vulnerable" replaced by "secret" and "public" respectively. Variables read from privacy-compromising sources (e.g. screen size) become poisoned, and poisoned values can't be written to public locations like URLs.
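A minimal sketch of the idea in Python (all names here are hypothetical, just for illustration): secret values get wrapped, the wrapper propagates through derived values, and a "public" sink refuses anything still wrapped.

```python
class Poisoned:
    """Wraps a value from a secret source; derived values stay poisoned."""
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        # Arithmetic on a poisoned value yields another poisoned value.
        other_val = other.value if isinstance(other, Poisoned) else other
        return Poisoned(self.value + other_val)

def build_url(base, param):
    """A 'public sink': refuses to embed poisoned values."""
    if isinstance(param, Poisoned):
        raise ValueError("refusing to write poisoned value to a URL")
    return f"{base}?w={param}"

screen_width = Poisoned(1920)   # read from a privacy-compromising source
padded = screen_width + 16      # still poisoned after the computation
try:
    build_url("https://example.com/t", padded)
except ValueError as e:
    print(e)                    # the sink blocks the direct leak
```

Real taint-tracking systems do this at the language or runtime level (e.g. Perl's taint mode) rather than with explicit wrappers, but the propagation-and-sink-check structure is the same.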
There's still some potential to leak information without using the poisoned variables directly, based on conditional behaviour: some variation on

    if poisoned_screenwidth < poisoned_screenheight then load(mobile_css) else load(desktop_css)

is sufficient to leak some info about poisoned variables, without specifically building URLs with the information included.
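That implicit-flow leak can be made concrete with a small sketch (hypothetical names): no poisoned value ever reaches the URL, yet the choice of stylesheet reveals one bit about the secret comparison.

```python
def choose_css(width, height):
    # width and height are secret, but the branch outcome is observable:
    # whichever stylesheet gets fetched tells an observer whether
    # width < height, even though neither value was sent anywhere.
    return "mobile.css" if width < height else "desktop.css"

print(choose_css(360, 640))     # portrait: loads mobile.css
print(choose_css(1920, 1080))   # landscape: loads desktop.css
```

Closing this channel requires tracking taint through control flow (so that anything assigned or fetched under a poisoned branch condition is itself treated as poisoned), which is considerably more invasive than checking values at sinks.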
[0] https://en.wikipedia.org/wiki/Taint_checking