I’m getting old. I can understand the words, but not the content. At this point, show me the disassembly so that I can understand what actually happens at the fundamental byte/CPU-instruction level, and then I can figure out what you’re trying to explain.
Variance doesn't affect generated code because it acts earlier than that: it determines whether code is valid or not, and in doing so prevents invalid code (code that would exhibit UB) from being compiled in the first place.
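As a minimal sketch of where variance does act (the function name is mine): `&'a T` being covariant is what lets a longer-lived reference stand in for a shorter-lived one, and that check is resolved entirely at compile time.

```rust
// Covariance in action: a `&'static str` is accepted where a `&'a str`
// is expected. The subtyping check happens at compile time and leaves
// nothing behind in the generated code.
fn print_once<'a>(s: &'a str) {
    println!("{s}");
}

fn main() {
    let forever: &'static str = "lives for the whole program";
    // `&'static str` is a subtype of `&'a str` because `&'a T` is
    // covariant, so this call type-checks.
    print_once(forever);
}
```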
The simplest example of incorrect variance → UB is that `&'a mut T` must be invariant in T. If it were covariant, you could take a `&'a mut &'static T`, coerce it to a `&'a mut &'b T` for some non-`'static` lifetime `'b` (since `'static: 'b` for all `'b`), write a `&'b T` through it, and then... kaboom: `'b` ends, but the original binding is still typed `&'static T`, and you've got a dangling reference.
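Here's that scenario written out as real Rust you can feed to the compiler (the names are mine). Because `&'a mut T` is invariant in T, the call below is rejected; if covariance were allowed, it would compile and the final `println!` would read freed memory:

```rust
fn overwrite<'a>(slot: &mut &'a str, new_val: &'a str) {
    *slot = new_val;
}

fn main() {
    let mut s: &'static str = "I live forever";
    {
        let local = String::from("I do not");
        // If `&mut T` were covariant in T, `&mut &'static str` could be
        // used as `&mut &'a str` here, letting a short-lived reference be
        // stored behind a binding that is still typed `&'static str`.
        // Invariance makes this a compile error instead
        // ("`local` does not live long enough", or similar).
        overwrite(&mut s, &local);
    } // `local` is dropped here
    println!("{s}"); // would be a dangling read if the above compiled
}
```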
`&'a mut T` can't be contravariant in T for a similar reason: if you start with a `&'a mut &'b T`, contravariance would let you treat it as a `&'a mut &'static T`, and then reading through it would hand you a `&'static T` derived from a `&'b T`, which is again kaboom territory.
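The same idea as compiler-rejected code (again, names are mine); the lifetime widening in the middle is exactly the cast contravariance would hand out for free:

```rust
// What contravariance of `&mut T` in T would permit: laundering a
// short-lived reference into a `&'static` one. Today the widening below
// is a compile error because `&mut T` is invariant in T.
fn launder<'a, 'b>(r: &'a mut &'b str) -> &'static str {
    // With contravariance, `&'a mut &'b str` would coerce to
    // `&'a mut &'static str` (since `&'static str` is a subtype of
    // `&'b str`). The compiler instead demands `'b: 'static` and
    // rejects the function.
    let widened: &'a mut &'static str = r;
    *widened
}
```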
So, variance’s effect is to guide the compiler: it prevents dangling references from occurring at runtime by rejecting the code that would produce them. Neither of the above issues is observable at runtime (barring compiler bugs) precisely because the compiler enforces variance correctly.
It doesn't have zero effect. Like everything about type systems, it helps prevent incorrect, and possibly unsound, code from compiling. So I guess the giant runtime effect is that you either have a program to run or not.