> Stars (along with line art) are one of the most blatant tests for gamma correct rendering — if everything shimmers as you pan around, they are filtering in gamma space instead of linear space. It should all be rock solid.
Is it something to do with how, as the underlying values shift between pixels (as you pan around), they increase/decrease pixel brightness inconsistently if you're mapping the intrinsic values to some curve before calculating the appropriate pixel brightness? (I'm unfamiliar with "gamma correct rendering".)
If a star is halfway between two pixels, its brightness should be distributed evenly between both pixels so that the total emitted light from that area of your screen is the same as if it was all on one pixel. Applying a nonlinear function to the values will usually mess that up.
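A minimal sketch of that conservation argument, assuming a pure power-law gamma of 2.2 (real display transfer functions like sRGB also have a small linear segment near black, omitted here):

```python
# A star of total linear intensity 1.0, split evenly across two pixels,
# should emit the same total light as when it sits on a single pixel.
GAMMA = 2.2

def encode(linear):  # linear light -> gamma-compressed pixel value
    return linear ** (1 / GAMMA)

def decode(pixel):   # gamma-compressed pixel value -> linear light
    return pixel ** GAMMA

star = 1.0  # total linear intensity of the star

# Correct: filter (split) in linear space, gamma-encode afterwards.
linear_pixels = [0.5 * star, 0.5 * star]
emitted_linear = sum(linear_pixels)  # 1.0 — total emitted light conserved

# Wrong: filter the already gamma-encoded value.
encoded = encode(star)  # 1.0 happens to encode to itself
gamma_pixels = [0.5 * encoded, 0.5 * encoded]
emitted_gamma = sum(decode(p) for p in gamma_pixels)  # ≈ 0.44 — the star dims

print(emitted_linear, emitted_gamma)
```

The dip in total emitted light appears exactly when the star straddles a pixel boundary, so as you pan, each star pulses dimmer and brighter — the shimmer described above.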
Human perception of brightness is not linearly proportional to the amount of light emitted (if you're in a dark room with 1 candle and light 1 more, it makes a big difference, but 100 → 101 candles is imperceptible; physically, both cases add the same amount of light).
The relationship between perceived brightness and physical amount of light is approximately quadratic (~doubling the amount of light is perceived as one incremental brightness step), which is called the gamma curve.
So in graphics you have to choose whether your numeric brightness values are on a scale relevant to humans (gamma compressed values), or whether they're on a scale that better models physical properties of light (AKA linear light).
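A sketch of converting between the two scales, assuming a simple gamma ≈ 2.2 power law (real sRGB adds a small linear segment near black, skipped here for clarity):

```python
GAMMA = 2.2

def to_linear(v):  # gamma-compressed value in [0, 1] -> linear light
    return v ** GAMMA

def to_gamma(v):   # linear light in [0, 1] -> gamma-compressed value
    return v ** (1 / GAMMA)

# A pixel value of 0.5 looks "half bright" to a human,
# but physically emits only ~22% of the display's full light output:
print(to_linear(0.5))  # ≈ 0.218
```

The rule of thumb: store and display gamma-compressed values (they spend precision where humans can see it), but convert to linear before doing any arithmetic that models light, such as blending, blurring, or resampling.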
This is generally confusing, and a lot of software gets it wrong (it doesn't correct for the gamma curve where needed). For example, try blurring RGB red and green together. In many programs this gives you a brown color, which is not the color of mixing red and green light, but an artifact of doing math on gamma-compressed values: √(A + B) ≠ √A + √B.
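The red/green example can be reproduced in a few lines, again assuming a plain 2.2 power-law gamma as an approximation of sRGB:

```python
GAMMA = 2.2

def to_linear(v):  # gamma-compressed -> linear light
    return v ** GAMMA

def to_gamma(v):   # linear light -> gamma-compressed
    return v ** (1 / GAMMA)

red   = (1.0, 0.0, 0.0)  # gamma-encoded sRGB values
green = (0.0, 1.0, 0.0)

# Naive blur: average the gamma-encoded values directly.
naive = tuple((a + b) / 2 for a, b in zip(red, green))
# (0.5, 0.5, 0.0) — a dark, muddy olive/brown

# Gamma-correct blur: decode to linear light, average, re-encode.
correct = tuple(to_gamma((to_linear(a) + to_linear(b)) / 2)
                for a, b in zip(red, green))
# ≈ (0.73, 0.73, 0.0) — a noticeably brighter yellow

print(naive, correct)
```

The naive result is darker because the averaging happened on the square-root-like gamma scale, exactly the √(A + B) ≠ √A + √B problem above.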
> The relationship between perceived brightness and physical amount of light is approximately quadratic (~doubling the amount of light is perceived as one incremental brightness step), which is called the gamma curve.
This sounds logarithmic to me, similar to the decibel system for audio loudness perception