
Terrestrial cameras don't behave that way. They apply tone curves from the beginning. They have to pick a white balance. They have to cram an HDR signal into an 8-bit image. They have to decide exactly how to process the color: every film stock and digital camera renders color a bit differently, even among ones that are all trying to produce true-color images and have humans around who can compare the output against their own subjective perceptions. The simplest, most linear way to do it is probably wrong, due to mismatches between human color perception and the camera's sensor or film stock. E.g., human rods and cones almost certainly don't have the same frequency response as the color filters inside your camera, and that's just the beginning.
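
To make the "cram HDR into 8 bits" part concrete, here's a minimal sketch in Python/NumPy. The white-balance gains, the Reinhard-style highlight compression, and the plain gamma curve are all illustrative stand-ins; a real camera pipeline uses its own proprietary curves and matrices, but the shape of the problem is the same.

    import numpy as np

    def tonemap_to_8bit(linear_rgb, wb_gains=(2.0, 1.0, 1.6), gamma=2.2):
        """Map linear, scene-referred HDR values to an 8-bit display image.

        linear_rgb : float array of shape (H, W, 3), arbitrary dynamic range
        wb_gains   : illustrative per-channel white-balance multipliers
        gamma      : simple power-law tone curve (real cameras use fancier curves)
        """
        # White balance: scale each channel so a neutral surface comes out gray.
        balanced = linear_rgb * np.asarray(wb_gains)

        # Compress highlights with a Reinhard-style curve, then apply gamma.
        compressed = balanced / (1.0 + balanced)        # HDR -> [0, 1)
        encoded = np.clip(compressed, 0.0, 1.0) ** (1.0 / gamma)

        # Quantize to 8 bits; this is where most of the sensor's precision goes.
        return (encoded * 255.0 + 0.5).astype(np.uint8)

    # Example: gray patches spanning a 10-stop range all land between 0 and 255.
    patches = np.logspace(-5, 5, 11, base=2).reshape(1, 11, 1).repeat(3, axis=2)
    print(tonemap_to_8bit(patches)[0, :, 0])

Every choice in there (the gains, the curve, the clip point) changes the final colors, which is why two cameras pointed at the same scene never quite agree.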

Anyway, the post below shows an example comparing a "flat" color composite with one that has been tone-mapped and further processed. It uses Hubble data, but the subject is the same as one of the JWST images.

https://www.rocketstem.org/2015/04/20/how-astronomers-proces...

This video goes into some detail about the filters that were used for one of these JWST images:

https://www.youtube.com/watch?v=zAbI8bux-jM



