
> only look good when processed

Ehh. I personally can’t stand all the post processing folks use to make their results look “magazine ready”. I think the most minimal transformation possible to map the data into 0xRRGGBB would look the best, ideally with a simple standardized algorithm that doesn’t allow for any “artistic license”.



Everything Webb sees is infrared light, which is invisible to humans, so you have to do some processing to produce a viewable image at all.


Yes, that is what the map does. Convert values from one domain to another. That is the purpose of mapping. The point is to make it as simple and consistent as possible.
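A minimal, consistent map like that fits in a few lines. Here’s a sketch assuming the raw frame is a bare 16-bit numpy array (real telescope data comes as calibrated FITS files, so the array here is a made-up stand-in):

```python
import numpy as np

# Hypothetical raw sensor frame: 16-bit counts (placeholder data,
# not real telescope output).
raw = np.array([[0, 1000], [30000, 65535]], dtype=np.uint16)

# Simplest consistent map: linearly rescale min..max to 0..255,
# then replicate into R, G and B to get 0xRRGGBB grayscale values.
lo, hi = raw.min(), raw.max()
gray = ((raw.astype(np.float64) - lo) / (hi - lo) * 255).round().astype(np.uint32)
rgb = (gray << 16) | (gray << 8) | gray
```

No curves, no per-band tweaking: every pixel goes through the same function, which is the “no artistic license” property being argued for.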


Are they not doing that? What is the origin of the idea that they sit on the images too long choosing a custom wavelength conversion formula? Is it just that their images look good?


I encourage you to try some raw file photography and processing. A bitstream from a sensor is not an image and there is no “correct” or “accurate” image from a captured signal.

Think of it like using a linear or log scale for a chart axis: neither is “more correct”, neither is taking “artistic license”.
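To make the chart-axis analogy concrete, here is a sketch of the same (made-up) intensity data under a linear and a log stretch. Both are monotone maps of the identical numbers; neither is “the” correct one:

```python
import numpy as np

# Astronomical intensities often span orders of magnitude (toy data).
counts = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])

# Linear stretch: faint sources vanish next to bright ones.
linear = counts / counts.max()

# Log stretch: equal brightness *ratios* get equal steps.
log_stretch = np.log10(counts) / np.log10(counts.max())
```

Under the linear stretch the four faintest values all land below 0.1; under the log stretch they are evenly spaced — same data, different but equally defensible presentation.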


Poor example, given many photographers shoot raw precisely because it gives them more room for artistic decisions in post. Obviously the standardized algorithm should have basic factors like gamma and general phase shifting incorporated, but the idea of being able to adjust the map’s delta between arbitrary adjacent inputs is of questionable benefit to the community. It’s akin to adjusting levels via curves with many points, and it’d be incorrect to say folks aren’t taking artistic liberties when they do that.


IR pictures would not be RGB, just black and white (or whatever palette you like). But yes, it would be possible. For example, Flir thermal cameras can also output the image as a spreadsheet. You can even choose the values as temperature (which is calculated) or as energy (what the sensor actually receives).


Light isn’t RGB. We just have receptors that react to certain wavelengths. I suspect the sensors on the telescope have a range of wavelengths they are sensitive to. It would be a straightforward translation to shift that range into the visible spectrum without varying the relative intensities to accentuate certain aspects of the image.


The IR may be in a very narrow band. The visible wavelengths have different colors because there are a range of wavelengths that correspond to different cones in our eyes that roughly match red:green:blue sensors. If you shift the IR frequency up into the visible range, you would just get a luminance image (like grayscale) centered on one visible wavelength like red.

False color imaging sometimes applies colors to different luminance levels or sometimes it takes multiple images at different wavelengths and assigns RGB values to each of those wavelengths. The results are informative but require some editorial / aesthetic decisions to produce the best results.
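A sketch of that second approach — assigning narrowband exposures to RGB channels. The three band arrays are placeholders; a real pipeline would align and calibrate them first:

```python
import numpy as np

# Three hypothetical narrowband exposures of the same field
# (assumption: already aligned and normalized to 0..1).
band_long  = np.random.default_rng(0).random((4, 4))  # longest wavelength
band_mid   = np.random.default_rng(1).random((4, 4))
band_short = np.random.default_rng(2).random((4, 4))  # shortest wavelength

# Chromatic ordering: longest band -> red, shortest -> blue.
# Stacking the bands as channels is the core "false color" step.
rgb = np.stack([band_long, band_mid, band_short], axis=-1)
```

The editorial part is everything around this line: which filters to use, how to stretch each band, and how to balance them against each other.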


That's not how vision works. You see an extremely post processed image that's extremely far away from the original light that hit your retinas. There's nothing at all privileged about shifting something into the visible spectrum directly and seeing junk. You're just making an image that your visual system isn't good at understanding. It's not pure, it's garbage. You would hallucinate things that aren't there, you would miss obvious things that are there, etc. For you to really comprehend something the transformation needs to be designed.


That's what they do already. Each wavelength that comes from different atoms gets a different color.


It would be nice if they stopped with the false color, and just scaled it to whatever color an astronaut might see from that point of view.


Given that these images are infrared, that wouldn't be much.


Besides the wavelength being outside of human perception, an astronaut wouldn't see anything due to the low photon flux. These pictures have a very high exposure time.


Just bring it up in an editor and drop the saturation to zero. That will take it back to a luminance map image.
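For what it’s worth, what “saturation to zero” means depends on the editor’s color model (HSL lightness, luma, plain average); one common convention is Rec. 709 luma, sketched here on a toy image:

```python
import numpy as np

# Tiny false-color image: one pure-red and one pure-green pixel
# (made-up values in 0..1).
rgb = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]])

# Rec. 709 luma weighting: every channel becomes the same weighted
# sum of R, G, B, leaving only a luminance map.
luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
gray = np.repeat(luma[..., None], 3, axis=-1)
```

Note the result is not the original sensor luminance: pixels that were equally bright in the data but were assigned different hues desaturate to different grays.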


You have 2 choices for these images, false colour or black and white. Anything else is false.



