1. camera manufacturers and film crews both do their best to produce a noise-free image
2. in post-production, they add fake noise to the image so it looks more "cinematic"
3. to compress better, streaming services try to remove the noise
4. to hide the insane compression and make it look even slightly natural, the decoder/player adds the noise back
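For context on steps 3 and 4: this is not hypothetical, codecs like AV1 have a film grain synthesis tool where the encoder strips the grain and ships a small set of grain parameters, and the decoder regenerates statistically similar (but fake) grain at playback. Below is a minimal toy sketch of the idea in Python/NumPy; the blur-as-denoiser, the single strength parameter, and the function names are stand-ins for illustration, not the actual codec algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def encode_side(frame, sigma_spatial=1.5):
    """Toy stand-in for step 3: 'denoise' (here just a blur) before compression,
    and summarize the removed grain with a single strength parameter."""
    clean = gaussian_filter(frame, sigma_spatial)
    grain_strength = float(np.std(frame - clean))
    return clean, grain_strength  # clean frame gets encoded; strength travels as tiny metadata

def decode_side(clean, grain_strength, seed=0):
    """Toy stand-in for step 4: the player synthesizes fresh, statistically similar grain
    on top of the decoded (clean, heavily compressed) frame."""
    rng = np.random.default_rng(seed)
    grain = gaussian_filter(rng.normal(0.0, 1.0, clean.shape), 0.7)  # slight blur = correlated grain
    grain = grain / grain.std() * grain_strength                     # match the measured strength
    return np.clip(clean + grain, 0.0, 1.0)

# Usage: noisy source in -> clean frame plus metadata out -> re-grained frame on screen.
rng = np.random.default_rng(1)
frame = np.clip(0.5 + rng.normal(0.0, 0.05, (720, 1280)), 0.0, 1.0)
clean, strength = encode_side(frame)
displayed = decode_side(clean, strength)
```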
> camera manufacturers and film crews both do their best to produce a noise-free image
This is not correct: camera manufacturers and filmmakers engineer _aesthetically pleasing_ noise (randomized grains appear smoother to the human eye than clean uniform pixels). The rest is still as silly as it sounds.
Considering how heavily many camera brands boast about their super-low-noise sensors, I'd still say a very common goal is to have as little noise as possible and then let the director/DoP/colorist add grain to their liking. Even something like ARRI's in-camera switchable grain profiles requires a low-noise sensor to begin with.
But yes, there are definitely also many DPs that like their grain baked-in and camera companies that design cameras for that kind of use.
In any case, luma noise is not at all a massive issue, and it is a mistake to say that crews do their best to produce a noise-free image. They do their best to produce the image they want to see, and some amount of luma noise is not a deal-breaker. There are routinely higher priorities that take precedence over using the lowest ISO possible. Some of those can be financial considerations, for example if you don’t have enough lights.
> randomized grains appear smoother to the human eye than clean uniform pixels
Does this explain why I dislike 4K content on a 4K TV? Some series and movies look too realistic, which in turn gives me an amateur-film feeling (like somebody made a movie with a smartphone).
Are you sure? That seems to be about motion interpolation. Not even about smooth motion, but about it being interpolated. Here the concern seems to be about how the individual still images look, not anything about the motion between them.
> 1. camera manufacturers and film crews both do their best to produce a noise-free image
> 2. in post-production, they add fake noise to the image so it looks more "cinematic"
This is patently wrong. The rest builds on this false premise.
1.1: Google some low-light performance reviews of cinema cameras - you'll see that ISO noise decreases with every sensor generation and that some cameras (like Sony's) have that as a selling feature.
1.2: Google "how to shoot a night scene" or something like that. You'll find most advice goes along the lines of "don't crank the ISO up, add artificial lighting to brighten the shadows instead". When given a choice, you'll also find cinematographers using cameras with particularly good low-light performance for dark scenes (that's why dark scenes in Planet Earth were shot on the Sony A7 - despite the "unprofessional" form factor, it simply had the best high-ISO performance at the time).
2: Google "film grain effect". You'll find a bunch of colorists explaining why film grain is different from ISO noise, and why and how you should add it artificially to your films.
I am reasonably confident that I know this subject area better than one could learn from three Google searches. “ISO noise” is not even a real term; you are talking about digital sensor noise, which can further be luminance noise or colour noise. Opinions about luma noise being “unaesthetic” are highly subjective, depending a lot on stylistic choices and the specific sensor, and some would say luma noise on professional sensors has not been ugly for more than a decade. Generally, the main reason I don’t think your comments should be taken seriously is that you are basically talking only about a subset of photography, digital photography, while failing to acknowledge it (works are still shot on film in 2025 and will be for years to come).
"ISO noise" is a thing I've heard on a few different film sets and it's a convenient shorthand for "the noise that becomes really apparent only when you crank up the gain". Now that I'm thinking about it, there's a good chance this is more of a thing where I'm from, since our native language doesn't have different words for grain and noise, so we have to diffentiate between them with a suffix, which we then incorrectly also use in English. I guess on a film set with mostly native English speakers, noise and grain is clear enough.
Next, no shit aesthetics are subjective; I never said this is the one objective truth. I said this is a thing that many people believe, as evidenced by the plethora of resources talking about the difference between noise and grain, why tasteful grain is better than a completely clean image, and how to add it in post.
And finally, come on, it's obvious to everyone in this thread that I'm referring to digital, which is also not "just a subset"; it's by far the biggest subset.
So idk what your point is. Most things are shot digitally. Most camera companies try to reduce sensor noise. Most camera departments try to stick to the optimal ISO for their sensor, both for dynamic range and for noise reasons, adjusting exposure with other means. In my experience, most people don't like the look of sensor/gain/iso/whatever noise. Many cinematographers and directors like the look of film grain and so they often ask for it to be added in post.
Besides the many/most/some qualifiers possibly not matching how you perceive this (which is normal, we're all different, watch different content, work in different circles...), where exactly am I wrong?
I can assure you that “ISO noise” is not a real term. I would take the word of whoever uses it with a grain of salt, movie set or not. Words have meanings.
> it's obvious to everyone in this thread that I'm referring to digital
It was blindingly obvious that you meant digital. That’s why I pointed it out. Without mentioning that this is only a concern with digital photography, your points become factually incorrect on more than one level: because the thread wasn’t talking specifically about digital photography, some of your points about noise don’t apply even if they were correct (which they aren’t, by your own admission that photography is subjective). Producing a noise-free image is not the highest priority for film crews. For camera manufacturers it is, but that’s because it means more flexibility in different conditions; it does not mean film crews will always prioritize whatever settings give them the lowest noise, as there are plenty of higher priorities, and in some cases they choose to produce an image with some noise despite the capability to avoid it.
Sorry, but with your googling suggestions it just reads like a newbie’s take on the subject matter.
No need to put words in my mouth. First, the points were made in the context of photography as a whole, and in that context, without specifying that they only apply to digital, they are false. Second, even in digital they are false. That’s all.
> 1. camera manufacturers and film crews both do their best to produce a noise-free image
> 2. in post-production, they add fake noise to the image so it looks more "cinematic"
> 3. to compress better, streaming services try to remove the noise
> 4. to hide the insane compression and make it look even slightly natural, the decoder/player adds the noise back
Anyone else finding this a bit...insane?