
Content delivery costs a lot for streaming services. After content is produced, it’s basically the only remaining cost, so it’s not surprising that they go to extreme measures to reduce bitrate.

That’s why, presumably, Netflix came up with the algorithm for removing camera grain and adding synthetically generated noise on the client[0], and why YouTube Shorts were recently in the news for using extreme denoising[1]. Noise is random and therefore difficult to compress while preserving its pleasing appearance, so they really like the idea of serving everything denoised as much as possible. (The catch, of course, is that removing noise from live camera footage generally compromises the very fine detail captured by the camera as a side effect.)

[0] https://news.ycombinator.com/item?id=44456779

[1] https://news.ycombinator.com/item?id=45022184
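
To see why grain is so hostile to compression, here is a minimal sketch (nothing Netflix-specific; just numpy, with zlib standing in for a real codec):

    import zlib
    import numpy as np

    rng = np.random.default_rng(0)

    # A smooth synthetic "frame": a plain horizontal gradient, 8-bit, 512x512.
    gradient = np.tile(np.linspace(0, 255, 512), (512, 1)).astype(np.uint8)

    # The same frame with mild Gaussian "grain" (std 8) added.
    grainy = np.clip(gradient + rng.normal(0, 8, gradient.shape), 0, 255).astype(np.uint8)

    print("clean :", len(zlib.compress(gradient.tobytes(), 9)), "bytes")
    print("grainy:", len(zlib.compress(grainy.tobytes(), 9)), "bytes")

The grainy frame compresses dramatically worse even though it looks nearly identical to a viewer, which is exactly the trade-off that pushes services toward stripping the noise and re-synthesizing something statistically similar at playback time.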



So:

1. camera manufacturers and film crews both do their best to produce a noise-free image
2. in post-production, they add fake noise to the image so it looks more "cinematic"
3. to compress better, streaming services try to remove the noise
4. to hide the insane compression and make it look even slightly natural, the decoder/player adds the noise back

Anyone else finding this a bit...insane?
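
As a toy illustration of why steps 3 and 4 don't cancel out (a box blur standing in for a real denoiser; numpy only, not anything a real service runs):

    import numpy as np

    rng = np.random.default_rng(1)

    # "Camera" frame: fine vertical stripes (the detail) plus sensor noise.
    stripes = (np.indices((256, 256))[1] % 2) * 40.0 + 100.0
    frame = stripes + rng.normal(0, 5, stripes.shape)

    # Step 3: service-side denoise (here just a 3-tap box blur across the stripes).
    denoised = (np.roll(frame, -1, axis=1) + frame + np.roll(frame, 1, axis=1)) / 3

    # Step 4: client-side synthetic grain.
    regrained = denoised + rng.normal(0, 5, stripes.shape)

    def stripe_contrast(img):
        # Difference between the mean brightness of even and odd columns.
        return abs(img[:, ::2].mean() - img[:, 1::2].mean())

    print("original :", stripe_contrast(frame))
    print("denoised :", stripe_contrast(denoised))
    print("regrained:", stripe_contrast(regrained))

The blur wipes out about two thirds of the stripe contrast along with the noise, and adding fresh grain afterwards restores the texture but not the detail it destroyed.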


> camera manufacturers and film crews both do their best to produce a noise-free image

This is not correct: camera manufacturers and filmmakers engineer _aesthetically pleasing_ noise (randomized grains appear smoother to the human eye than clean uniform pixels). The rest is still as silly as it sounds.


Considering how much many camera brands boast about their super-low-noise sensors, I'd still say a very common goal is to have as little noise as possible and then let the director/DP/colorist add grain to their liking. Even something like ARRI's in-camera switchable grain profiles requires a low-noise sensor to begin with.

But yes, there are definitely also many DPs that like their grain baked-in and camera companies that design cameras for that kind of use.


In any case, luma noise is not at all a massive issue, and it is a mistake to say that crews do their best to produce a noise-free image. They do their best to produce the image that they want to see, and some amount of luma noise is not a deal-breaker. There are routinely higher priorities that win out over using the lowest ISO possible. These can also be financial considerations, e.g. if you don’t have enough lights.

It is only an issue in content delivery.


> randomized grains appear smoother to the human eye than clean uniform pixels

Does this explain why I dislike 4K content on a 4K TV? Some series and movies look too realistic, which in turn gives me an amateur-film feeling (like somebody shot a movie with a smartphone).


This sounds like what is called the soap opera effect:

https://en.wikipedia.org/wiki/Soap_opera_effect

Which is generally associated with excess denoising rather than with excess grain.


Are you sure? That seems to be about motion interpolation. Not even about smooth motion, but about it being interpolated. Here the concern seems to be about how the individual, still images look, not anything about the motion between them.


> some series and movies look too realistic, what in turn gives me a amateur film feeling

This comment that I replied to is almost a textbook description of the soap opera effect.

The interpolation adds more FPS, and a higher frame rate is traditionally a marker of TV production as opposed to film.


Avatar 2 was particularly egregious with the poor interpolation


I think I've seen this effect with TV content shot on cameras with a lower FPS than the TV outputs. It looks fake, bad, and interpolated, because it is.


Yes, just stop doing step 2 the way they're doing it, and if they _must_ have noise, set the parameters for step 4 directly instead.


> 1. camera manufacturers and film crews both do their best to produce a noise-free image 2. in post-production, they add fake noise to the image so it looks more "cinematic"

This is patently wrong. The rest builds on this false premise.


1.1: Google some low-light performance reviews of cinema cameras - you'll see that ISO noise is decreasing with every generation and that some cameras (like Sony's) have that as a selling feature.

1.2: Google "how to shoot a night scene" or something like that. You'll find most advice goes something along the lines of "don't crank the ISO up, add artificial lighting to brighten shadows instead". When given a choice, you'll also find cinematographers use cameras with particularly good low-light performance for dark scenes (that's why dark scenes in Planet Earth were shot on the Sony A7: despite the "unprofessional" form factor, it simply had the best high-ISO performance at the time).

2: Google "film grain effect". You'll find a bunch of colorists explaining why film grain is different from ISO noise and why and how you should add it artificially to your films.


I am reasonably confident that I know this subject area better than one could learn from three Google searches. “ISO noise” is not even a real term; you are talking about digital sensor noise, which can further be luminance noise or colour noise. Opinions about luma noise being “unaesthetic” are highly subjective, depending a lot on stylistic choices and the specific sensor, and some would say luma noise on professional sensors has not been ugly for more than a decade. Generally, the main reason I don’t think your comments should be taken seriously is that you are basically talking only about a subset of photography, digital photography, while failing to acknowledge it (works are still shot on film in 2025 and will be for years to come).


"ISO noise" is a thing I've heard on a few different film sets and it's a convenient shorthand for "the noise that becomes really apparent only when you crank up the gain". Now that I'm thinking about it, there's a good chance this is more of a thing where I'm from, since our native language doesn't have different words for grain and noise, so we have to diffentiate between them with a suffix, which we then incorrectly also use in English. I guess on a film set with mostly native English speakers, noise and grain is clear enough.

Next, no shit, aesthetics are subjective; I never said this is the one objective truth. I said this is a thing that many people believe, as evidenced by the plethora of resources talking about the difference between noise and grain, why tasteful grain is better than a completely clean image, and how to add it in post.

And finally, come on, it's obvious to everyone in this thread that I'm referring to digital, which is also not "just a subset": it's by far the biggest subset.

So idk what your point is. Most things are shot digitally. Most camera companies try to reduce sensor noise. Most camera departments try to stick to the optimal ISO for their sensor, both for dynamic range and for noise reasons, adjusting exposure with other means. In my experience, most people don't like the look of sensor/gain/iso/whatever noise. Many cinematographers and directors like the look of film grain and so they often ask for it to be added in post.

Besides the many/most/some qualifiers possibly not matching how you perceive this (which is normal; we're all different, watch different content, work in different circles...), where exactly am I wrong?


I can assure you that “ISO noise” is not a real term. I would take the word of whoever uses it with a grain of salt, movie set or not. Words have meanings.

> it's obvious to everyone in this thread that I'm referring to digital

It was blindingly obvious that you meant digital. That’s why I pointed this out. Without mentioning that it is only a concern with digital photography, your points become factually incorrect on more than one level; because the thread wasn’t talking specifically about digital photography, some of your points about noise don’t apply even if they were correct, which they aren’t, by your own admission that photography is subjective. Producing a noise-free image is not the highest priority for film crews (for camera manufacturers it is, but that’s because it means more flexibility in different conditions; it does not mean film crews will always prioritize whatever settings give them the lowest noise, as there are plenty of higher priorities), and in some cases they choose to produce an image with some noise despite the capability to avoid it.

Sorry, with your googling suggestions it just reads like a newbie’s take on the subject matter.


[flagged]


No need to put words in my mouth. First, the points were made in the context of photography as a whole, and in that context, without specifying that they only apply to digital, they are false. Second, even in digital they are false. That’s all.


This isn't strictly Netflix per se; it's part of the AV1 codec itself, e.g. https://github.com/BlueSwordM/SVT-AV1/blob/master/Docs/Appen...
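
A very stripped-down sketch of the idea (not the actual AV1 syntax or parameters, which are in that appendix): the encoder ships a few grain statistics, and the decoder regenerates spatially correlated grain from them and adds it on top of the denoised frame.

    import numpy as np

    rng = np.random.default_rng(42)

    def make_grain_template(size=64, ar_coef=0.35, std=6.0):
        # Spatially correlated grain from a tiny causal autoregressive filter.
        # AV1 uses a higher-order AR model with transmitted coefficients; one
        # made-up coefficient is enough to show the shape of the idea.
        g = rng.normal(0.0, std, (size, size))
        for y in range(1, size):
            for x in range(1, size):
                g[y, x] += ar_coef * (g[y, x - 1] + g[y - 1, x])
        return g

    def apply_grain(decoded_luma, template, strength=1.0):
        # Tile the template over the frame and scale it by a single factor.
        # (The real model scales grain as a piecewise-linear function of luma.)
        h, w = decoded_luma.shape
        reps = (h // template.shape[0] + 1, w // template.shape[1] + 1)
        grain = np.tile(template, reps)[:h, :w]
        out = decoded_luma.astype(np.float32) + strength * grain
        return np.clip(out, 0, 255).astype(np.uint8)

    # A flat gray frame standing in for heavily denoised decoder output.
    decoded = np.full((256, 256), 128, dtype=np.uint8)
    grained = apply_grain(decoded, make_grain_template())
    print(grained.std())  # nonzero: the flat frame now has synthetic texture

Since only the statistics travel in the bitstream, the grain costs almost nothing to transmit; the decoder pays a little compute to generate it.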


Yes, I was not strictly correct: it is a feature of AV1, but Netflix played an active role in its development, in rolling out the first implementation, and in AV1 codec development overall.


Prior codecs all had film grain synthesis too (at least back to H.264), but nobody used it, partly because it was obviously artificial and partly because it was too expensive to render, since you had to copy each frame to apply effects to it.


AFAIK, FGS support is the exception rather than the rule.

H.264 only had FGS as an afterthought, introduced years after the spec was ratified. No wonder it wasn’t widely adopted.

VP9, H.265 and H.266 don’t have FGS.


Did they remove it from the specs? Wouldn't surprise me. But since it's just extra metadata and doesn't affect encoding, you could still use the old spec.


I don’t think it was ever part of the spec. It would surprise me if it was, because once a spec has been finalized and voted on, changing it is complicated. Getting agreement the first time is already difficult enough.


It feels to me like there are two different things going on:

1. Video codecs like the denoise / compress / synthetic-grain approach because their purpose is to get the perceptually closest video to the original within a given number of bits. I think we should be happy to spend the bits on more perceptually useful information. Certainly I am happy with this.

2. Streaming services want to send as few bytes as they can get away with. So improvements like #1 tend to be spent on decreasing bytes while holding perceived quality constant rather than increasing perceived quality while holding bitrate constant.

I think one should focus on #2 and not be distracted by #1, which I think is largely orthogonal.


For #1, the problem with keeping grain in the compressed video is that it doesn't follow the motion of the scene, so it makes future frames much more expensive to code.
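
A quick way to see this (numpy, with zlib as a crude stand-in for a residual coder): even for a perfectly static scene, per-frame grain leaves a residual that is pure noise.

    import zlib
    import numpy as np

    rng = np.random.default_rng(7)

    # A static scene: the same gradient in two consecutive frames.
    scene = np.tile(np.linspace(0, 255, 256), (256, 1))

    def residual_bytes(frame_a, frame_b):
        # Compressed size of the frame-to-frame difference.
        resid = (frame_a - frame_b).astype(np.int16)
        return len(zlib.compress(resid.tobytes(), 9))

    noisy_a = scene + rng.normal(0, 8, scene.shape)  # fresh grain each frame
    noisy_b = scene + rng.normal(0, 8, scene.shape)

    print("clean residual :", residual_bytes(scene, scene), "bytes")
    print("grainy residual:", residual_bytes(noisy_a, noisy_b), "bytes")

Real codecs predict with motion compensation rather than a plain subtraction, but the grain is equally unpredictable either way, so it effectively has to be re-coded in every frame.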


I disagree, because 1) complete denoising while preserving fine detail is simply impossible, and 2) noise is a serious artistic choice, just like anamorphic flare, lens FOV with any distortion artifacts, chromatic aberration, etc. Even if it is synthetic film grain added in post, that was somebody’s artful decision; removing it and simulating noise on the client butchers the work.


>Content delivery costs a lot for streaming services.

The hard disk space to store an episode of a show is $0.01. With peering agreements, the bandwidth of sending the show to a user is free.


>With peering agreements bandwidth of sending the show to a user is free.

I'm not sure why you think this, but it's one of the oddest things I've seen today.

The more streams you can send from a single server, the lower your costs are.


Sure, but buying a server is not buying bandwidth. The point of my post is to counter the narrative that streaming video is very expensive.


There might also be copyright owners' requirements, e.g. a contract that limits the quality of the material.


I recall Netflix saying that streaming cost was nothing compared to all other costs.


I am curious to see their breakdown. It seems very counter-intuitive of them to invest so much into reducing bitrate if the cost of delivery is negligible. R&D and codec design efforts cost money, and running more optimized codecs and aggressive denoising costs compute.


Improved compression also saves on storage costs. We would have to hear them say the same about storage.


It probably is, but the bean counters do not want to hear this; they want to cut everything to the point where it's just above the limit consumers will accept before they throw in the towel and cancel their membership (ads, low-quality compression, etc.).



