> Why? If you're spending a significant chunk of your bits just transmitting data that could be effectively recreated on the client for free, isn't that wasteful? Sure, maybe the grains wouldn't be at the exact same coordinates, but it's not like the director purposefully placed each grain in the first place.
I'm sorry your tech isn't good enough to recreate the original. That does not mean you get to change the original because your tech isn't up to the task. Update your tech to better handle the original. That's like saying an image of the Starry Night doesn't retain the details, so we're going to smear the original to fit the tech better. No. Go fix the tech. And no, this is not fixing the tech. It is a band-aid to cover the flaws in the tech.
> I'm sorry your tech isn't good enough to recreate the original. That does not mean you get to change the original because your tech isn't up to the task.
The market has spoken, and it says that people want to watch movies even when they don't have access to a 35mm projector or a projector that can handle digital cinema packages, so nobody is seeing the original outside a theater.
Many viewers are bandwidth limited, so there are tradeoffs ... if this film grain stuff improves available picture quality at a given bandwidth, that's a win. IMHO, Netflix's blog posts about codecs seem to focus on bandwidth reduction, so I'm never sure whether users with ample bandwidth end up getting less quality; that's a valid question to ask.
The differences are actual film grain vs some atrocious RGB noise artificially added by the streamer. How is that unclear? What else could we be talking about?
In theory though, I don't see any reason why client-side grain that looks identical to the real thing shouldn't be achievable, with massive bandwidth savings in the process.
It won't be, like, pixel-for-pixel identical, but that was why I said no director is placing individual grain specks anyway.
Let's be clear: the alternative isn't "higher bandwidth", it's "aggressive denoising during stream encode". If the studio is adding grain in post, then describing that grain as a set of parameters will result in a higher-quality experience for the vast majority of those viewing it in this day and age.
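The "set of parameters" idea above can be sketched in a few lines. This is a hypothetical, heavily simplified model (white Gaussian noise with a box blur standing in for grain clump size); real codecs, e.g. AV1's film grain synthesis, instead use an autoregressive noise model with per-intensity scaling, and the function and parameter names here are invented for illustration:

```python
import numpy as np

def synthesize_grain(height, width, strength, grain_size, rng):
    """Regenerate statistically similar grain from a few parameters.

    Simplified, hypothetical model: white Gaussian noise, box-blurred
    to approximate the spatial correlation ("clump size") of film grain.
    """
    noise = rng.standard_normal((height, width))
    if grain_size > 1:
        kernel = np.ones(grain_size) / grain_size
        # blur rows, then columns, to give the noise spatial correlation
        noise = np.apply_along_axis(
            lambda r: np.convolve(r, kernel, mode="same"), 1, noise)
        noise = np.apply_along_axis(
            lambda c: np.convolve(c, kernel, mode="same"), 0, noise)
    return strength * noise

# Encoder side: denoise the frame, estimate {strength, grain_size},
# and transmit only those parameters alongside the clean video.
# Client side: regenerate grain locally and add it back.
rng = np.random.default_rng(0)
frame = np.full((64, 64), 128.0)          # a flat, denoised "frame"
grainy = frame + synthesize_grain(64, 64, strength=8.0, grain_size=2, rng=rng)
```

The grain the client sees is not the grain that was shot, only noise with matching statistics, which is exactly the tradeoff the thread is arguing about.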
If the original is an actual production shot on film, the film grain is naturally part of it. Removing it never looks good. If it is something shot on a digital camera and had grain added in post, then you can go back to before the grain was added and then do it client side without degradation. But you can never have identical when it originated on film. That's like saying you can take someone's freckles away and put them back in post just rearranged and call it the same.