The difference is actual film grain vs. some atrocious RGB noise artificially added by the streamer. How is that unclear? What else could we be talking about?
In theory, though, I don't see any reason why client-side grain that looks identical to the real thing shouldn't be achievable, with massive bandwidth savings in the process.
It won't be pixel-for-pixel identical, of course, but that's why I said no director is placing individual grain specks anyway.
Let's be clear: the alternative isn't "higher bandwidth", it's "aggressive denoising during stream encode". If the studio is adding grain in post, then describing that grain as a set of parameters will result in a higher-quality experience for the vast majority of people watching it in this day and age.
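To make the "set of parameters" idea concrete, here's a minimal sketch of what a client could do with a handful of grain parameters shipped alongside the stream, loosely in the spirit of AV1-style film grain synthesis. The function name, parameters, and scaling rule here are all hypothetical illustrations under my own assumptions, not any codec's actual API.

```python
import numpy as np

def synthesize_grain(frame_luma, strength=8.0, ar_coeff=0.85, seed=0):
    """Generate pseudo-random grain and apply it to a denoised luma plane.

    frame_luma : 2D uint8 array (the decoded, denoised frame)
    strength   : grain standard deviation in 8-bit code values
    ar_coeff   : autoregression coefficient controlling apparent grain size
    """
    rng = np.random.default_rng(seed)      # deterministic per-frame seed
    h, w = frame_luma.shape
    noise = rng.standard_normal((h, w))

    # Cheap AR(1) filter along rows so specks have spatial correlation;
    # real film grain is not per-pixel white noise.
    for x in range(1, w):
        noise[:, x] = ar_coeff * noise[:, x - 1] + (1 - ar_coeff) * noise[:, x]

    # Scale grain with brightness, a rough stand-in for a per-intensity
    # scaling table, so shadows and highlights carry different grain.
    scale = strength * (0.5 + frame_luma.astype(np.float32) / 510.0)
    grained = frame_luma.astype(np.float32) + noise * scale
    return np.clip(grained, 0, 255).astype(np.uint8)

# Example: apply synthetic grain to a flat gray frame.
frame = np.full((1080, 1920), 118, dtype=np.uint8)
out = synthesize_grain(frame, strength=6.0, ar_coeff=0.9, seed=42)
```

The point of the seed and the parametric form is that the encoder only has to send a few numbers per scene instead of spending bits trying to encode every individual grain speck.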
If the original is a production actually shot on film, the grain is naturally part of the image, and removing it never looks good. If it was shot on a digital camera and had grain added in post, then you can go back to before the grain was added and apply it client-side without degradation. But you can never get an identical result when it originated on film. That's like saying you can take someone's freckles away, put them back in post just rearranged, and call it the same.