Extended brightness range rendering #4239
I think there is a core tension here between (what I'll call here) high-luma (brightness) and high-chroma (hue). Say a user is trying to output … (on a sufficiently good display). If you clamp chroma to the display colorspace and also leave luma uncapped, then on a display with a gamut between sRGB and Rec. 2020, the color would go out past the sRGB gamut, then "climb the gamut wall" and add brightness. For a Display P3 monitor/display, that same color above is now …
I'm not 100% sure about this, but my understanding is that there isn't actually an ambiguity there. To look at the color space visually, we can look for example at the xY slice of the xyY color space - again, according to my understanding: (diagram omitted)
While I think that the "luminance vs chrominance clamping" idea is useful conceptually, it does fall apart in terms of actual implementation. (Also, I'm going to live in ….)

All monitors ultimately have 3 primaries (red, green, and blue) that have some maximum range. Historically (pre-HDR), we would think of these as having a range of [0, 1]. In an HDR world, we can think of these as all being in a larger range, [0, H], where H is the display's HDR headroom.

The WebGPU spec currently says …
What we are clamping to is the "standard dynamic range" of the device. In practice, the "standard dynamic range" of the device is the color space with the device's primaries and with each color component being able to take values in the range [0, 1].

This per-component clamping is the same behavior that we have for handling out-of-gamut colors when drawing to a canvas and when uploading to a texture (and there are a bunch of WPT tests for that). It is also the only behavior that ensures color matching between extended and non-extended modes. If one wants fancier handling of out-of-range colors, then that's where fancier tonemapping algorithms (and metadata, etc.) come in.

Example: Suppose you have a canvas that has the solid color ….

Case 1: You have a very fancy HDR monitor with P3 primaries and an HDR headroom of ….
Case 2: You have a very fancy HDR monitor with P3 primaries and an HDR headroom of ….
Case 3: You have an sRGB monitor, no extended range.
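The per-component clamping described in the cases above can be sketched as follows. This is a minimal illustration, not the spec's algorithm: the color and headroom values are made up (the thread's concrete numbers were not preserved in this extraction), and conversion into the display's primaries (e.g. sRGB to P3) is skipped for simplicity.

```typescript
type RGB = [number, number, number];

// Clamp each component independently to [0, headroom].
// headroom = 1 models plain SDR; headroom > 1 models an HDR display's
// extended range, as discussed above.
function clampToDisplay(color: RGB, headroom: number): RGB {
  return color.map(c => Math.min(Math.max(c, 0), headroom)) as RGB;
}

// A hypothetical extended-range canvas color, brighter than SDR white (1.0).
const canvasColor: RGB = [4, 4, 4];

clampToDisplay(canvasColor, 8); // large headroom: the color fits, [4, 4, 4]
clampToDisplay(canvasColor, 2); // small headroom: clamped to [2, 2, 2]
clampToDisplay(canvasColor, 1); // sRGB display, no extended range: [1, 1, 1]
```

Per-component clamping like this preserves in-range colors exactly, which is what gives the color matching between extended and non-extended modes described above; anything smarter is a tonemapping policy.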
This is good to know. I had a misunderstanding when I was writing this spec text (without really knowing how HDR works), and we should update it. I'll file a new issue as V1.0: #4263
I should clarify - the chrominance-luminance diagram I drew above is not good as a clamping model. But I think it still correctly visualizes why there's no ambiguity between "redder red" and "brighter red".
A few disjointed thoughts:
All these color conversions are handled for us in the window server. The programming model is "applications give some colors to the window server in whatever color space they feel like; it's the window server's responsibility to show them well on the display."

Of course, it's possible for content to deliver a color to the window server that is outside of the gamut (or headroom) of the display, in which case the window server will use a tonemapping algorithm to show those colors as well as it can. That tone mapping algorithm isn't a simple clamping operation like you've described here. But that's fine, because the result of that operation isn't observable by the webpage. It's only observable to the user, via their eyeballs*.

So, if the WebGPU spec says "clamp this and don't clamp that", there's no way it's enforceable. The requirement here isn't that some algorithm outputs correct numbers; the only requirement is that different representations of the same color end up visually identical to the user (aka "color-matching").

So I kind of don't really understand why much of the above discussion needs to happen. All that's necessary is a single flag that says "stop clipping when you throw the colors over the wall to the system compositor; act as if the colors in my canvas have an infinite gamut with infinite headroom; the compositor will do something complicated and arbitrary with them, and that's okay."

(*And potentially observable via reftests, depending on your testing infrastructure - but not necessarily; it's totally possible to set up your testing infrastructure such that the page snapshots it generates are in an infinite-gamut, infinite-brightness XYZ color space. Hell, every pixel in your snapshot could be an arbitrary-precision mpfr float.)

(Also, much of what I say here is specific to the Mac. iOS works slightly differently, but not in ways that are material to the discussion.)
(Not a reply to Myles) I talked with Chris today and figured out that my diagram is not accurate, so I've hidden it. The conclusion is correct but the diagram doesn't provide the right intuition, in particular because 1.0 red doesn't have a luminance of 1.0. We'll have a better diagram soon. |
Meanwhile I found these diagrams useful: |
GPU Web 2023-08-16 (Atlantic-timed)
I've put together this page which might help visualize the exact behavior. Hopefully I can give a walk-through of this at an upcoming meeting.
Speaking as an employee at Adobe: we are developing an upcoming product that will provide web-hosted video editing and compositing features. Modern video supports a wide gamut of colors, and HDR is supported in almost all modern televisions and computer screens. Phones and DSLR cameras are capable of capturing HDR content, and users who produce video content require the ability to preview and edit those videos with the color fidelity intact.

Within our WebCodecs and WebGPU dependencies, we expect HDR pixels to flow through our processing pipeline. As we build web-hosted video applications, it is imperative that the user can visualize these colors, especially when performing color correction. I can preview stunningly beautiful HDR videos on YouTube on my MacBook Pro in both Chrome and Safari: https://www.youtube.com/watch?v=tO01J-M3g0U

From Apple: "Devices that combine wide-gamut color with 10-bit HDR capability can produce vibrant hues with nuanced shading, lending more realism and immediacy to an image, revealing more detail, and reducing artifacts like gradient 'banding' (often seen in images of the sky as it transitions from lighter to darker areas)."
Speaking as a lead at Adobe, HDR support in WebGPU is going to be crucial for Photoshop on the web and next-gen web apps. Unclipped sRGB linear works great for us and we're in full support of the proposed HDR APIs. This will help us close the gap to native apps in color, which is highly desired. |
I'm a lead on Lightroom Web at Adobe. What @ounterecker mentioned for Photoshop is also true for Lightroom. We need exactly the same support. |
WebGPU's spec currently indicates:
This is a very good default behavior, but some applications may want to use extended-range brightness.
The following proposal provides a mechanism for opting in to extended-range brightness.
https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md
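As a sketch of what the opt-in looks like, here is a hedged example based on the explainer and on the `toneMapping` member that later landed in `GPUCanvasConfiguration`. Treat it as illustrative rather than normative; `device` and `canvas` are assumed to already exist.

```typescript
// Assumed: `device` is a GPUDevice obtained from a GPUAdapter, and
// `canvas` is an HTMLCanvasElement on the page.
const context = canvas.getContext('webgpu') as GPUCanvasContext;

context.configure({
  device,
  format: 'rgba16float',             // a float format, so components can exceed 1.0
  colorSpace: 'srgb',                // values outside [0, 1] are extended sRGB
  toneMapping: { mode: 'extended' }, // opt in to extended-range brightness;
                                     // the default 'standard' clamps to SDR
});
```

With `mode: 'standard'` the two modes color-match for in-range values, so content that never exceeds the SDR range renders identically either way.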