Extended brightness range rendering #4239

Open
ccameron-chromium opened this issue Jul 19, 2023 · 13 comments · May be fixed by #4500
Labels
api WebGPU API

Comments

@ccameron-chromium
Contributor

WebGPU's spec currently indicates:

During presentation, the chrominance of color values outside of the [0, 1] range is not to be clamped to that range; extended values may be used to display colors outside of the gamut defined by the canvas' color space’s primaries, when permitted by the configured format and the user’s display capabilities. This is in contrast with luminance, which is to be clamped to the maximum standard dynamic range luminance.

This is a very good default behavior, but some applications may want to use extended-range brightness.

The following proposal provides a mechanism for opting in to extended-range brightness.
https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md
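For concreteness, here is a rough sketch (TypeScript, assuming `@webgpu/types`) of what the opt-in could look like on the canvas configuration. The `toneMapping: { mode: 'extended' }` member follows the direction of the explainer and the follow-up PR #4500, but the exact names are assumptions here, not settled API.

```ts
// Rough sketch only: `toneMapping` / "extended" are assumed names following
// the linked explainer and PR #4500, not finalized API.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const context = canvas.getContext('webgpu') as GPUCanvasContext;

const adapter = await navigator.gpu.requestAdapter();
const device = await adapter!.requestDevice();

context.configure({
  device,
  format: 'rgba16float',              // a format that can carry values > 1.0
  colorSpace: 'srgb',                 // canvas color space; values may extend past [0, 1]
  toneMapping: { mode: 'extended' },  // opt in to extended-range brightness
  alphaMode: 'premultiplied',
});
```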

@kdashg
Contributor

kdashg commented Jul 26, 2023

I think there is a core tension here between what I'll call high-luma (brightness) and high-chroma (hue).

Say a user is trying to output (on a sufficiently good display) color(rec2020 1.0 0.5 0.5). Well, they might try to lean on a promise of unclamped chrominance and write rgb(1.19 0.446 0.533) [1]. Because we cap luminance, we can choose to interpret this as "redder than red", or "go to 119% of the distance in the direction of the red primary". If we also uncap luminance/brightness though, suddenly it's ambiguous whether srgb(red=1.19) goes out past srgb-primary-red by ~20% (roughly towards rec2020-primary-red), or represents 20% brighter (more photons) of (less-saturated-red) srgb-primary-red.

If you clamp chroma to the display colorspace and also leave luma uncapped, then a display with a gamut between srgb and rec2020 here would go out past the srgb gamut, and then "climb the gamut wall" and add brightness. For a display-p3 monitor/display, that same color above is now color(display-p3 1.1 0.5 0.55). Do we then spill into HDR for this 110% output of the display-p3-red-primary? This will give us not just three different colors, but three different brightnesses as well.

Footnotes

  1. https://colorjs.io/apps/convert/?color=color(rec2020%201.0%200.5%200.5)&precision=3
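
As a small aside, the footnote's numbers can be reproduced with colorjs.io (the library behind the converter linked above); a minimal sketch, with approximate output:

```ts
// Reproduce the footnote's conversion with colorjs.io (https://colorjs.io).
import Color from "colorjs.io";

const c = new Color("rec2020", [1.0, 0.5, 0.5]);

// Convert to sRGB without gamut mapping; the red channel lands above 1.
const srgb = c.to("srgb");
console.log(srgb.coords.map(v => v.toFixed(3))); // ≈ ["1.190", "0.446", "0.533"]
```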

@kainino0x
Contributor

kainino0x commented Jul 27, 2023

I'm not 100% sure about this, but my understanding is that there isn't actually an ambiguity there. For example,
color(rec2020-linear 1 0 0) = color(srgb-linear 1.66 -0.1 0).
The negative blue value counterbalances the brighter R value, so you still get a luminance of 1.0.

To visualize the color space, we can consider, for example, the xY slice of the xyY color space - again, according to my understanding:

EDIT: this diagram is not quite right

[diagram: chrominance-luminance sketch on the xY slice (later hidden; see the follow-up comments below)]

@ccameron-chromium
Contributor Author

While I think that the "luminance vs chrominance clamping" idea is useful conceptually, it does fall apart in terms of actual implementation. (Also, I'm going to live in srgb-linear, p3-linear, and rec2020-linear just to make the math easier here).

All monitors ultimately have 3 primaries (red, green, and blue) that have some maximum range. Historically (pre-HDR), we would think of these as having a range of [0, 1], with (1,1,1) being white.

In an HDR world, we can think of these as all being in [0, H] for some HDR headroom H>=1. The value (1,1,1) is still white, and now the value (H,H,H) is the maximum producible brightness. (Deep in the guts of the display device, everything is still normalized to [0, 1], and it's actually the case that [1/H, 1/H, 1/H] is white ... but that's a very-not-fun space to work in, and often only exists ephemerally in a display controller's color management pipeline anyway).
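As a quick worked example of that normalization (H = 2 chosen purely for illustration):

```ts
// Worked example of the headroom normalization described above.
// Assumption: HDR headroom H = 2, purely for illustration.
const H = 2;

// In the extended working space: white is (1, 1, 1), peak brightness is (H, H, H).
const white = [1, 1, 1];
const peak  = [H, H, H];

// Deep in the display pipeline everything is renormalized to [0, 1],
// so white becomes (1/H, 1/H, 1/H) and peak becomes (1, 1, 1).
const toDeviceRange = (c: number[]) => c.map(v => v / H);
console.log(toDeviceRange(white)); // [0.5, 0.5, 0.5]
console.log(toDeviceRange(peak));  // [1, 1, 1]
```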

The WebGPU spec currently says

During presentation, the chrominance of color values outside of the [0, 1] range is not to be clamped to that range; extended values may be used to display colors outside of the gamut defined by the canvas' color space’s primaries, when permitted by the configured format and the user’s display capabilities. This is in contrast with luminance, which is to be clamped to the maximum standard dynamic range luminance.

What we are clamping to is the "standard dynamic range" of the device. In practice, the "standard dynamic range" of the device is the color space with the device's primaries and with each color component being able to take values in the range [0, 1].

This per-component clamping is the same behavior that we have for handling out-of-gamut colors when drawing to a canvas and when uploading to a texture (and there are a bunch of WPT tests for that). It is also the only behavior that ensures color matching between extended and non-extended modes. If one wants more fancy handling of out-of-range colors, then that's where more fancy tonemapping algorithms (and metadata, etc) come in.

Example:

Suppose you have a canvas that has the solid color color(rec2020 1.0 0.5 0.5).

Case 1: You have a very fancy HDR monitor with P3 primaries and an HDR headroom of H=2. You do not specify that you want extended range.

  • First, you convert to the color space of the display. Our fancy monitor has exactly P3 primaries, so this means we now have color(p3-linear 1.25 0.21 0.26).
  • Next, because you didn't specify extended range, this is clamped to the "standard dynamic range" of the device, which means clamping each component to [0, 1], so you get color(p3-linear 1.00 0.21 0.26).

Case 2: You have a very fancy HDR monitor with P3 primaries and an HDR headroom of H=2. You do specify that you want extended range.

  • First, like last time, you get color(p3-linear 1.25 0.21 0.26).
  • And that's it. You don't do anything else. This is the color you show.

Case 3: You have an sRGB monitor, no extended range.

  • First, convert to your monitor's (sRGB) primaries to get color(srgb-linear 1.49 0.17 0.25)
  • Next, clamp to the "standard dynamic range" of the device, so you get color(srgb-linear 1.00 0.17 0.25).
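
To make the three cases concrete, here is a minimal sketch of the presentation-time step, using colorjs.io for the color-space conversion. The display's primaries and the extended-range flag are passed in explicitly only for illustration; this is not how an implementation is structured, just the per-component math described above.

```ts
// Illustrative sketch of the presentation behavior in the cases above.
// Assumption: colorjs.io is used for the color-space conversion.
import Color from "colorjs.io";

function present(
  canvasColor: Color,      // color coming out of the canvas
  displaySpace: string,    // display's linear primaries, e.g. "p3-linear"
  extendedRange: boolean   // did the app opt in to extended range?
): number[] {
  // 1. Convert to the display's (linear) primaries, without gamut mapping.
  const coords = canvasColor.to(displaySpace).coords as number[];

  // 2. Without extended range, clamp each component to the display's
  //    standard dynamic range, i.e. [0, 1] per component.
  return extendedRange
    ? [...coords]
    : coords.map(v => Math.min(Math.max(v, 0), 1));
}

const src = new Color("rec2020", [1.0, 0.5, 0.5]);
present(src, "p3-linear", false);    // ≈ [1.00, 0.21, 0.26]  (Case 1)
present(src, "p3-linear", true);     // ≈ [1.25, 0.21, 0.26]  (Case 2)
present(src, "srgb-linear", false);  // ≈ [1.00, 0.17, 0.25]  (Case 3)
```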

@kainino0x
Contributor

While I think that the "luminance vs chrominance clamping" idea is useful conceptually, it does fall apart in terms of actual implementation.

What we are clamping to is the "standard dynamic range" of the device. In practice, the "standard dynamic range" of the device is the color space with the device's primaries and with each color component being able to take values in the range [0, 1].

This is good to know. I had a misunderstanding when I was writing this spec text (without really knowing how HDR works) and we should update it. I'll file a new issue as V1.0: #4263

@kainino0x
Contributor

I should clarify - the chrominance-luminance diagram I drew above is not good as a clamping model. But I think it still correctly visualizes why there's no ambiguity between "redder red" and "brighter red".

@litherum
Contributor

litherum commented Aug 16, 2023

A few disjointed thoughts:

Suppose you have a canvas that has the solid color color(rec2020 1.0 0.5 0.5). ... First, you convert to the color space of the display. ... Next, because you didn't specify extended range, this is clamped to the "standard dynamic range" of the device

All these color conversions are handled for us in the window server. The programming model is "applications give some colors to the window server in whatever color space they feel like; it's the window server's responsibility to show them well on the display." Of course, it's possible for content to deliver a color to the window server that is outside of the gamut (or headroom) of the display, in which case the window server will use a tonemapping algorithm to show those colors as well as it can. That tone mapping algorithm isn't a simple clamping operation like you've described here.

But that's fine, because the result of that operation isn't observable by the webpage. It's only observable to the user, via their eyeballs*. So, if the WebGPU spec says "clamp this and don't clamp that" there's no way it's enforceable. The requirement here isn't that some algorithm outputs correct numbers; the only requirement is that different representations of the same color end up visually identical to the user (aka "color-matching").

So I kind of don't really understand why much of the above discussion needs to happen. All that's necessary is a single flag that says "stop clipping when you throw the colors over the wall to the system compositor; act as if the colors in my canvas have an infinite gamut with infinite headroom; the compositor will do something complicated and arbitrary with them, and that's okay."


(*And potentially observable via reftests, depending on your testing infrastructure - but not necessarily; it's totally possible to set up your testing infrastructure such that the page snapshots it generates are in infinite-gamut-infinite-brightness XYZ color space. Hell, every pixel in your snapshot could be an arbitrary precision mpfr float.)

(Also, much of what I say here is specific to the Mac. iOS works slightly differently, but not in ways that are material to the discussion.)

@kainino0x
Contributor

(Not a reply to Myles)

I talked with Chris today and figured out that my diagram is not accurate, so I've hidden it. The conclusion is correct but the diagram doesn't provide the right intuition, in particular because 1.0 red doesn't have a luminance of 1.0. We'll have a better diagram soon.

@kdashg
Contributor

kdashg commented Aug 30, 2023

GPU Web 2023-08-16 (Atlantic-timed)
  • MM: lot of discussion has been what gets clamped where in what colorspace in the display pipeline. For Apple, that's out of the browser - system compositor does it. Probably true for most browsers. If compositor decides it wants to change its tone mapping algorithm it can just do that. Nothing API-exposed about it. Since this is non-observable by web content, don't think we should discuss it. Limit the discussion to the single boolean, "limit to SDR or not".
  • KG: don't think we can discuss the API change without discussing the broader topic, including color matching with WebGL and other things in the page.
  • CC: imp't that everyone agreeing to the change agrees in large part about what should happen. But color matching should be prescribed where it can be. Have an example, primaries of monitor are deep, and content was this - this is what you should expect to see. Assume non-normative, informative, maybe not even in the spec. Everyone should feel comfortable with what's going on there. Yes to both of you. Specifics, though, sit outside the spec.
  • MM: we won't normatively spec how tone mapping works - done by other libraries. To understand the display pipeline, need to discuss how it works, and that's OK. But don't want to say this color is clamped to this value in this colorspace.
  • CC: nice if we could have - this will color match this other HDR thing. Few examples where that's possible. You have enough headroom to display HDR image and you write the same pixels into a canvas - think they should display the same.
  • KN: concern about deferring everything to display compositor is: we do have to do something with content which pulls values that are HDR, but we don't want to display as HDR. Some definition of what we do in that case. If we say HDR content is SDR, isn't it undefined?
  • CC: it should color match with other SDR content. If luminance is greater -...
  • KN: if we have a float16 buffer containing (10, 0, 0), and HDR's off - what do you see? And can we guarantee how that color matches with other SDR content?
  • CC: no, think anything outside SDR range won't color match with SDR.
  • MM: agree with CC. Think that out of range content should be transformed however it needs to be.
  • KN: sounds fine. We already have premultipliedAlpha where if A < (R, G, B) then the color's illegal.
  • RC: on Windows the OS compositor never tonemaps for you. They think tonemapping's an art and the app needs to be responsible for it. Also it's common for some first-party apps, they have some WebGL thing with one colorspace/gamma, and 2D canvas on top, then HTML on top. Don't know what we should say in the spec, but that should show up and look good in all browsers.
  • CC: maybe a language difference between MM and RC; this mode's "extended", if the monitor can display it, it will. No guarantees if it's outside the monitor's caps. Tone mapping is clamping the value into a reasonable range for the monitor.
  • MM: basically agree
  • KG: I commented on the issue a few weeks ago - my comment still stands. Ambiguity between - trying to mix HDR and pushing outside the nominal gamut of colorspace. E.g. brighter vs. deeper red.
  • CC: with this ability, there should be a safe range you're known to be in. Is dynamic range standard or hi? Maybe you can query some quantized version of what the monitor can represent. Then you have a safe zone you can work in. And we have a media query for gamut.
  • KG: specific concern: display larger than either of two color spaces you're trying to match. E.g. sRGB canvas and P3 canvas, on monitor that's ??? P3. 10x / 2X HDR. Take those two things, try to make "P3 red" in sRGB - get >1 numbers. When you display that color - unclear if that …
  • CC: think I understand the Q. If you specify…
  • KN: same discussion as on Github issue.
  • KG: my example still stands.
  • KN: think no ambiguity.
  • CC: if I'm HDR buffer, I specify sRGB, and have some values lining up with P3 red - I think that should always match P3 red regardless of what your monitor can display. Regardless of how things are clamped. Because that's an SDR color- it behaves the same as all SDR colors, even if you requested extended range. Extended range won't get you that extra gamut unless you already had it.
  • KN: reason for non-ambiguity - the negative components.
  • KG: you can choose colors that don't have negative components.
  • CC: don't think it's obvious, but - if you have values >1 in sRGB, e.g. (1.001, 0, 0), it's still in the P3 gamut - still an SDR color.
  • KN: color.io converter breaks down in some cases.
  • CW: in interest of time, move on to Compat? Or wrap up with next steps?
  • KG: next step, me, Kai and Chris, and anyone else interested, should discuss.
  • MM: would like to be invited.
  • KG will set up this meeting.
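
Following up on one point from the minutes above (CC's example that sRGB (1.001, 0, 0) is still inside the P3 gamut), a quick check, again with colorjs.io:

```ts
// Quick check of the point above: a slightly-out-of-range sRGB value can
// still be inside the P3 gamut, i.e. still an SDR color on a P3 display.
import Color from "colorjs.io";

const c = new Color("srgb", [1.001, 0, 0]);
console.log(c.inGamut("srgb")); // false - just outside sRGB
console.log(c.inGamut("p3"));   // true  - still inside P3
```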

@ccameron-chromium
Contributor Author

I've put together this page which might help visualize the exact behavior:
https://ccameron-chromium.github.io/webgl-examples/gamut.html

Hopefully I can give a walk-through of this at an upcoming meeting.

@louderspace

I'm an employee at Adobe, and we are developing an upcoming product that will provide web-hosted video editing and compositing features. Modern video supports a wide gamut of colors, and HDR is supported in almost all modern televisions and computer screens. Phones and DSLR cameras are capable of capturing HDR content, and users who produce video content require the ability to preview and edit those videos with the color fidelity intact. Within our WebCodecs and WebGPU dependencies, we expect HDR pixels to flow through our processing pipeline. As we build web-hosted video applications, it is imperative that the user can visualize these colors, especially when performing color correction.

I can preview stunningly beautiful HDR videos on YouTube on my MacBook Pro in both Chrome and Safari: https://www.youtube.com/watch?v=tO01J-M3g0U

From Apple: Devices that combine wide-gamut color with 10-bit HDR capability can produce vibrant hues with nuanced shading, lending more realism and immediacy to an image, revealing more detail, and reducing artifacts like gradient “banding” (often seen in images of the sky as it transitions from lighter to darker areas.)

@ounterecker

Speaking as a lead at Adobe: HDR support in WebGPU is going to be crucial for Photoshop on the web and next-gen web apps. Unclipped sRGB linear works great for us, and we're in full support of the proposed HDR APIs.

This will help us close the gap to native apps in color, which is highly desired.

@adixon-adobe

I'm a lead on Lightroom Web at Adobe. What @ounterecker mentioned for Photoshop is also true for Lightroom. We need exactly the same support.
