Oolite on HDR Displays

News and discussion of the PC port of Oolite.

Moderators: winston, another_commander

Simba
Above Average
Posts: 18
Joined: Fri Aug 30, 2024 7:58 pm

Re: Oolite on HDR Displays

Post by Simba »

[Image: visualization of sRGB, DCI-P3 and BT.2020 gamuts on the RGB color model]

I think the approach to implementing HDR is wrong. For a visual demonstration, I have attached a visualization of color spaces on the RGB color model. It shows that, as a rule, narrower spaces are subsets of wider ones. That is, when correctly converting a bitmap image from, say, sRGB to DCI-P3, the colors should not become more saturated. On the contrary, they should stay at the same positions inside the sRGB triangle; if the conversion accounts for output on a DCI-P3 display, you should see no difference at all. In the context of a game, this means that by default any object with sRGB textures should look the same in wider color spaces.
Then why do we need a wider color space at all? First, you can use textures originally authored for DCI-P3. Second, certain visual effects can strongly distort colors (for example, the different spectrum of a star, shield effects, laser hits, and so on).
I don't know exactly how the renderer works, but I assume it first calculates colors in the coordinates of the color model, and only before output converts them to an 8/10/12-bit format for the monitor according to the target color space, or perhaps even based on the gamut coverage information from the EDID.
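The "colors stay in the same positions" idea can be made concrete with a gamut conversion sketch (not Oolite's code; the matrix is the standard linear sRGB(D65) to linear Display-P3(D65) transform). An sRGB color lands *inside* the P3 gamut with smaller coordinate values, so it looks the same rather than more saturated:

```python
# Sketch: converting linear-light sRGB to linear-light Display-P3 without
# changing the perceived color. Standard sRGB -> Display-P3 linear matrix;
# both spaces share the D65 white point.
import numpy as np

SRGB_TO_P3 = np.array([
    [0.8224621, 0.1775380, 0.0000000],
    [0.0331941, 0.9668058, 0.0000000],
    [0.0170827, 0.0723974, 0.9105199],
])

def srgb_linear_to_p3_linear(rgb):
    """Map a linear-light sRGB triplet into Display-P3 coordinates."""
    return SRGB_TO_P3 @ np.asarray(rgb, dtype=float)

# Pure sRGB red is still the same red, but its P3 coordinates are pulled
# inward (~[0.82, 0.03, 0.02]) because P3's red primary is more saturated.
red_p3 = srgb_linear_to_p3_linear([1.0, 0.0, 0.0])

# White maps to white: each matrix row sums to ~1, so neutrals are preserved.
white_p3 = srgb_linear_to_p3_linear([1.0, 1.0, 1.0])
```

All converted values remain inside [0, 1], which is exactly the "no extra saturation" behavior described above.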
another_commander
Quite Grand Sub-Admiral
Posts: 6809
Joined: Wed Feb 28, 2007 7:54 am

Re: Oolite on HDR Displays

Post by another_commander »

Simba wrote: Fri May 16, 2025 8:35 pm
Image.
Then why do we need a wider color space? First, you can use textures originally made for DCI-P3. Secondly, certain visual effects can strongly distort colors (for example, a different spectrum of a Star, effects from shields, when hit by lasers, etc.).
What you are describing is what happens with Oolite. We don't use DCI-P3 textures, of course - games that use such textures can be counted on one hand - but under default color saturation conditions, colors in Oolite are in the sRGB color space, and only effects such as lasers or hyperspace jumps will push some of them into DCI-P3 or BT.2020 territory. These extended color spaces are normally entered in Oolite only as a result of lighting shader maths. The game, as far as I can tell on my computer, looks the same between SDR and HDR saturation-wise, and I have comparison shots between SDR and HDR to show for it. Now, if you raise the in-game color saturation using the SetColorSaturation(0.0 .. 2.0) console command, then more colors become so saturated that sRGB can no longer contain them, and they start spilling into DCI-P3 and BT.2020. You can test that by running Lilium's ReShade HDR analysis shaders on the game, or by taking a screenshot and viewing the result in SKIV. If you want, I can show you my results, because I have already tested this.
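A sketch of why boosting saturation spills colors out of sRGB (the function and constants here are illustrative, not Oolite's actual shader code): a common saturation control lerps each color away from its luminance, and at high factors some channels leave the [0, 1] range, i.e. leave the sRGB gamut:

```python
# Hypothetical saturation adjustment of the kind a command like
# SetColorSaturation might apply: scale the distance from luminance.
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])  # BT.709/sRGB luma weights

def adjust_saturation(rgb, amount):
    """amount=1.0 leaves the color alone, 0.0 is grayscale, >1.0 oversaturates."""
    rgb = np.asarray(rgb, dtype=float)
    luma = float(REC709_LUMA @ rgb)
    return luma + amount * (rgb - luma)

color = np.array([0.9, 0.2, 0.1])        # an in-gamut sRGB color
boosted = adjust_saturation(color, 2.0)  # saturation at the maximum setting

# At amount=2.0 the blue channel goes negative and red exceeds 1.0: the
# color no longer fits in sRGB and needs a wider gamut to be displayed.
out_of_gamut = bool((boosted < 0.0).any() or (boosted > 1.0).any())
```

Neutral grays are unaffected by the adjustment, since they already sit at their own luminance.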
I don't know exactly how the renderer works, but I assume it first calculates colors in the coordinates of the color model, and only before output converts them to an 8/10/12-bit format for the monitor according to the target color space, or perhaps even based on the gamut coverage information from the EDID.
Actually, it delivers an RGBA16F scRGB linear image to the graphics driver, which internally performs the conversion to the 10-bit BT.2020 color space for display output on an HDR screen. No color-model coordinate conversions occur anywhere, because scRGB and sRGB use the same color coordinates (color primaries, to be more precise); scRGB is just bigger and accepts negative values (which, by the way, are the values that correspond to DCI-P3 and BT.2020 colors). Most games either do this or render PQ 10-bit RGB10A2 output directly, with color-space conversions as required, the latter being the more common case.
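To illustrate the "negative values correspond to DCI-P3/BT.2020" point: scRGB uses the sRGB primaries, so any color outside the sRGB gamut must have at least one negative component. The matrix below is the standard linear Display-P3 to linear sRGB transform (a generic sketch, not Oolite's code):

```python
# Sketch: the pure Display-P3 red primary expressed in scRGB coordinates.
# Because scRGB shares sRGB's primaries, a P3-only color is representable
# in a float buffer (RGBA16F) but not in a clamped 8-bit sRGB one.
import numpy as np

P3_TO_SRGB = np.array([
    [ 1.2249401, -0.2249404,  0.0000000],
    [-0.0420569,  1.0420571,  0.0000000],
    [-0.0196376, -0.0786361,  1.0982735],
])

# Display-P3 pure red in scRGB: ~[1.22, -0.04, -0.02]. R exceeds 1.0 and
# G, B go negative -- this is exactly what "spilling into DCI-P3" means
# for an scRGB framebuffer.
p3_red_in_scrgb = P3_TO_SRGB @ np.array([1.0, 0.0, 0.0])
```

The driver can later remap such values losslessly into BT.2020 coordinates, where they become ordinary positive numbers again.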

Also, Oolite deliberately avoids referencing the EDID. Experience shows that a lot of the time it is completely off and contains bogus info, so it is best not to use it until accurate information becomes the rule.
another_commander
Quite Grand Sub-Admiral
Posts: 6809
Joined: Wed Feb 28, 2007 7:54 am

Re: Oolite on HDR Displays

Post by another_commander »

OK, now you've made me go back and have a good look at it again. So here is what is happening with color saturation and HDR vs SDR in Oolite.

I downloaded a screen test pattern image to use as the basis for color accuracy tests. This will be our reference test image:

[Image: reference test pattern]


I then ran a mission screen with this image as the background in SDR. This is the result:
[Image: SDR mission screen result]

As you can see, the accuracy of the image reproduction in SDR is not great. There are noticeable differences, all of them justifiable and due to the ACES tone mapper we use, but maybe also due to some sRGB/gamma 2.2 mismatch, which is beyond the scope of this test to explain here. So anyway, we know that the SDR image is lower in saturation than the reference, and a bit darker too.
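The darkening near white is inherent to ACES-style tone mapping. As a sketch, here is the widely used Narkowicz ACES fit (an approximation; not necessarily the implementation Oolite ships), which compresses values below the identity as they approach full brightness:

```python
# Sketch: Krzysztof Narkowicz's single-channel ACES filmic approximation.
# Near white the curve sits below y = x, so a tone-mapped SDR image ends
# up somewhat darker than the untouched reference.
def aces_fit(x):
    """ACES filmic fit for one linear channel, clamped to [0, 1]."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)

# A full-brightness channel no longer reaches 1.0 after tone mapping
# (roughly 0.80), one source of the "a bit darker" SDR result above.
mapped_white = aces_fit(1.0)
```

Because each channel is compressed independently, bright saturated colors also lose some saturation, which matches the lower-saturation SDR observation.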

Next, with Oolite's current HDR saturation settings, I ran the same mission screen in HDR. The result, which should be viewed in HDR on a site that can interpret HDR PNGs, such as https://upscale-compare.lebaux.co/, or in an HDR-PNG-capable viewer like SKIV, has been uploaded here: https://drive.google.com/file/d/1m_AMnJ ... sp=sharing
You will find that the result is much, much closer to the original reference image. There are some slight differences in brightness, but the color saturation is pretty much spot on. However, because the SDR in-game image is not very accurate, this HDR test image looks like it's off when you compare it to the SDR one, when in reality it is the more correct of the two.

So what can we do about it? I am thinking that if we sacrifice some color saturation accuracy, we could bring the HDR image closer to the SDR one (because at the end of the day, what we have learned to see in the game are SDR tone-mapped images). To achieve this, we can adjust the SATURATION_ADJUSTMENT constant in the oolite-final-hdr.fragment shader a bit, from its current (and more correct) value of 0.825 down to 0.725. This constant is used to adjust color saturation in HDR to compensate for the different tone mappers used by the SDR and HDR shaders - both are ACES, but different implementations.
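A sketch of what lowering such a constant does (the lerp-toward-luminance form is a generic saturation technique and an assumption here; the shader's actual math may differ): a smaller factor pulls every color a bit further toward its grayscale value, i.e. slightly desaturates the HDR output toward the SDR look.

```python
# Sketch: effect of changing a saturation factor from 0.825 to 0.725.
# Saturation here is modeled as a lerp between a color and its luminance.
import numpy as np

LUMA = np.array([0.2126, 0.7152, 0.0722])  # BT.709/sRGB luma weights

def saturate(rgb, factor):
    """factor=1.0 keeps the color, smaller values move it toward gray."""
    rgb = np.asarray(rgb, dtype=float)
    luma = float(LUMA @ rgb)
    return luma + factor * (rgb - luma)

color = np.array([0.8, 0.3, 0.2])
old = saturate(color, 0.825)   # current constant
new = saturate(color, 0.725)   # proposed constant: slightly less saturated

# Distance from the gray axis measures saturation; the 0.725 result
# sits closer to gray than the 0.825 one, with luminance unchanged.
gray = float(LUMA @ color)
dist_old = float(np.linalg.norm(old - gray))
dist_new = float(np.linalg.norm(new - gray))
```

The luminance-preserving form means only chroma changes, so brightness comparisons between the two settings stay valid.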

With SATURATION_ADJUSTMENT lowered just a little bit to 0.725, the result is now this: https://drive.google.com/file/d/1Pe8Xef ... sp=sharing

This is closer to the SDR image generated in-game - still not quite the same, though. However, that is when we compare the test images; in-game, the differences in rendered scenes are practically absent, and I think the new 0.725 value should maybe become the new default. Effects like lasers and hyperjumps still spill into DCI-P3/BT.2020, which is good in my book.

Simba, please have a look and let me know your opinion. I'd be interested to hear your thoughts.
Simba
Above Average
Posts: 18
Joined: Fri Aug 30, 2024 7:58 pm

Re: Oolite on HDR Displays

Post by Simba »

Thanks, now I get the point.
Then the question arises: is it not possible to compensate for the low saturation in SDR? Then the SDR and HDR output images would both be as close as possible to the original sRGB image.