Re: Oolite on HDR Displays

Posted: Sat Aug 24, 2024 9:56 pm
by Cholmondely
another_commander wrote: Mon Oct 23, 2023 5:11 am
I guess both the readme distributed with the game and the wiki are good places to have this information available.
How would you suggest adding it to the wiki?

A "HDR" page. Or a displays page?


Re: Oolite on HDR Displays

Posted: Sun Aug 25, 2024 7:21 am
by another_commander
A page entitled HDR would be fine I guess

Re: Oolite on HDR Displays

Posted: Sun Aug 25, 2024 2:56 pm
by Cholmondely
another_commander wrote: Sun Aug 25, 2024 7:21 am
A page entitled HDR would be fine I guess
Here's a feeble attempt at something I'm ignorant of. I'm presuming that a screenshot is pointless. Please edit it to death!

Re: Oolite on HDR Displays

Posted: Mon Aug 26, 2024 6:15 am
by another_commander
[EliteWiki] HDR wiki page is now up.

Re: Oolite on HDR Displays

Posted: Wed Aug 28, 2024 7:54 am
by another_commander
I have implemented support for saving HDR snapshots in EXR format. This is much better than Radiance HDR; it supports expanded gamut colors, for starters. It is also more widespread in the graphics industry, meaning that there are more apps available that can open it out of the box. Although slower to encode than Radiance, in practice I did not see a measurable difference when saving an HDR screenshot. The only drawback compared to Radiance that I can see is size, as EXRs are massive (11.8MB for a 1920x1080 screen vs 4.5MB with Radiance). But at the very least, we now have the option to choose our format.

Right now choosing EXR or Radiance happens at compile time, but probably this could be turned into a .GNUstepDefaults option in the future. We have test binaries available on github ( https://github.com/OoliteProject/oolite ... 0592041935 ) if you want to have a go at it. The code is at the EXRSnapshots branch.

The implementation uses the tinyexr (written in C++) and miniz libraries and has been designed to accommodate Linux in the future. If/when we have HDR support on Linux, adding the same feature there should be just a trivial manipulation of the GNUmakefile. As it stands right now, Linux binaries should see no change other than an info message in stderr that the feature is pending implementation, in case anyone tries to call the SaveEXRSnapshot function in the code.
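The graceful-failure behavior described above can be sketched as a tiny standalone function. The name SaveEXRSnapshot comes from the post, but the guard flag, signature and body here are purely illustrative; the real code in Oolite's source will differ:

```cpp
#include <cstdio>

// Hypothetical sketch only: kLinuxHDRReady stands in for whatever build-time
// switch the GNUmakefile would flip once Linux gains HDR output.
constexpr bool kLinuxHDRReady = false;

bool SaveEXRSnapshot(const char* path)
{
    if (!kLinuxHDRReady)
    {
        // Feature pending: log an info message to stderr and fail gracefully,
        // as described in the post for current Linux binaries.
        std::fprintf(stderr, "EXR snapshot support is pending implementation on this platform.\n");
        return false;
    }
    return path != nullptr;  // real code would hand the framebuffer to tinyexr here
}
```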

Re: Oolite on HDR Displays

Posted: Thu Aug 29, 2024 10:26 am
by another_commander
HDR snapshot file format is now a user preference in the EXRSnapshots branch. Default is EXR, but you can set the .GNUstepDefaults key "hdr-snapshot-format" to make it save Radiance .hdr snapshots like this:

Code: Select all

"hdr-snapshot-format" = .hdr;
The leading dot is optional; Oolite will understand the extension with or without it. If the key is missing, we default to .exr. If the user tries to be clever and enters any random string rather than the allowed "[.]hdr" and "[.]exr", a warning will be posted in the log and .exr will be selected by default anyway.
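The parsing rules just described can be sketched as a small standalone helper (the function name is mine, not Oolite's; the real game also logs the warning mentioned above):

```cpp
#include <string>

// Normalise the "hdr-snapshot-format" preference value as described in the
// post: leading dot optional, unknown values fall back to ".exr".
std::string NormalizeSnapshotFormat(std::string value)
{
    if (!value.empty() && value[0] == '.') value.erase(0, 1);  // strip optional dot
    if (value == "hdr") return ".hdr";
    if (value == "exr") return ".exr";
    return ".exr";  // unrecognised string: the game would warn in the log here
}
```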

We are very close to a pull request with this, I believe.

Re: Oolite on HDR Displays

Posted: Mon Sep 02, 2024 2:43 pm
by another_commander
As it turns out, the excessive size of EXRs I was complaining about earlier was due to me not having activated image compression during my tests. With just one little line of code, added in commit 181c4f3, EXR sizes are now very reasonable and directly comparable to Radiance .hdr ones. Depending on the image content, a 1920x1080 snapshot can range from about 3MB to 6.5MB instead of 12MB.

So defaulting to EXR as our snapshot format for HDR is definitely the way forward.

Re: Oolite on HDR Displays

Posted: Sun Sep 08, 2024 9:23 pm
by Simba
I would like to report a few bugs:

1) I tried to set the maximum brightness value to less than 400 nits (in my example it is 310 nits), having previously edited "descriptions.plist". But when I select my value, nothing happens, and after exiting the settings menu and returning back, it turns out that the value is set to 400 nits. In the end, I was able to set the desired value only by editing ".GNUstepDefaults".

2) For some reason, changing the paper white brightness distorts the colors of the original SDR content. An example with XenonUI in the screenshots (see the gradients near the block headers). The value without distortion is 200 nits.
[Screenshots: SDR, then paper white at 200, 170, 120, 100 and 80 nits]

If you increase the brightness, for some reason the background grid appears:
[Screenshots: paper white at 210, 220 and 280 nits]

Re: Oolite on HDR Displays

Posted: Mon Sep 09, 2024 6:45 am
by another_commander
Thanks for the post, much appreciate the time taken to test and provide feedback.

Regarding point 1, I have been able to reproduce the issue and there is indeed a small problem when the user tries to manually add a max brightness value which is less than the minimum expected by the game, which is 400 nits. There are two ways to resolve this:
  • Let the player apply any value they set in descriptions.plist. This would be the more transparent way and probably what most players would expect to see. If you set your value at 310 nits, you expect to see that appearing in the max brightness slider.
  • Respect the limits set by the game. Currently, for the maximum brightness setting those are 400 nits for low and 1000 nits for high. Setting max brightness to a higher or lower value should be rejected by the game and ignored.
I am leaning towards the second one. The problem with the first is that it would allow any value and that could lead to a completely messed up image, because there would be nothing stopping the player from e.g. setting a paper white value higher than the manually defined maximum brightness. Error checking these edge cases can be messy. So I think that I will change it so that your 310 nits value will be ignored and the game will stick to 400 even if you add 310 in descriptions.plist.
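Option 2 above amounts to a simple clamp. A standalone sketch (helper name is mine; the 400–1000 nits limits are the ones stated in the post):

```cpp
#include <algorithm>

// Clamp a user-supplied max brightness to the game's accepted range,
// as described for option 2: out-of-range values are effectively ignored.
float ClampMaxBrightness(float requestedNits)
{
    constexpr float kMinNits = 400.0f;   // current lower limit (VESA DisplayHDR 400)
    constexpr float kMaxNits = 1000.0f;  // current upper limit
    return std::min(std::max(requestedNits, kMinNits), kMaxNits);
}
```

So a 310 nits entry in descriptions.plist would come back as 400, matching the behavior proposed above.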

For anyone interested, by leaving the max brightness at 400 nits you actually get the game to tone map to around 320 nits. I know, it doesn't make sense, but 400 nits is the minimum VESA standard for HDR and my own laptop can output a max of around 350 nits, so I had to test with what I actually have. That's why currently it is set so that the 400 selection will actually tone map to whatever my laptop can achieve (which, coincidentally, is what you want Simba). In the future I can probably adjust it a bit so that max brightness 400 actually outputs 400 nits and maybe add 350 nits as an extra selection.


Regarding point 2, maybe I am missing what you mean, but what you describe is expected behavior. Paper white is just a setting that tells Oolite how many nits an object with rgb color (1.0, 1.0, 1.0) emits. So if you set paper white to 140 nits, rgb (1,1,1) will emit 140 nits, and if you set it to 280 nits, it will emit 280, leaving less headroom up to 310 or 400 or whatever for highlights. If you set paper white equal to your maximum brightness, then all you are achieving is a very bright SDR image. So you can consider it more or less a generic overall brightness slider: changing its value will dim or brighten on-screen colors without exceeding max brightness. The background grid you mention is expected to appear at high paper white values; it is information already present in the background image, and brightening that image up just makes the grid more easily visible. I don't think there is any problem there. If I am missing something, please get back to me with some more details.
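The paper-white relationship described above is linear, which a one-line sketch makes concrete (the helper is illustrative, not Oolite code):

```cpp
// Paper white is the output level, in nits, of rgb(1,1,1); SDR-range channel
// values scale linearly with it, while values above 1.0 use the remaining
// headroom up to max brightness for highlights.
float ChannelToNits(float linearChannel, float paperWhiteNits)
{
    return linearChannel * paperWhiteNits;  // rgb(1,1,1) emits exactly paperWhiteNits
}
```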

Re: Oolite on HDR Displays

Posted: Sat Sep 14, 2024 9:15 pm
by Simba
Sorry for not responding right away. I forgot to mention in my message that I need to increase the gamma to see the difference better. I am attaching screenshots in which I have increased the exposure beforehand. In SDR the shades are very similar. However, in HDR the difference between shades seems to increase, and sometimes they blend into one.

SDR (I highlighted the area in red where to look)
[Screenshot]

HDR with different paper white values
[Five screenshots]

Perhaps the distortions shown are the result of excessive saturation. I also consider the increased saturation a problem, because the original textures were not designed for such distortions. For example, Lave in HDR takes on overly acidic colors.

[SDR and HDR screenshots of Lave]

If I understand correctly from the messages in the thread, then by increasing saturation you are trying to cover wider color spaces. It might make sense to add an additional setting for this.

Re: Oolite on HDR Displays

Posted: Sat Sep 14, 2024 10:13 pm
by another_commander
Which Lave OXP is this? I'd like to test it myself.

Also, just to confirm whether the gamut expansion is indeed responsible for the deep-fried colors or not, try editing the shader <OoliteInstallDir>/oolite.app/Resources/Shaders/oolite-final-hdr.fragment, line 747

Code: Select all

// gamut expansion here
result = expandGamut(result, uSaturation * 1.0f / pow(2.0f, uSaturation));
and comment out the result = ... line, then run the game in HDR again.

You can also try setting the color saturation to a value lower than 1.0. The easiest way to do it is with the debug console and a test release build; execute the command setColorSaturation(0.75) or whatever value you prefer between 0.0 (black and white) and 2.0 (silly saturation, not recommended), with 1.0 being the default.

The gradient color differences in the GUI screens are probably due to the different tone mappers used between SDR and HDR.

Re: Oolite on HDR Displays

Posted: Sat Sep 14, 2024 11:52 pm
by another_commander
OK, I've been able to reproduce the deep-fried colors with the Mars textures. It looks like we are oversaturating by default and that the gamut expansion does not have much to do with it, at least in the 400 nits max brightness case.

This is the default SDR image:
[Screenshot]

This is HDR with the default saturation. It's too much:
[Screenshot]

And this is the same in HDR with saturation down to 0.75:
[Screenshot]

Looks like we may have to bring down the default saturation levels in HDR. Can you please try this on Lave to confirm it gets rid of the acidic colors? If you don't want to bother with the debug console, just stick this in a file called script.js and save it in your AddOns/Config folder:

Code: Select all

this.shipWillLaunchFromStation = function()
{
	setColorSaturation(0.75);
}
The new saturation will be applied as soon as you launch.

Re: Oolite on HDR Displays

Posted: Mon Sep 16, 2024 8:58 pm
by Simba
another_commander wrote: Sat Sep 14, 2024 11:52 pm

Code: Select all

this.shipWillLaunchFromStation = function()
{
	setColorSaturation(0.75);
}
Yes, it has become better. But the SDR->HDR conversion is still too pronounced. By the way, I was able to reproduce this using the AdvancedAutoHDR shader in ReShade: the input space is sRGB, and the output is scRGB. But the only way to keep the colors close to the original is to also specify scRGB as the input space. This suggests that there is no need to do any additional conversion inside the game.
another_commander wrote: Sat Sep 14, 2024 10:13 pm
Which Lave OXP is this? I'd like to test it myself.
Yes

Re: Oolite on HDR Displays

Posted: Tue Sep 17, 2024 7:00 am
by another_commander
Please test build 15279da and let us know if it feels acceptable. I think it's the best compromise I could achieve between keeping the look of the game as close as possible to the SDR version and still allowing highly saturated colors to enter extended color spaces. Keep in mind that HDR will always have to look somewhat different from SDR, simply because of the different tone mappers used for each flavor. They are both ACES, but the HDR one is an approximation, especially at the lower brightness settings.

The oolite-final-hdr.fragment shader now has two constants near the top which can be toyed with if you feel like testing further.

Code: Select all

#define SATURATION_ADJUSTMENT	0.825 //adjustment factor to better approach SDR saturation
#define GAMUT_EXPANSION_AMOUNT	0.2 //0.0f .. 1.0f - best results if this is kept low
The first is just an internal saturation multiplier. You can still change saturation externally from scripts using the 0.0 to 2.0 range as before; it's just that a value of e.g. 1.0 now corresponds to a lower saturation than it did (it can still reach excessive values though; for me anything above 1.3 looks kind of extreme).
The second is the amount of desired gamut expansion. Normally, if you change SATURATION_ADJUSTMENT you will want to modify this a bit too. During testing I found that increasing the saturation adjustment had to be compensated by a decrease of the expansion amount to avoid frying the result. But this is not a rule, just an observation.
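As I read the description above, the internal multiplier simply scales whatever saturation a script requests before tone mapping; a hedged standalone sketch of that interaction (the struct and helper are my own illustration, not the shader code):

```cpp
// Mirrors the two shader constants described in the post; how the real
// shader combines them with uSaturation may differ from this sketch.
struct HDRSaturationParams
{
    float saturationAdjustment = 0.825f;  // SATURATION_ADJUSTMENT
    float gamutExpansionAmount = 0.2f;    // GAMUT_EXPANSION_AMOUNT (0.0 .. 1.0)
};

// A script-side setColorSaturation(1.0) now lands below the old effective
// saturation, because the internal multiplier scales it down first.
float EffectiveSaturation(float scriptSaturation, const HDRSaturationParams& p)
{
    return scriptSaturation * p.saturationAdjustment;
}
```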

It would be great if both of those were in-game options, but unfortunately we are out of space in the Game Options screen and I don't feel like redesigning the GUI right now. We'll have to stick to modifying shader source for experimentation.

Anyway, let me know what you think.

Re: Oolite on HDR Displays

Posted: Wed Sep 18, 2024 8:32 pm
by Simba
I found out the reason for the color distortion: the game was not working in true HDR mode. If you switch to windowed mode and change the SDR content brightness slider in the Windows settings, it affects the image in the game window. Fortunately, I was able to force true HDR mode by installing ReShade with the AutoHDR plugin.
another_commander wrote: Tue Sep 17, 2024 7:00 am
Please test build 15279da and let us know if this would feel acceptable.
Yes, the image looks good now
another_commander wrote: Mon Sep 09, 2024 6:45 am
Respect the limits set by the game. Currently, for the maximum brightness setting those are 400 nits for low and 1000 nits for high. Setting max brightness to a higher or lower value should be rejected by the game and ignored.

I am leaning towards the second one. The problem with the first is that it would allow any value and that could lead to a completely messed up image, because there would be nothing stopping the player from e.g. setting a paper white value higher than the manually defined maximum brightness. Error checking these edge cases can be messy. So I think that I will change it so that your 310 nits value will be ignored and the game will stick to 400 even if you add 310 in descriptions.plist.
I think it would be better to allow players to set the values they are interested in, at their own risk, by editing ".GNUstepDefaults". For example, this would allow more accurate values to be set even for VESA-certified monitors (actual values always differ from declared values). You could actually go further and implement reading LUTs from files; I saw this in No Man's Sky.