fix: query GL_MAX_SAMPLES before setting multisampling hint #8087

Open · wants to merge 1 commit into develop

Conversation

lodicolo (Contributor)

  • Adds better-located exceptions to catch failures like "couldn't find matching glx visual"
  • Creates and disposes of a temporary GL-enabled window and GL context to query GL_MAX_SAMPLES

This specifically fixes cases where MultiSampleCount is set to something like 16 in a Desktop GL build that is then run on weaker hardware, such as a Steam Deck, where GL_MAX_SAMPLES is 8.

Games do not have an API to query this information and set a valid value for MultiSampleCount, so the error is effectively unrecoverable and extremely hard to diagnose.

The exceptions currently thrown are only symptoms of the issue; they don't expose the actual failure, which in this case had to be retrieved via SDL_GetError().

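For reference, here is a rough, self-contained sketch of the kind of probe described above: create a hidden SDL window and GL context, read GL_MAX_SAMPLES, then tear both down before the real window is created. The P/Invoke declarations, library names ("SDL2", "opengl32"), and the `MaxSamplesProbe` type are illustrative assumptions for this comment only; the actual PR goes through MonoGame's vendored SDL/GL bindings.

```csharp
using System;
using System.Runtime.InteropServices;

internal static class MaxSamplesProbe
{
    private const uint SDL_INIT_VIDEO = 0x00000020;
    private const uint SDL_WINDOW_OPENGL = 0x00000002;
    private const uint SDL_WINDOW_HIDDEN = 0x00000008;
    private const int GL_MAX_SAMPLES = 0x8D57;

    [DllImport("SDL2")] private static extern int SDL_Init(uint flags);
    [DllImport("SDL2")] private static extern IntPtr SDL_CreateWindow(
        string title, int x, int y, int w, int h, uint flags);
    [DllImport("SDL2")] private static extern IntPtr SDL_GL_CreateContext(IntPtr window);
    [DllImport("SDL2")] private static extern void SDL_GL_DeleteContext(IntPtr context);
    [DllImport("SDL2")] private static extern void SDL_DestroyWindow(IntPtr window);
    [DllImport("SDL2")] private static extern IntPtr SDL_GetError();

    // glGetIntegerv is a GL 1.1 entry point, so it can be imported directly on
    // Windows; other platforms would load it from libGL instead.
    [DllImport("opengl32", EntryPoint = "glGetIntegerv")]
    private static extern void GlGetIntegerv(int pname, out int data);

    private static string LastSdlError() =>
        Marshal.PtrToStringAnsi(SDL_GetError()) ?? "unknown SDL error";

    // Returns the driver-reported GL_MAX_SAMPLES, surfacing SDL_GetError() on failure.
    public static int Query()
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            throw new InvalidOperationException("SDL_Init failed: " + LastSdlError());

        // Temporary, hidden, GL-enabled window; destroyed before the real window exists.
        var window = SDL_CreateWindow("probe", 0, 0, 1, 1,
            SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
        if (window == IntPtr.Zero)
            throw new InvalidOperationException("SDL_CreateWindow failed: " + LastSdlError());

        var context = IntPtr.Zero;
        try
        {
            // This is where failures like "couldn't find matching glx visual" surface.
            context = SDL_GL_CreateContext(window);
            if (context == IntPtr.Zero)
                throw new InvalidOperationException(
                    "SDL_GL_CreateContext failed: " + LastSdlError());

            GlGetIntegerv(GL_MAX_SAMPLES, out var maxSamples);
            return maxSamples;
        }
        finally
        {
            if (context != IntPtr.Zero)
                SDL_GL_DeleteContext(context);
            SDL_DestroyWindow(window);
        }
    }
}
```

The important detail is surfacing SDL_GetError() at the point of failure, which is what makes errors like the missing GLX visual diagnosable.
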
@mrhelmut (Contributor)

You should be able to query GraphicsCapabilities.MaxMultiSampleCount in Game.Initialize() and apply graphics device parameters from there. Does that work?
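
For illustration, a minimal sketch of that suggested flow might look like the following, assuming the capability were readable from game code (GraphicsCapabilities is internal today, as noted in the reply below) and that re-applying the presentation parameters from Initialize() takes effect via ApplyChanges(). `QueryMaxSamples()` is a hypothetical stand-in for MaxMultiSampleCount or for a probe like the one sketched above, and the hard-coded 8 is only a placeholder.

```csharp
using System;
using Microsoft.Xna.Framework;

public class SampleGame : Game
{
    private readonly GraphicsDeviceManager _graphics;

    public SampleGame()
    {
        _graphics = new GraphicsDeviceManager(this)
        {
            PreferMultiSampling = true,
        };
    }

    protected override void Initialize()
    {
        base.Initialize();

        // Clamp the requested count (e.g. 16) to whatever the hardware reports,
        // then re-apply the device settings.
        int maxSamples = QueryMaxSamples();
        GraphicsDevice.PresentationParameters.MultiSampleCount =
            Math.Min(16, maxSamples);
        _graphics.ApplyChanges();
    }

    // Hypothetical accessor; no public API exposes this value today.
    private static int QueryMaxSamples() => 8;
}
```
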

DesktopGL comes with some limitations compared to WindowsDX, in that some graphics capabilities can only be queried after the Game constructor due to how SDL operates under the hood. I'm a bit on the fence about creating a fake window just to query that; I feel like it might introduce more issues in the long run than fixing this use case is worth.

@lodicolo (Contributor, Author)

GraphicsCapabilities and GraphicsCapabilities.MaxMultiSampleCount are internal, so it would require some code changes to expose this.

It also has to create a temporary window no matter what: GL can't be queried without an active context, and the way MonoGame consumes SDL and GL means an SDL window is required to get one.

I think creating the full graphics device chain in managed memory generates a lot more garbage and introduces more opportunities for bugs than quickly creating and destroying an SDL window in a controlled manner before creating the real window. This is also, to my understanding, standard practice when using SDL (and GL).

I also personally have concerns with the approach of exposing this to end users and expecting them to deal with it, because:

  • The MaxSamples value is not always reliable: it reports 32 on my desktop when in reality it only supports up to 16 (my Steam Deck reports correctly). Ordinarily I would be against my own fix for this, but the PR is meant as a "best effort" to prevent a crash, and it includes better messaging so the end user doesn't have to dive into dependencies with a debugger to figure out the root cause. I think having a potentially incorrect value as the maximum is acceptable for a best effort, but not acceptable for a public API.
  • Seems weird to expect users to know to poll graphics capabilities and then set the value
  • Correct me if I am wrong, but in DX this can be done in the constructor, so this is also a GL-specific issue. I think it goes against the spirit of MonoGame to expect users to understand the internals of MonoGame and its dependencies in order to properly fix this issue.
