
Default uniform block layout packing does not work in HLSL #3475

Open
mrcino wants to merge 1 commit into main from fix_default_layout_qualifiers
Conversation

@mrcino commented Jan 11, 2024

I'm using glslang to convert HLSL code to SPIR-V, but I found that some of the UBO members have incorrect offsets. I read part of the source code and found that a UBO should use the std140 layout by default, but the default does not seem to take effect.

I made some modifications based on my understanding, and at least the rendering results are now correct.

@CLAassistant commented Jan 11, 2024

CLA assistant check
All committers have signed the CLA.

@mrcino commented Jan 11, 2024

As additional information:
I have a UBO with the following members in HLSL:

{
    float4x4 _1;  // offset 0
    float4   _2;  // offset 64
    float    _3;  // offset 80
    float2   _4;  // <====
}

Under the std140 layout, member _4 must be aligned to an 8-byte boundary, so its offset should be 88; but the rendering result is incorrect.

After debugging with Nsight, I found that this UBO was treated as scalar-aligned, and the actual offset of _4 is 84.
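The offsets above can be reproduced with a short sketch. This is my own illustration (not glslang code) of how std140 packing differs from scalar packing for the UBO in question; the alignment/size values are the standard std140 ones for these types.

```python
# Minimal sketch comparing std140 offsets with scalar-alignment offsets
# for the UBO described above. Not glslang code; for illustration only.
def layout(members, scalar=False):
    """members: list of (name, std140_alignment, size); returns name -> offset."""
    offsets, cursor = {}, 0
    for name, align, size in members:
        a = 4 if scalar else align          # scalar packing aligns to the 4-byte component
        cursor = (cursor + a - 1) // a * a  # round cursor up to the alignment
        offsets[name] = cursor
        cursor += size
    return offsets

# (name, std140 alignment, size) for the members in the HLSL snippet
ubo = [("_1", 16, 64),  # float4x4
       ("_2", 16, 16),  # float4
       ("_3",  4,  4),  # float
       ("_4",  8,  8)]  # float2

print(layout(ubo))               # std140: _4 lands at offset 88
print(layout(ubo, scalar=True))  # scalar: _4 lands at offset 84
```

The std140 run places _4 at 88 (84 rounded up to the 8-byte alignment of a float2), while the scalar run places it at 84, matching what Nsight reported.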

@mrcino commented Jan 25, 2024

Sorry… I think I messed something up…

I looked at a few of the failed test results, and it seems the test cases themselves need to be updated?

Could you please confirm? Thank you.

@arcady-lunarg (Contributor) commented:

I think you need to see whether the changes caused to the tests are reasonable and if so, update them. Are you sure your code is correct though? It seems like the merge was written that way for a specific reason, though I would have to dig into it a bit more to tell you whether it really works right.

@mrcino commented Feb 3, 2024

> I think you need to see whether the changes caused to the tests are reasonable and if so, update them. Are you sure your code is correct though? It seems like the merge was written that way for a specific reason, though I would have to dig into it a bit more to tell you whether it really works right.

Thank you for your comment. I agree that this change is a bit blunt. Let me walk through how I tracked down this issue; I would appreciate any further advice.

I did notice the purpose of this merge operation; it is specifically mentioned in the following code:
[screenshot of the relevant code]
My initial assumption was probably incorrect...

When I saw that the UBO's alignment did not match my expectations, I traced it to HlslParseContext::fixBlockUniformOffsets, the function responsible for adjusting the offsets of UBO members:

[screenshot of HlslParseContext::fixBlockUniformOffsets]

This causes the alignment not to follow std140. I then noticed this code in HlslParseContext::constructAggregate:

[screenshot of HlslParseContext::constructAggregate]

The direction of the merge here is a bit strange, because the default settings were not copied into type.getQualifier() before fixBlockUniformOffsets ran — unless UBOs are intentionally meant to default to scalar alignment when translating from HLSL to SPIR-V. If that is the case, my PR is indeed incorrect, because I noticed that defaultQualification is in fact used later to update memberQualifier; I would then need to look for a way to adjust the layout manually.

I looked at it carefully, and I think the code sequence here is reversed:
[screenshot of the code in question]
I will make the necessary modifications before trying again.

If you could provide a response to my question, I would greatly appreciate it.

@mrcino force-pushed the fix_default_layout_qualifiers branch from 6118821 to 2b9374b on February 3, 2024
3 participants