
fix(ext/webgpu): GPUDevice.createTexture works when the argument descriptor.size is an Array #23413

Merged
merged 20 commits into denoland:main May 6, 2024

Conversation

Hajime-san
Contributor

fixes #22733

validation

BTW, should we implement validation according to the spec?

The spec indicates GPUDevice.createTexture needs validations such as validate GPUExtent3D shape(descriptor.size), among others.

Here is a minimal example of the validation.

  • index.html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body>
    <script>
        (async () => {
            const adapter = await navigator.gpu.requestAdapter();
            const device = await adapter.requestDevice();
            const texture = device.createTexture({
                label: "Capture",
                // this array length overflows the spec
                size: [256, 256, 1, 0],
                format: "rgba8unorm-srgb",
                usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.COPY_SRC,
            });
        })()
    </script>
</body>
</html>

Running it on Google Chrome (122.0.6261.112), the console shows the following message.

Uncaught (in promise) TypeError: Failed to execute 'createTexture' on 'GPUDevice': A sequence of number used as a GPUExtent3D must have between 1 and 3 elements.
    at index.html:12:36

The same JavaScript code throws no error on deno 1.42.3.

@crowlKats
Member

The spec indicates GPUDevice.createTexture needs validations like validate GPUExtent3D shape(descriptor.size). and other.

in this case, yes, we should add validation in normalizeGPUExtent3D that the array length is correct; however, we don't need to validate other things since wgpu will validate them.
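A minimal sketch of what such a length check could look like (hypothetical, simplified; the real normalizeGPUExtent3D in ext/webgpu/01_webgpu.js uses the internal primordials and may differ in shape):

```javascript
// Hypothetical sketch, not the actual Deno implementation: normalize a
// GPUExtent3D given either as a sequence ([w, h?, d?]) or as a dict,
// rejecting sequences whose length is outside the spec's 1..3 range.
function normalizeGPUExtent3D(size) {
  if (Array.isArray(size)) {
    if (size.length < 1 || size.length > 3) {
      throw new TypeError(
        "A sequence of number used as a GPUExtent3D must have between 1 and 3 elements.",
      );
    }
    return {
      width: size[0],
      height: size[1] ?? 1,
      depthOrArrayLayers: size[2] ?? 1,
    };
  }
  return {
    width: size.width,
    height: size.height ?? 1,
    depthOrArrayLayers: size.depthOrArrayLayers ?? 1,
  };
}
```

With this, `[256, 256, 1, 0]` would throw the same kind of TypeError Chrome reports, while `[256, 256]` normalizes to `{ width: 256, height: 256, depthOrArrayLayers: 1 }`.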

@crowlKats
Member

I don't see how this patch would fix the issue: you are just moving the normalization out of the object definition after the spread to be redefined on the object itself, which shouldn't change any behaviour.

@Hajime-san
Contributor Author

In the createTexture function, the descriptor argument is used by both op_webgpu_create_texture and createGPUTexture.
So I think the normalized descriptor.size should be assigned before createGPUTexture is called, because createGPUTexture reads descriptor.size internally.

deno/ext/webgpu/01_webgpu.js

Lines 2249 to 2251 in ebc22d9

texture[_width] = descriptor.size.width;
texture[_height] = descriptor.size.height;
texture[_depthOrArrayLayers] = descriptor.size.depthOrArrayLayers;

Here is an example and its result. My patch guarantees that width, height, and depthOrArrayLayers are assigned.

const adapter = await navigator.gpu.requestAdapter();
const device = await adapter.requestDevice();
const texture = device.createTexture({
  label: "Capture",
  size: [256, 256],
  format: "rgba8unorm-srgb",
  usage: GPUTextureUsage.RENDER_ATTACHMENT | GPUTextureUsage.COPY_SRC,
});
console.log(texture)
deno --version                       
deno 1.42.3 (release, aarch64-apple-darwin)
v8 12.3.219.9
typescript 5.4.3
user@user deno % deno run --unstable-webgpu texture.ts             
GPUTexture {
  label: "Capture",
  width: undefined,
  height: undefined,
  depthOrArrayLayers: undefined,
  mipLevelCount: 1,
  sampleCount: 1,
  dimension: "2d",
  format: "rgba8unorm-srgb",
  usage: 17
}
user@user deno % target/debug/deno run --unstable-webgpu texture.ts
GPUTexture {
  label: "Capture",
  width: 256,
  height: 256,
  depthOrArrayLayers: 1,
  mipLevelCount: 1,
  sampleCount: 1,
  dimension: "2d",
  format: "rgba8unorm-srgb",
  usage: 17
}

@crowlKats
Member

Oh I see, I misunderstood the patch, my bad

@Hajime-san
Contributor Author

in this case, yes we should add validation in normalizeGPUExtent3D that the array length is correct, however we don't need to validate other things since wgpu will validate.

Thanks!
How do we guarantee the validation is implemented correctly? Is it covered by web platform tests?

@crowlKats
Member

we don't currently run web platform tests for webgpu, so for now adding a test in webgpu_test.ts would be the best.
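A regression test along these lines could cover both the normalization and the length validation (a sketch only; the real test in webgpu_test.ts would run against an actual GPUDevice rather than a standalone helper, and the function names here are illustrative):

```javascript
// Sketch of the kind of regression test that could go in webgpu_test.ts
// (hypothetical). It checks that an array-form size yields defined
// dimensions and that an over-long GPUExtent3D sequence is rejected.
function normalizeGPUExtent3D(size) {
  if (Array.isArray(size)) {
    if (size.length < 1 || size.length > 3) {
      throw new TypeError(
        "A sequence of number used as a GPUExtent3D must have between 1 and 3 elements.",
      );
    }
    return { width: size[0], height: size[1] ?? 1, depthOrArrayLayers: size[2] ?? 1 };
  }
  return size;
}

function textureCreationWithArraySizeTest() {
  // Array-form size must normalize to concrete dimensions, not undefined.
  const size = normalizeGPUExtent3D([256, 256]);
  if (size.width !== 256 || size.height !== 256 || size.depthOrArrayLayers !== 1) {
    throw new Error("array-form size was not normalized");
  }
  // A 4-element sequence is out of the spec's 1..3 range and must throw.
  let threw = false;
  try {
    normalizeGPUExtent3D([256, 256, 1, 0]);
  } catch (e) {
    threw = e instanceof TypeError;
  }
  if (!threw) throw new Error("over-long GPUExtent3D sequence was not rejected");
  return "ok";
}
```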

@crowlKats
Member

also, could you revert the wpt submodule update?

@Hajime-san
Contributor Author

Oh I see, I misunderstood the patch, my bad

Never mind! This behavior is a little hard to read, so I'll add a comment.

we don't currently run web platform tests for webgpu, so for now adding a test in webgpu_test.ts would be the best.

sure.

also, could you revert the wpt submodule update?

Oops. I'll fix it.

@Hajime-san
Contributor Author

Hajime-san commented Apr 19, 2024

Hi @crowlKats, I added some commits! Please check it.

BTW, it seems the scope of this PR has widened beyond the title I gave it when I created it 🤔

@crowlKats
Member

@Hajime-san apologies about the delay on this! I don't think the changes related to copyExternalImageToTexture should be added, could you please revert those?

@crowlKats
Member

crowlKats left a comment

LGTM, thanks!

@crowlKats crowlKats merged commit a635abb into denoland:main May 6, 2024
17 checks passed
@Hajime-san Hajime-san deleted the fix-webgpu-device-createtexture branch May 6, 2024 14:03
Successfully merging this pull request may close these issues.

WebGPU: texture width and height are undefined when descriptor size is an array.