#954 Bevy render PoC #960

Conversation


@tychedelia tychedelia commented Jan 26, 2024

Initial Render/Draw PoC

This pull request implements the port of our existing rendering pipeline for #954, as well as an initial port of the draw API for #955. The goal has mostly been fidelity to the original code, and while the draw API works, significant features provided by App and Frame are still missing. That is, you can issue draw commands, but not much else.

General Description

At a high level, bevy_nannou_draw creates instances of Draw which exist in the main ECS game world. On every frame, that Draw is extracted by bevy_nannou_render into the render world and written into a ViewMesh component, which is eventually rendered by our NannouViewNode in the bevy render graph.
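
A minimal sketch of that extraction step, with a placeholder Draw type standing in for the real component (not the PR's actual code):

use bevy::prelude::*;
use bevy::render::{Extract, ExtractSchedule, RenderApp};

// Placeholder for the real draw-state component carried by each camera.
#[derive(Component, Clone, Default)]
struct Draw;

fn extract_draw(
    mut commands: Commands,
    // `Extract` gives a render-world system read access to the main world.
    cameras: Extract<Query<(Entity, &Draw), With<Camera>>>,
) {
    for (entity, draw) in cameras.iter() {
        // Reuse the main-world entity id so view-related components line up.
        commands.get_or_spawn(entity).insert(draw.clone());
    }
}

fn register(app: &mut App) {
    // The extract system runs in the render sub-app's ExtractSchedule.
    app.sub_app_mut(RenderApp)
        .add_systems(ExtractSchedule, extract_draw);
}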

There are a bunch more fiddly details that I worked through here, but the goal was, in all places, to use the bevy equivalent, which means we are primarily using:

  • Bevy's view and windowing system.
  • Bevy's camera view projection for our shader.
  • Bevy's "intermediate" texture which provides the "sketch" like behavior and is written to the swap chain for us.
  • Bevy's depth texture and MSAA.
  • Bevy's render graph, which means post-processing effects should (?) impact our sketch.

A general outline of the bevy systems that wire this together:

  1. In the spawn_draw system, for each new Added<Camera>, we insert a Draw component on the Camera entity. This allows later rendering code to tie a Draw to a specific view. A Camera can have a few different types of render targets, but the main one (and the only thing we support right now) is a window. So, without getting into more complicated multi-camera setups, think camera = view = window. (See the sketch after this list.)
  2. In the renderer, we set up bind groups for our shader using bevy's view uniform, as well as any user supplied textures. The handling around textures is somewhat complicated right now, as bevy's asset system is fully asynchronous (see the comment on Abstract the nannou::draw module into the new bevy_nannou_draw crate #955).
  3. We extract Draw into the render world, where our main prepare_view_mesh system populates a ViewMesh component and a list of render commands stored in ViewRenderCommands. Both of these are inserted as components on an ExtractedView, which is the render world equivalent of the Camera we inserted Draw on.
  4. We run our NannouViewNode in the render graph at the end of the core 3d pass, which executes our nannou shader and writes to the texture provided by bevy. This means that (a) we can draw on top of things rendered in the bevy pbr mesh pipeline, and (b) we hook into features like MSAA for free.
  5. Render ends and all of our components are dropped for the next frame.
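
A rough sketch of steps 1 and 3 above (the component fields and system bodies are placeholders, not the PR's actual implementation):

use bevy::prelude::*;
use bevy::render::view::ExtractedView;

#[derive(Component, Clone, Default)]
struct Draw; // queued draw state for one camera/view

#[derive(Component, Default)]
struct ViewMesh; // vertex/index data built from the Draw

#[derive(Component, Default)]
struct ViewRenderCommands; // ordered draw calls consumed by NannouViewNode

// Step 1 (main world): give every newly added camera its own Draw component.
fn spawn_draw(mut commands: Commands, new_cameras: Query<Entity, Added<Camera>>) {
    for entity in new_cameras.iter() {
        commands.entity(entity).insert(Draw::default());
    }
}

// Step 3 (render world): for each extracted view carrying a Draw, build the mesh
// and the render command list for this frame.
fn prepare_view_mesh(
    mut commands: Commands,
    views: Query<(Entity, &Draw), With<ExtractedView>>,
) {
    for (entity, _draw) in views.iter() {
        // The real system converts the draw state into GPU buffers and commands;
        // empty placeholders keep this sketch self-contained.
        commands
            .entity(entity)
            .insert((ViewMesh::default(), ViewRenderCommands::default()));
    }
}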

How to review this pull request

Most of the files that have been copy/pasted from the existing code base into the bevy crates have been relatively lightly modified, should be mostly 1:1 copies, and likely do not need super close review. I went back and forth on whether these should just be modified in place to provide a better diff, but that would break the build on our branch, so I think the cost of losing the diff history is worth it.

The primary changes were:

  • Use bevy's wgpu wrappers
  • Replace Rc<RefCell<T>> in draw with Arc<RwLock<T>> (see the sketch after this list)
  • Replace Point2 with Vec2, etc.
  • Use bevy's Handle<Image> instead of wgpu::TextureView
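
A small illustration of the Rc<RefCell<T>> to Arc<RwLock<T>> change: Bevy components must be Send + Sync, so draw's shared interior state needs a thread-safe wrapper (illustrative only, not the PR's code):

use std::sync::{Arc, RwLock};

#[derive(Default)]
struct DrawState; // placeholder for the accumulated draw commands

fn main() {
    let state = Arc::new(RwLock::new(DrawState::default()));

    // Cloning the Arc hands out another thread-safe reference, something an
    // Rc<RefCell<DrawState>> could not do across a thread boundary.
    let handle = Arc::clone(&state);
    std::thread::spawn(move || {
        let _guard = handle.write().unwrap(); // exclusive access to mutate
    })
    .join()
    .unwrap();

    let _read = state.read().unwrap(); // shared read access back on this thread
}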

TODO:

  • Support user-uploaded textures using the Bevy asset system.
  • MSAA support.
  • Confirm we can use bevy's depth texture.
  • Figure out if text still works.
  • Re-implement the draw command analog to support switching textures, rendering multiple meshes, scissor, etc.
  • Move ViewMesh to the render world in prep for draw work.

@mitchmindtree mitchmindtree linked an issue Jan 29, 2024 that may be closed by this pull request
@tychedelia tychedelia marked this pull request as ready for review January 31, 2024 23:56
@tychedelia (Collaborator Author):
this isn't necessary long term but is a convenient way to run things for now

@tychedelia (Collaborator Author) commented Feb 1, 2024

This diff is pretty hard to review, but I wanted to give @mitchmindtree and @JoshuaBatty a chance to review or comment prior to merging this into our bevy-refactor branch in preparation for some initial Draw work. There are some outstanding issues here that will be easier to fix once more robust examples are running. I've tried to mark things with TODO, but this should still be understood as a first pass.

The main file I would look at is pipeline.rs, which outlines the basic flow. Some things are currently missing there as a consequence of lacking the draw API; for example, right now we don't even create pipelines (removed in a37611e). Happy to answer more questions here, and I'm going to add a few notes to #954 in terms of additional learnings.

@tychedelia tychedelia changed the title #945 Bevy render PoC #954 Bevy render PoC Feb 1, 2024
..default()
});

commands.spawn(
@tychedelia (Collaborator Author):
we should probably spawn the camera for the user, since we need to manage things like the orthographic projection settings and the clear color.
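
A rough idea of what spawning the camera on the user's behalf might look like (the projection values and clear color here are placeholders, not settled decisions):

use bevy::prelude::*;

fn setup_camera(mut commands: Commands) {
    commands.spawn(Camera3dBundle {
        // Orthographic projection, roughly matching nannou's pixel-based coordinates.
        projection: OrthographicProjection {
            near: -1000.0,
            far: 1000.0,
            ..Default::default()
        }
        .into(),
        ..Default::default()
    });
    // Manage the clear color for the user instead of leaving Bevy's default.
    commands.insert_resource(ClearColor(Color::BLACK));
}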

brightness: 1.0,
});
commands.spawn(PbrBundle {
mesh: meshes.add(Mesh::from(Torus {
@tychedelia (Collaborator Author):
this was just to test that we draw correctly over bevy meshes (we do)

@@ -7,3 +7,4 @@ target/
**/*.rs.bk
.DS_Store
.idea/
result/
@tychedelia (Collaborator Author):
@mitchmindtree I ignored this, but it's also accidentally checked in. What's the best practice here for nix? I was surprised to see this not already checked in since you use nix; do you have this in your global .gitignore?

Member:
I think you've done the right thing here! I think I'm just in the habit of clearing result and forgot to add it 😅

@JoshuaBatty (Member) commented Feb 9, 2024

Awesome work @tychedelia! So cool to see this, and encouraging how much we can offload to bevy so far. I'm happy with the approach taken here; I'm sure there will be some more changes as you go along, so I don't want to hold up this PR and the flow.

I'll wait for @mitchmindtree to have a look and give the final approval but looks good from my end.

@tychedelia (Collaborator Author):
(Note for self) Right now we're using a ViewNode to drive the render, but I think that we might be able to more idiomatically render our mesh using Bevy's PhaseItem/RenderCommand pattern. See the bevy_gizmos crate for an example of what this might look like.

@tychedelia (Collaborator Author):
Waiting on NixOS/nixpkgs#289940.

@tychedelia tychedelia merged commit bf58f0d into nannou-org:bevy-refactor Feb 27, 2024
15 checks passed