
How to incorporate it into project? #75

Open
BeryEngine opened this issue Jan 18, 2023 · 9 comments

@BeryEngine

Hi, I'm very new to GitHub, and maybe I'm just inexperienced with it, but I read the Readme file of course, and I could not find how to add the ray-traced path into an existing project. Are there short steps to do it, instead of copy-pasting your work from your demo file? Maybe I missed something, but just in case...

Very impressed that someone has already adapted a ray tracer like this!

@erichlof
Owner

erichlof commented Jan 18, 2023

Hello @BeryEngine

Welcome to GitHub! I remember when I first joined back in 2013 (10 years ago now!), it took me a while to get used to everything - and I'm still learning how to use some of the features. I guess you're never quite done learning GitHub (lol)!

About incorporating my ray/path tracing rendering project into your projects, I'm sorry to report that at this time, it's not possible to just include my code, or drag and drop one of your three.js scenes into my rendering system.

Perhaps the name of my project, the Three.js PathTracing Renderer, is a little misleading, as it might imply that the end user can simply switch the rendering path from the default three.js WebGL renderer to my PathTracing renderer with the click of a button, or by adding a couple of lines of code that direct the renderer to do this.

This project has three.js in its title because it does indeed rely on the three.js library as its hosting environment. If I didn't start with three.js as the foundation, I would have had to hand-code all of the math libraries, glTF loaders, and raw WebGL setup, just to get something on screen without errors in the browser console. Instead, I rely on three.js for all the boilerplate and utility code so that I can focus on the ray tracing parts that I'm interested in.

But since three.js wasn't created or designed with this ray tracing rendering path in mind, I've had to sort of 'hack' into it and divert all of its object definitions and rendering to the GPU through custom shaders, written in GLSL. The bulk of my project lives entirely in these specialized GPU shaders.

Therefore, once I leave the comfort of the three.js default WebGL rendering system behind, I also lose the easy scene drag-and-drop functionality, as well as the ability to define scene objects in JavaScript with three.js commands like "let mySphere = new THREE.Sphere(geometry, material)", etc. In fact, if you look through my GLSL shaders, you'll see that the scene is hard-coded and defined entirely in the shader. In other words, the CPU and most of your computer has no idea about the scene and its contents. This is definitely not how the creators of three.js envisioned their library being used in the browser.
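
To give a concrete picture of what "hard-coded in the shader" means, here is a purely illustrative sketch (the struct layout and material constants are made up for this example, not copied from my shaders):

```glsl
// illustrative: the scene lives entirely inside the fragment shader
struct Sphere { float radius; vec3 position; vec3 color; int type; };

#define N_SPHERES 2
#define DIFFUSE 0 // hypothetical material-type constants
#define MIRROR  1

Sphere spheres[N_SPHERES];

void SetupScene()
{
    // the CPU/JavaScript side never sees these objects
    spheres[0] = Sphere(1000.0, vec3(0.0, -1000.0, 0.0), vec3(0.7), DIFFUSE); // huge ground sphere
    spheres[1] = Sphere(6.0, vec3(0.0, 6.0, 0.0), vec3(1.0, 0.2, 0.2), MIRROR); // red mirror ball
}
```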

That being said, if you still want to try out your own scene using my render path, you can either define (hard-code) your own objects and their size and placement inside your shader, or you can use the three.js library to load a single glTF model. There are plenty of examples in this repo of how to do both. However, please note that the glTF loading and rendering only works on very basic models with one material texture wrapped around the entire object. At this time, loading a scene full of different glTF models, or loading a glTF file with multiple components, each with its own transform and materials/textures, is not supported.
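
For the single-model route, the loading itself is just three.js's standard GLTFLoader; a minimal sketch (the step that re-packs the triangles into data textures for my shaders is omitted, and 'myModel.glb' is a placeholder path):

```js
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

const loader = new GLTFLoader();
loader.load('myModel.glb', (gltf) => {
  gltf.scene.traverse((child) => {
    if (child.isMesh) {
      // child.geometry.attributes.position holds the raw vertex data
      // that a path tracer would pack into a GPU data texture
    }
  });
});
```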

One of my long-term goals is to do exactly what you are requesting: to get to the point where you can simply call familiar three.js commands like 'new THREE.PointLight' and 'new THREE.Sphere' or 'new THREE.Box' and it automatically gets sent over to my shader ray tracing system. The same goes for drag-and-drop scenes and glTF models/scenes. But this will require a lot more careful study and planning on my part before I'm able to implement functionality like that.

Hope this helps and clears up the state of usability of my project within other projects. If you have any other questions, please feel free to post here. :)

-Erich

@yure-r

yure-r commented Jan 19, 2023

Hi @erichlof!

I've been following this project for quite some time and I wanted to ask a similar question about functionality, so I'm glad to know that your goals for the future include the ability to bring a normal THREE.js scene into your rendering system via a "drop-in" function of some sort.

I looked into a couple of renderers that do exactly what you're describing, and I'm sure you're already aware of both of them (especially since the three.js ray tracing scene right now is extremely small):

lgltracer is a really efficient rendering system that compresses the scene into a BVH and then renders the scene from that BVH; it seems to use a lot of efficient estimation algorithms to make the path tracing happen faster. Unfortunately it's not open source and it's no longer maintained, which was really heartbreaking to me (but apparently the creator is looking into WebGPU?? (but so are you!!)).

Later I found your renderer, which I thought had that same functionality, and given how much time and energy you spent on it, it was also a bit heartbreaking knowing that I'd need to figure out how to define objects in GLSL in order to employ your really diligent work in this ray tracer. I'm really holding out hope that you could include some sort of conversion system for your renderer as well. It runs incredibly fast, and you've spent a great deal of time and energy over a span of literal years to get here; a conversion system would allow thousands of people to access your work in an easier way. Since you spent so much time getting things like caustics and emission to work, I think many would benefit a lot from being able to play around with this in real time in the browser. I'm sure you know how groundbreaking it is, but I just can't stress enough how this has made me reconsider what the browser can do, and the kind of wild computational possibilities that are made accessible by something like this.

Later I found three-gpu-pathtracer which you totally know about, because in my lurking I saw that you mentioned the creator at some point, and that he helped you accomplish something (he's really nice!). It's also a wonderful rendering system, and also uses a BVH conversion to render the scene. I think maybe your best bet is to work with him to figure out how to implement a similar conversion system to your renderer so that everything can play nicely together.

I don't want to overstep, but lately I've just been very passionate about ray tracing, and since I found out that it can be accomplished in the browser (let alone run in real time?????) I've been floored to make things with it, and yearning to use your renderer to do it.

I'm not sure why I decided to stop lurking now, but I'm sure that I'm not the only person who would be impacted in a severely wonderful way by being able to use your renderer to make wild, emissive, caustic, and maybe sub-surface-scattered things in the browser.

I know lgltracer and three-gpu-pathtracer are perfectly usable renderers, but you've been working on your renderer for far longer, and you support a much wider variety of light-transport estimations than the other renderers do, which is why I'm so particular about using yours in my work.

Thank you for reading :)

@BeryEngine
Author

Thank you for everything, it's very interesting. I may look a bit further to see if I can use a simple trick, but nothing is simple in coding, obviously; anyway, I will find something to fill my need.

At the moment I'm developing my 3D website (a very contemplative presentation of my design work), so I will review my graphical objectives and add some pretty stuff to make it look fine. I'm very good at 3D modeling and UV editing, so I will probably look into post-processing and complex shaders!!

@erichlof
Owner

Hello @yure-r !
Thank you for your kind words and support. It makes me happy knowing that people like yourself are enjoying my project and getting some benefit from it, whether it's more educational in nature, or practical as in a library or utility for their own projects.

Also thank you for the links and suggestions - as you mentioned, I was aware of Garrett's three gpu pathtracer project, but I haven't looked deeply into how he loads and handles glTF files and/or regular three.js scenes defined in the usual JavaScript way. And if I don't understand something in his code, you're right - he would help me out (he is nice indeed!).

As for lgltracer, I had tried it out when it was first made public, but until you mentioned it, I actually hadn't been checking in on the project. It has come a long way! The glTF model drag-and-drop-to-path-trace functionality is really cool. That utility alone is a game changer in the web ray tracing space. I would love to have something like that. Also, the button that lets you switch between ray traced and normal three.js WebGL rendering paths is really slick. And the newest feature that I wasn't even aware of until today, an editor - wow! It was so cool to be able to drag pieces of the scene/model around, change material parameters, and then the path tracer rebuilds the BVH in the background, and voila, your updated glTF model/scene is path traced! That must have taken a monumental effort. It's a shame the developer doesn't want to make it open source. We could learn so much from that one tool/feature that he implemented.

You may have read my replies elsewhere on this message board that say that although I am comfortable with the nuts and bolts of rendering - ray tracing, path tracing, etc. - my weaknesses lie in data consumption, loading and parsing files like glTF, and scene graph representation (as in scene editors). I keep hoping that one of these days someone like the creator of lgltracer will come along and submit a PR that handles glTF-to-path-traced triangle representation, with all of its details and edge cases. Until then, however, I will keep plodding along in this realm. I mainly began this project because I love all things ray tracing, but I'm realizing that at a certain point we need to face the reality of the general user's experience and ease-of-integration with their own glTF models (which are now the web standard), and/or scenes created on the JavaScript side using the familiar three.js library calling conventions.

As an experiment, I might try to start simple and make a JavaScript conversion routine that takes a three.js library call, like new THREE.PointLight, new THREE.Sphere, or new THREE.Box, and generates a ray-tracing-friendly intersectable shape inside my custom shaders. I can almost see how that would work, but the devil is in the details, ha, so I will just have to dive in. I think that if I at least got something simple working that handles familiar three.js JS library shape-construction calls, the end user could, in theory, define a scene purely in JavaScript, the path tracing representation utility would do its magic under the hood, and then we could have path traced three.js scenes. Not all of the calls can have a direct 1:1 relationship with the shader geometry ray tracing representation, but I'm confident I could get the more commonly used shapes like spheres, cones, cylinders, boxes, planes, etc. working.

Then on the glTF front, I might ask Garrett how he handles glTF models with multiple, transformed components and boils all of that down into a data structure that can be fed into a BVH. Multiple materials with possibly different textures, all in the same model file, will be the most challenging thing for me personally, because like I said, I don't work daily in that space like I do with ray tracing, and a lot of the data handling/parsing is foreign to me. But I will keep at it!
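
Just to sketch the idea (all names below are hypothetical; nothing like this exists in the repo yet), the conversion routine might start out something like:

```js
import * as THREE from 'three';

// hypothetical sketch: walk a three.js scene and collect path-traceable spheres
function collectSpheres(scene) {
  const spheres = [];
  scene.updateMatrixWorld(true);
  scene.traverse((obj) => {
    if (obj.isMesh && obj.geometry.type === 'SphereGeometry') {
      spheres.push({
        center: new THREE.Vector3().setFromMatrixPosition(obj.matrixWorld),
        radius: obj.geometry.parameters.radius * obj.scale.x, // assumes uniform scale
        color: obj.material.color.clone()
      });
    }
  });
  return spheres; // would then be uploaded to the shader as uniforms or a data texture
}
```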

Thanks again for your encouragement!
😊
-Erich

@yure-r

yure-r commented Jan 20, 2023

@erichlof Thank you for your extremely generous response. I also just wanted to put a couple of bugs in your ear about WebGPU path tracing and/or using a BVH to index multiple materials on one mesh in a converted THREE.js scene.

Mainly about the latter: three-mesh-bvh, created and maintained by Garrett (which I know you know about), is how he manages to convert the three.js scene to be used in the pathtracer. I looked into multiple-material use because I realized that if you compress the entire scene into one object, multiple materials on a single mesh must be used somehow. I found that materials are indexed and defined by groups in that BVH (via a materialIndex buffer attribute), and there's some kind of logic in place to query the BVH for which material to show and when. There could be an optimization effort to place triangles with the same material next to each other, which would reduce the total number of groups and therefore cut draw calls from the number of groups down to the number of materials. Anyway... UV maps work! I tried to optimize it and failed miserably because this isn't my area of expertise (really I just like to have fun). I thought it may be helpful to let you know that the problem you mentioned is getting some action already, and it's probable that you can implement this to get most of the way to where you want to be.
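
As a purely illustrative sketch of what I mean by indexing (this is not Garrett's actual code), a per-triangle material index for a merged scene could be built like this:

```js
// illustrative only: assign one material index per triangle so a single
// merged mesh can look up the right material inside the shader
function buildTriangleMaterialIndices(geometries) {
  const indices = [];
  geometries.forEach((geometry, materialIndex) => {
    const vertexCount = geometry.index ? geometry.index.count
                                       : geometry.attributes.position.count;
    for (let t = 0; t < vertexCount / 3; t++) indices.push(materialIndex);
  });
  return new Uint16Array(indices); // packed into a texture alongside the BVH data
}
```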

I'm excited for what the future will bring for this project in particular. I think that you're at a great point here :) Thank you again!

@vinkovsky

Hello @erichlof! Is your email valid? I have something for you.

@erichlof
Owner

@vinkovsky Hi, Yes it's valid! :)

@SBtree-bit

@erichlof How is the progress on implementing this? I see that the last time you talked about this was in January of last year, and I'd like to know how far this has come.

@erichlof
Owner

Hi @SBtree-bit ,

I have made some small steps in the quest to have a total conversion utility in place (either three.js library commands to pathtracing representation, or drag-and-drop glTF scenes to pathtracing representation).

If you take a look at my Invisible Date demo, I instantiate every piece of scene geometry (about 50 various three.js shapes, each with its own material and transform) on the CPU/JavaScript side at startup, and then everything gets fed into a BVH builder, which allows efficient pathtracing on the GPU side. This is one step closer to being able to define any arbitrary scene with three.js JavaScript commands and then have it all run through my pathtracing renderer.
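
In rough outline, the startup step looks like the following sketch (variable names are hypothetical, and I'm using three.js's BufferGeometryUtils here as a stand-in for this repo's own BVH-builder plumbing):

```js
import { mergeGeometries } from 'three/addons/utils/BufferGeometryUtils.js';

// bake each shape's world transform into its vertices, then merge everything
// into one big triangle soup for the BVH builder
const geometries = sceneMeshes.map((mesh) => { // sceneMeshes: the ~50 shapes (hypothetical)
  const g = mesh.geometry.clone();
  mesh.updateWorldMatrix(true, false);
  g.applyMatrix4(mesh.matrixWorld);
  return g;
});
const merged = mergeGeometries(geometries);
// merged.attributes.position is then fed to the BVH builder and
// uploaded to the GPU as data textures for the path-tracing shader
```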

Unfortunately, on the glTF side, there hasn't been much progress, compared with the three.js shapes system I mentioned above. However, as it so happens, Garrett, the developer of the three-gpu-pathtracer, just tweeted that he is about to greatly simplify the use of his library!

https://twitter.com/garrettkjohnson/status/1769947279925162017?t=g9lvJabV_vm2EecdihF5kQ&s=19

Also in related news, mrdoob, the creator of three.js, recently took it upon himself to integrate Garrett's three-gpu-pathtracer into the three.js editor! Now it is as simple as selecting the 'realistic' render from the editor's drop-down menu!

This integration, coupled with Garrett's recent work to make his system as easy as possible for everyone to use, leads me to point you (and other users who want to use a pathtracing renderer for their projects instead of the default three.js WebGL renderer) to his project for the time being. Although Garrett has not implemented this ease-of-use feature yet, knowing his track record, it'll probably only take a couple of weeks! 😉

On my end, I will try to keep moving forward, but as I mentioned in previous replies, I am ultimately interested in the ray tracing side of things, and there are so many rendering rabbit holes that present themselves, plus I'm working on my new ray tracing YouTube series, so it's hard for me not to meander, ha! Anyway, please let us know here if you, or anyone else reading this issue thread, is able to easily use Garrett's new system in their own projects without too much trouble.

Cheers,
-Erich
