Off-axis projection system for virtual production (nDisplay) #1406
Replies: 4 comments 1 reply
-
The XR_Action branch will be able to do the tracking once it merges.
-
That's fantastic; that's another key element of this. The issue with Unreal is that its system is incredibly unstable and difficult to configure, but we all slog through it because it's currently the only option available.
-
https://www.youtube.com/watch?v=bhVAXoPBheE
I've had a great time with WMR and the Samsung Odyssey+.
I also wrote a script to record animation inside UPBGE, including armatures:
https://www.youtube.com/watch?v=cUXQkSkzpqU
-
This was solved in the past with this patch: https://developer.blender.org/rB2074128fadbfd58ea13a68cbccaa1f6771bbd710 However, that code was removed during the refactor of UPBGE's render pipeline and the later refactor to adapt to the EEVEE renderer. It could still serve as a starting point. I'm moving this to Discussions, as it's a nice-to-have feature and quite interesting.
-
In Virtual Production, there are thousands of people looking for an alternative to Unreal for xR video-wall workflows.
https://youtu.be/OV7WKh_fLFk
The feature in Unreal that makes this possible is nDisplay, which provides both off-axis (asymmetric frustum) rendering and multi-node rendering for large scenes.
https://docs.unrealengine.com/en-US/WorkingWithMedia/nDisplay/index.html
Essentially, nDisplay projects the view from a virtual camera onto a mesh that represents the physical LED screen being used. The texture of that mesh becomes the render target, and the image looks correct from the tracked viewer's perspective in the physical world.
UPBGE already has the ability to render to a texture, and people have done this in the past with the BGE, but making it easy to use could draw a massive number of people into the project.
http://paulbourke.net/papers/blender10/blenderpaper.pdf
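For reference, the off-axis (asymmetric frustum) math itself is compact: given the tracked eye position and the corners of the physical screen, you can build the projection matrix directly. Below is a minimal numpy sketch following the well-known "generalized perspective projection" formulation (the same math Paul Bourke's paper covers); the function name and the near/far values are illustrative, and in UPBGE the resulting matrix would have to be fed to a camera's projection each frame, which this sketch does not attempt.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Asymmetric-frustum projection for a tracked viewer and a fixed screen.

    pa, pb, pc -- lower-left, lower-right, upper-left screen corners (world space)
    pe         -- tracked eye position (world space)
    Returns a 4x4 matrix mapping world-space points to clip space.
    """
    pa, pb, pc, pe = (np.asarray(p, float) for p in (pa, pb, pc, pe))

    # Orthonormal screen basis: right, up, and normal (pointing toward the eye).
    vr = pb - pa; vr /= np.linalg.norm(vr)
    vu = pc - pa; vu /= np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)  # perpendicular distance from the eye to the screen plane

    # Frustum extents projected onto the near plane (these go asymmetric
    # as soon as the eye moves off the screen's center axis).
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style perspective matrix.
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

    # Rotate world space into the screen's basis, then move the eye to the origin.
    M = np.eye(4); M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T

# Sanity check: wherever the eye is, the physical screen corners must land
# exactly on the edges of normalized device coordinates (x, y = ±1).
pa, pb, pc = [-1, -1, -2], [1, -1, -2], [-1, 1, -2]
proj = off_axis_projection(pa, pb, pc, pe=[0.5, 0.2, 0.0], near=1.0, far=10.0)
for corner, ex, ey in [(pa, -1, -1), (pb, 1, -1), (pc, -1, 1)]:
    clip = proj @ np.array([*corner, 1.0])
    ndc = clip[:3] / clip[3]
    assert abs(ndc[0] - ex) < 1e-9 and abs(ndc[1] - ey) < 1e-9
```

The key property (and the whole point for LED-wall work) is the invariant the assertions check: the screen corners stay pinned to the NDC borders no matter where the tracked eye moves, so the rendered texture always lines up with the physical panel.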