3D_Neural_Synthesis: Gaining Control with Neural Radiance Fields

This research introduces a novel 3D machine-learning-aided design approach for early design stages. Integrating language within a multimodal framework grants designers greater control and agency in generating 3D forms. The proposed method leverages Stable Diffusion and Runway's Gen-1 to generate 3D Neural Radiance Fields (NeRFs), surpassing the limitations of 2D image-based outcomes in aiding the design process. The paper presents a flexible machine-learning workflow taught to students in a conference workshop and outlines the multimodal methods used across text, image, video, and NeRFs. The resulting NeRF design outcomes are contextualized within a Unity agent-based virtual environment for architectural simulation and are experienced with real-time VFX augmentations. This hybridized design process ultimately highlights the importance of feedback loops and control within machine-learning-aided design processes.

George Guida (Harvard University / ArchiTAG); Daniel Escobar (OLA Research); Carlos Navarro (OFFICEUNTITLED)

Full Paper Upcoming.
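
As a rough illustration of the text-to-image-to-video-to-NeRF loop described above, the sketch below chains a Stable Diffusion text-to-image step to a video-to-NeRF step. The specific tools shown (Hugging Face diffusers for Stable Diffusion, Nerfstudio's CLI for NeRF training) and the file names (`concept.png`, `gen1_restyled_orbit.mp4`) are illustrative assumptions, not the workflow's prescribed implementation; the Runway Gen-1 restyling stage happens in Runway's own interface and appears only as a manual placeholder.

```python
"""
Minimal sketch of the text -> image -> video -> NeRF loop, assuming
diffusers for Stable Diffusion and Nerfstudio for NeRF training.
"""
import subprocess
from pathlib import Path

import torch
from diffusers import StableDiffusionPipeline

# 1. Text -> image: generate a concept image to drive the video restyling.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
concept = pipe("a porous timber pavilion, soft morning light").images[0]
concept.save("concept.png")

# 2. Image + video -> restyled video: done manually in Runway Gen-1, using
#    concept.png as the style reference on an orbit video of a massing model.
restyled_video = Path("gen1_restyled_orbit.mp4")  # exported from Runway (hypothetical name)

# 3. Video -> NeRF: one possible route is Nerfstudio's CLI (an assumption,
#    not the paper's prescribed tool). It extracts frames, estimates camera
#    poses, and trains a radiance field that can then be explored or exported.
subprocess.run(
    ["ns-process-data", "video", "--data", str(restyled_video),
     "--output-dir", "nerf_data"],
    check=True,
)
subprocess.run(["ns-train", "nerfacto", "--data", "nerf_data"], check=True)
```

The trained radiance field (or a mesh/point-cloud export of it) is what would then be brought into the Unity agent-based environment for simulation and real-time VFX.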
