This AI-assisted method uses Stable Diffusion and ControlNet within Rhino and Grasshopper. It streamlines the design process, allowing for rapid visual experimentation and iteration directly within the 3D modeling workflow.
The AIR STREaM project followed a "Research through Design" methodology, exploring various AI tools across text, image, and audio domains. This iterative process involved testing and integrating platforms like Stable Diffusion and ControlNet into existing 3D modeling workflows to enhance real-time rendering capabilities.
Conventional workflows are time-consuming: moving from a 3D model to a final rendering involves multiple steps, which slows the design feedback loop.
The new pipeline integrates AI to provide immediate visualization during 3D modeling.
Below are some samples demonstrating the depth extraction of a 3D model and its AI render.
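The depth-extraction step that feeds ControlNet can be sketched as follows. This is a minimal illustration under assumptions, not the project's actual Grasshopper script: it assumes the viewport's z-buffer is available as a NumPy array, and the function name `zbuffer_to_depth_map` is hypothetical. It converts raw depth values into the 8-bit inverted depth image (near = bright) that ControlNet's depth model expects.

```python
import numpy as np

def zbuffer_to_depth_map(z: np.ndarray) -> np.ndarray:
    """Normalize a raw z-buffer into an 8-bit inverted depth image
    (nearest geometry = 255, farthest = 0), the convention used by
    ControlNet depth conditioning."""
    z = z.astype(np.float64)
    z_min, z_max = z.min(), z.max()
    if z_max == z_min:
        # Degenerate flat scene: return mid-gray everywhere.
        return np.full(z.shape, 128, dtype=np.uint8)
    norm = (z - z_min) / (z_max - z_min)          # 0.0 = near, 1.0 = far
    return ((1.0 - norm) * 255).round().astype(np.uint8)

# Toy 2x2 z-buffer: the nearest point maps to 255, the farthest to 0.
depth = zbuffer_to_depth_map(np.array([[1.0, 2.0],
                                       [3.0, 4.0]]))
```

In a full pipeline, this depth image would then be supplied as the conditioning image to a depth-conditioned Stable Diffusion pipeline (for example, `StableDiffusionControlNetPipeline` in Hugging Face's `diffusers` library) to produce the AI render of the modeled geometry.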