
01/01/2023
Sunday
Project Overview
This project explores the use of Unity and Luma3D to generate an animated visualization of particles and space. By combining generative AI with 3D reconstruction, I created a surreal digital environment that blends ambient motion with abstract spatial design. The goal was to experiment with mood-driven aesthetics and push the boundaries of how digital space can be experienced.
Inspo.
This project was conceptually inspired by Sujin Kim’s artist talk at Carnegie Mellon, where she discussed her experimental animations and use of tools like Luma3D. Her film Unforgotten and other works explored spatial storytelling through surreal visuals and volumetric techniques. I was drawn to how she reimagines digital space with emotional and atmospheric depth, which led me to experiment with similar methods to create a calm, particle-based animation in Unity.


left: Unforgotten (animated short) | right: idea sketch #4 by artist Sujin Kim
Luma 3D Application
...
Experimentation
I experimented with Luma3D across various environments and lighting conditions to better understand its strengths and limitations. Through testing, I found that it performs best in moderately sized spaces with consistent natural lighting. In contrast, artificial lighting, especially harsh or uneven sources, tended to disrupt the 3D generation process significantly, resulting in less stable or incomplete reconstructions.
workflow 1

workflow 2

Editing
I ultimately chose two Luma3D generations that depicted hallway-like environments. These spaces allowed for greater flexibility in camera movement due to the extended surface area captured in the Gaussian splat, creating a more immersive and spacious visual experience. During experimentation, I also discovered that the splat data could be visualized in two forms: the point-based debug format and the full splat rendering format, each offering different visual and performance characteristics. Below are some documented mistakes I made and lessons I learned (since it was my first time using Unity).
Mistakes and lessons learned:
- I accidentally used Unity's HDRP (High Definition Render Pipeline) instead of URP (Universal Render Pipeline), which caused material and texture loading issues.
- The Gaussian splat Unity workflow (by Toy's Gaussian Splat) was outdated, leading to compatibility issues and requiring downgraded features.
- I had to ensure all textures and models were URP compatible to avoid rendering problems.


example video: version 1 of scene 2

top image: workflow on Premiere Pro
Editing II
To curate the final video, I used Premiere Pro to layer and edit multiple recordings captured in Unity. Since I was using a tracked camera with keyframes, I realized I could switch between different environments and presets while keeping the exact same camera motion. This let me record several versions of the same scene, each with a different visual treatment. In Premiere, I overlaid these clips and adjusted their opacity and exposure settings to blend them together. The result was a surreal digital landscape that felt fluid and immersive, an aesthetic choice that reflected the mood I wanted to convey.
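The two ideas in this workflow, replaying the same keyframed camera path across different presets and then blending the resulting clips by opacity, can be sketched as plain interpolation. This is a minimal, hypothetical Python sketch, not Unity or Premiere Pro code; the keyframe values, brightness presets, and the 50% opacity are all illustrative assumptions.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def camera_position(keyframes, t):
    """Evaluate a keyframed path at time t.

    Because evaluation is deterministic, every recording pass that uses the
    same keyframes gets the exact same camera motion.
    """
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return lerp(p0, p1, (t - t0) / (t1 - t0))
    return keyframes[-1][1]

def blend(base, top, opacity):
    """Blend two brightness values, like stacking a clip at partial opacity."""
    return lerp(base, top, opacity)

# Illustrative camera path: (time, x-position) keyframes.
keys = [(0.0, 0.0), (1.0, 5.0), (2.0, 5.0)]

# Two "recordings" share the same camera motion but use different
# (hypothetical) brightness presets, 0.1 and 0.2.
pass_a = [camera_position(keys, t) * 0.1 for t in (0.0, 0.5, 1.0)]
pass_b = [camera_position(keys, t) * 0.2 for t in (0.0, 0.5, 1.0)]

# Overlay pass_b on pass_a at 50% opacity, like layering clips in Premiere.
final = [blend(a, b, 0.5) for a, b in zip(pass_a, pass_b)]
print(final)
```

The key design point is that the camera path is a pure function of time, so any number of visual treatments can be rendered against it and layered afterward without the motion drifting between passes.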