
Spatial Capture

NeRF / Photogrammetry / Point Cloud

Three approaches to capturing the real world in 3D. Neural radiance fields and Gaussian splats for photorealistic volumetric scenes. Photogrammetry for production-ready textured meshes. Point clouds for raw spatial data at scale. Each method has a different output, a different strength, and a different place in the pipeline.

3 Capture Methods
9+ NeRF / Splat Scenes
Real-Time 3DGS Playback
Phone: Primary Capture Device
Role: Creator / Capture Artist
Methods: NeRF, 3DGS, Photogrammetry, Point Cloud
Capture: iPhone, LiDAR, Drone
Tools: Luma AI, Polycam, Blender, Nerfstudio
Output: Volumetric Scenes, Meshes, Point Clouds
Subjects: Architecture, People, Objects, Nature

Three Methods, Three Outputs

NeRF / Gaussian Splat

Volumetric Scene

A neural or point-based representation of a scene learned from photos or video. The model encodes how light behaves in the space — reflections, translucency, specular highlights — and renders photorealistic novel viewpoints in real time.

Best For

Immersive walkthroughs, preserving lighting conditions, web-delivered 3D, virtual tours, film previs, archival of spaces as they actually look and feel.

Output

Navigable volumetric scene (WebGL, splat file)
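The "how light behaves along a viewing ray" idea above can be sketched in a few lines. This is an illustrative toy, not this project's renderer: it shows the alpha-compositing rule NeRF-style methods use to turn per-sample density and color into a pixel color. The sample densities and colors below are made up for demonstration.

```python
import math

def composite_ray(densities, colors, step=0.1):
    """Alpha-composite samples along a ray: C = sum_i T_i * alpha_i * c_i."""
    color = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light not yet absorbed along the ray
    for sigma, c in zip(densities, colors):
        alpha = 1.0 - math.exp(-sigma * step)   # opacity of this sample
        weight = transmittance * alpha
        color = [acc + weight * ch for acc, ch in zip(color, c)]
        transmittance *= 1.0 - alpha            # attenuate for later samples
    return color

# A ray crossing empty space, then a dense red region:
print(composite_ray([0.0, 0.0, 5.0, 5.0],
                    [[0, 0, 0], [0, 0, 0], [1, 0, 0], [1, 0, 0]]))
```

Gaussian splatting uses the same compositing rule, but with sorted 2D-projected Gaussians instead of ray samples, which is what makes real-time playback feasible.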

Photogrammetry

Textured Mesh

Reconstructs actual polygon geometry with UV-mapped photo textures from overlapping images. Produces a real 3D mesh — vertices, faces, normals — that can be edited, rigged, animated, 3D printed, or dropped into any standard 3D pipeline.

Best For

3D printing, game assets, VFX integration, product visualization, object scanning, anything that needs a manipulable mesh with real-world texture.

Output

Textured 3D mesh (OBJ, FBX, GLB, USDZ)
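To make "vertices, faces, normals" concrete, here is a minimal sketch of the Wavefront OBJ format listed above: plain-text `v` lines for positions and 1-indexed `f` lines for faces. The writer, filename, and single-triangle mesh are invented for illustration; real photogrammetry exports also carry UVs, normals, and material references.

```python
def write_obj(path, vertices, faces):
    """Write a minimal OBJ mesh: vertex positions and triangle faces only."""
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c in faces:
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")  # OBJ indices start at 1

# The smallest possible mesh: one triangle.
write_obj("triangle.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```

Because the format is this simple, the same file opens unmodified in Blender, game engines, and slicers for 3D printing.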

Point Cloud

Raw Spatial Data

Millions of individual 3D coordinates captured by LiDAR or depth sensors, each carrying color and position data. No surfaces, no mesh — just raw points in space. The most direct representation of scanned geometry before any processing or interpretation.

Best For

Surveying, architecture, construction documentation, large-scale environments, scientific measurement, forensic archival, and as raw input for mesh reconstruction.

Output

XYZ coordinate data with color (PLY, LAS, E57)
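The "raw points in space" description maps directly onto the PLY format named above. A hedged sketch, with a fabricated two-point ASCII sample: real scans hold millions of points and typically use PLY's binary encoding, but the structure — a header declaring per-point properties, then one point per line — is the same.

```python
SAMPLE_PLY = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
end_header
0.0 0.0 1.5 255 0 0
0.5 0.2 1.4 0 255 0
"""

def parse_ascii_ply(text):
    """Return (x, y, z, r, g, b) tuples from an ASCII PLY string."""
    lines = text.splitlines()
    body = lines[lines.index("end_header") + 1:]  # points follow the header
    points = []
    for line in body:
        x, y, z, r, g, b = line.split()
        points.append((float(x), float(y), float(z), int(r), int(g), int(b)))
    return points

print(len(parse_ascii_ply(SAMPLE_PLY)))  # 2 points
```

Note that nothing here implies surfaces: connectivity only appears later, if the points are fed into mesh reconstruction.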

How They Relate

These aren't competing technologies — they're complementary tools for different problems. Photogrammetry gives you a mesh you can manipulate in Blender, rig, animate, or send to a 3D printer. NeRFs and Gaussian splats give you a scene you can walk through with photorealistic lighting that a mesh can't replicate. Point clouds give you raw spatial truth — the measured coordinates of a space before any interpretation is applied.

In practice, they often feed into each other. A point cloud can be the starting data for a photogrammetric mesh. A photogrammetric scan can inform a NeRF training set. The same walk-around video footage can produce all three outputs depending on how it's processed. Understanding the strengths and limitations of each method is what determines which tool fits the job.

Quick Comparison

Need to walk through a space? → NeRF / Splat
Need an editable 3D model? → Photogrammetry
Need precise measurements? → Point Cloud
Need to 3D print it? → Photogrammetry
Need photorealistic lighting? → NeRF / Splat
Need raw scan data? → Point Cloud

NeRF / Gaussian Splat Captures

Each capture below is a fully navigable volumetric scene. Click and drag to orbit. Scroll to zoom. These are live 3D models running in your browser, not pre-rendered video.

[16 navigable 3D scene viewers load here]

Tech Stack

Luma AI · Nerfstudio · 3D Gaussian Splatting · Polycam · RealityCapture · iPhone LiDAR · DJI Drone · Blender · Structure from Motion · COLMAP · WebGL · Point Clouds · Photogrammetry
