Assignment 3

Neural Volume Rendering and Surface Rendering

A. Neural Volume Rendering 80 points

0. Transmittance Calculation 10 points

Transmittance calculation for a ray going through a non-homogeneous medium.
Transmittance Calculation
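
The calculation itself is a product of per-segment attenuations. Below is a minimal sketch under the assumption of piecewise-constant densities; tensor names are mine, not from the starter code.

    import torch

    def transmittance(sigmas, deltas):
        # T_i = exp(-sum_{j<i} sigma_j * delta_j) for piecewise-constant densities
        # sigmas, deltas: (n_rays, n_segments) absorption coefficients and segment lengths
        optical_depth = torch.cumsum(sigmas * deltas, dim=-1)
        # Shift right so segment i only accumulates the segments in front of it
        optical_depth = torch.cat(
            [torch.zeros_like(optical_depth[..., :1]), optical_depth[..., :-1]], dim=-1)
        return torch.exp(-optical_depth)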

1. Differentiable Volume Rendering 30 points

1.1-1.3 Ray Sampling and Visualization

Implementation of ray sampling from cameras and visualization of XY grid and ray bundles.
XY Grid Visualization

Ray Bundle Visualization
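
The starter code builds rays through a camera object; as a hedged illustration of the same idea, here is the standard pinhole formulation with an explicit intrinsics matrix K and camera-to-world pose c2w. Names and the y-down/z-forward convention are assumptions, not the assignment's camera API.

    import torch

    def get_rays(H, W, K, c2w):
        # Pixel grid over the image plane
        i, j = torch.meshgrid(torch.arange(W, dtype=torch.float32),
                              torch.arange(H, dtype=torch.float32), indexing="xy")
        # Ray directions in camera coordinates (x right, y down, z forward)
        dirs = torch.stack([(i - K[0, 2]) / K[0, 0],
                            (j - K[1, 2]) / K[1, 1],
                            torch.ones_like(i)], dim=-1)
        # Rotate into world coordinates and normalize; origins are the camera center
        rays_d = torch.nn.functional.normalize(dirs @ c2w[:3, :3].T, dim=-1)
        rays_o = c2w[:3, 3].expand(rays_d.shape)
        return rays_o.reshape(-1, 3), rays_d.reshape(-1, 3)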

1.4 Point Sampling

Implementation of stratified sampling along rays.
Sample Points Along Rays
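
A minimal sketch of the stratified sampler, assuming one jittered sample per evenly spaced depth bin; function and argument names are placeholders.

    import torch

    def stratified_sample(rays_o, rays_d, near, far, n_samples):
        # One uniform random sample inside each of n_samples bins between near and far
        n_rays = rays_o.shape[0]
        bins = torch.linspace(near, far, n_samples + 1)
        lower, upper = bins[:-1], bins[1:]
        t = lower + (upper - lower) * torch.rand(n_rays, n_samples)
        # Points along each ray: o + t * d, shape (n_rays, n_samples, 3)
        points = rays_o[:, None, :] + t[..., None] * rays_d[:, None, :]
        return points, t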

1.5 Volume Rendering

Implementation of volume rendering with color and depth computation.
Volume Rendering Result (Spiral Rendering)

Depth Map Visualization
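
A hedged sketch of the compositing step, assuming per-sample densities, colors, segment lengths, and depths have already been gathered; variable names are mine.

    import torch

    def volume_render(sigmas, rgbs, deltas, depths):
        # sigmas: (n_rays, n_pts, 1), rgbs: (n_rays, n_pts, 3),
        # deltas/depths: (n_rays, n_pts, 1) segment lengths and sample depths
        alphas = 1.0 - torch.exp(-sigmas * deltas)
        # Transmittance up to (but not including) each sample
        trans = torch.cumprod(torch.cat(
            [torch.ones_like(alphas[:, :1]), 1.0 - alphas[:, :-1]], dim=1), dim=1)
        weights = trans * alphas
        color = (weights * rgbs).sum(dim=1)    # expected color per ray
        depth = (weights * depths).sum(dim=1)  # expected termination depth per ray
        return color, depth, weights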

2. Optimizing a Basic Implicit Volume 10 points

2.1 Random Ray Sampling

Implementation of random ray sampling for efficient training.
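
A minimal sketch of the idea, assuming the full set of rays and ground-truth pixel colors has been precomputed; names are placeholders.

    import torch

    def sample_random_rays(rays_o, rays_d, pixel_colors, n_rays):
        # Pick a random subset of pixels/rays for each training step
        idx = torch.randint(0, rays_o.shape[0], (n_rays,))
        return rays_o[idx], rays_d[idx], pixel_colors[idx]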

2.2 Loss and Training

Mean squared error loss implementation and box parameter optimization.
Box Parameters:
Center: [INSERT BOX CENTER COORDINATES]
Side Lengths: [INSERT BOX SIDE LENGTHS]
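
A minimal sketch of one training step under these choices; the model, optimizer, and ray-bundle names are placeholders rather than the starter code's API.

    import torch

    def training_step(model, optimizer, ray_bundle, gt_colors):
        # Render the sampled rays and minimize MSE against the ground-truth pixel colors
        optimizer.zero_grad()
        pred_colors = model(ray_bundle)
        loss = torch.nn.functional.mse_loss(pred_colors, gt_colors)
        loss.backward()
        optimizer.step()
        return loss.item()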

2.3 Visualization

Optimized Box Spiral Rendering

3. Optimizing a Neural Radiance Field (NeRF) 20 points

Implementation of the NeRF MLP architecture and training on the lego bulldozer dataset.
NeRF Spiral Rendering

Write-up: I implemented the NeRF MLP with positional encoding for 3D coordinates and viewing directions. The architecture uses an 8-layer MLP with 256 hidden units to process position-encoded coordinates, outputting density and a feature vector. This feature vector is concatenated with encoded viewing directions and passed through a smaller RGB network to predict view-dependent color. I used ReLU activations for density and sigmoid for RGB values. The model successfully learned detailed geometry and realistic view-dependent effects on the lego bulldozer dataset.
    # NeRF MLP Architecture (class wrapper and imports added here for context;
    # embedding_dim_xyz / embedding_dim_dir are the positional-encoding output sizes)
    import torch

    class NeuralRadianceField(torch.nn.Module):
        def __init__(self, cfg, embedding_dim_xyz, embedding_dim_dir):
            super().__init__()
            # Trunk MLP over the positionally encoded 3D coordinates
            self.linears = torch.nn.ModuleList(
                [torch.nn.Linear(embedding_dim_xyz, cfg.n_hidden_neurons_xyz)] +
                [torch.nn.Linear(cfg.n_hidden_neurons_xyz, cfg.n_hidden_neurons_xyz)
                 for _ in range(cfg.n_layers_xyz - 1)]
            )

            # Density head, plus the feature vector handed to the color branch
            self.density_out = torch.nn.Linear(cfg.n_hidden_neurons_xyz, 1)
            self.feature_out = torch.nn.Linear(cfg.n_hidden_neurons_xyz, cfg.n_hidden_neurons_xyz)

            # View-dependent color branch: feature + encoded view direction -> RGB
            self.color_mlp = torch.nn.Sequential(
                torch.nn.Linear(cfg.n_hidden_neurons_xyz + embedding_dim_dir, cfg.n_hidden_neurons_dir),
                torch.nn.ReLU(),
                torch.nn.Linear(cfg.n_hidden_neurons_dir, 3),
            )
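
For completeness, a hedged sketch of how these layers are wired in the forward pass, continuing the class above; the encoded inputs are assumed to come from the positional encoders described in the write-up, and the exact signature is illustrative rather than copied from my implementation.

        def forward(self, encoded_xyz, encoded_dir):
            # encoded_xyz / encoded_dir: positionally encoded coordinates and view directions
            h = encoded_xyz
            for layer in self.linears:
                h = torch.nn.functional.relu(layer(h))
            # ReLU keeps density non-negative; the feature vector feeds the color branch
            density = torch.nn.functional.relu(self.density_out(h))
            feature = self.feature_out(h)
            # Sigmoid keeps the view-dependent color in [0, 1]
            rgb = torch.sigmoid(self.color_mlp(torch.cat([feature, encoded_dir], dim=-1)))
            return density, rgb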

4. NeRF Extras 10 + 10 Extra Credit

4.1 View Dependence

Implementation of view-dependent NeRF with materials scene results.
View-Dependent NeRF Results on Materials Scene (Standard)

View-Dependent NeRF Results on Materials Scene (High Res)

Analysis: I observed a clear trade-off between view-dependent rendering quality and generalization. The view-dependent model excelled at capturing realistic specular highlights and fine detail from training viewpoints, but this specialization sometimes reduced robustness for novel views, where the model had to interpolate between learned appearances. In contrast, the view-independent model generalized more reliably across viewpoints but lacked high-frequency detail and material accuracy. The right balance depends on the application: view dependence for rendering quality near training viewpoints versus view independence for robust novel view synthesis.

4.2 Coarse/Fine Sampling

Implementation of coarse/fine sampling strategy for NeRF.
Coarse/Fine NeRF Results

Analysis: The coarse-to-fine approach trades speed for a large gain in quality. Regular NeRF renders faster, but it often misses fine details and produces noisier results in complex regions. Hierarchical sampling adds roughly 1.5x rendering time due to the dual network evaluations, but it delivers much sharper geometry and cleaner specular details by concentrating samples where they matter most. This is particularly beneficial for scenes with thin structures and reflective surfaces, where standard NeRF's uniform sampling wastes computation on empty space.
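
The core of the fine pass is inverse-transform sampling from the coarse weights; a hedged sketch follows (shapes and names are assumed, and in practice the fine samples are sorted and merged with the coarse ones before the second network evaluation).

    import torch

    def sample_pdf(bins, weights, n_fine, eps=1e-5):
        # Draw fine samples from the piecewise-constant PDF defined by the coarse weights.
        # bins: (n_rays, n_bins + 1) depth bin edges, weights: (n_rays, n_bins)
        pdf = (weights + eps) / (weights + eps).sum(dim=-1, keepdim=True)
        cdf = torch.cumsum(pdf, dim=-1)
        cdf = torch.cat([torch.zeros_like(cdf[..., :1]), cdf], dim=-1)
        # Uniform samples, mapped through the inverse CDF
        u = torch.rand(*weights.shape[:-1], n_fine)
        idx = torch.searchsorted(cdf, u, right=True).clamp(1, cdf.shape[-1] - 1)
        cdf_lo = torch.gather(cdf, -1, idx - 1)
        cdf_hi = torch.gather(cdf, -1, idx)
        bin_lo = torch.gather(bins, -1, idx - 1)
        bin_hi = torch.gather(bins, -1, idx)
        t = (u - cdf_lo) / (cdf_hi - cdf_lo + eps)
        return bin_lo + t * (bin_hi - bin_lo)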

B. Neural Surface Rendering 50 points

5. Sphere Tracing 10 points

Implementation of sphere tracing for rendering an SDF torus.
Sphere Tracing Torus Rendering

Write-up: I implemented sphere tracing by marching rays from the camera origin and iteratively stepping forward by the SDF value at each point. At each iteration, I evaluated the implicit function to get the distance to the nearest surface and then advanced along the ray direction by exactly that distance. A ray was considered converged once the SDF value fell below a threshold (0.001); because each step covers at most the distance to the nearest surface, the march never overshoots it. To keep memory manageable for large scenes, I processed rays in chunks and terminated converged rays early. The final hit points were then colored using the implicit function's color prediction.
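
A condensed sketch of that loop (chunking and per-ray early termination omitted; names are placeholders, and the implicit function is assumed to map (N, 3) points to (N, 1) signed distances).

    import torch

    def sphere_trace(implicit_fn, origins, directions, n_iters=64, eps=1e-3):
        # March each ray forward by the SDF value; converged rays stop advancing
        t = torch.zeros(origins.shape[0], 1)
        for _ in range(n_iters):
            points = origins + t * directions
            dist = implicit_fn(points)              # (n_rays, 1) signed distances
            t = torch.where(dist > eps, t + dist, t)
        points = origins + t * directions
        return points, (implicit_fn(points) <= eps)  # hit points and convergence mask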

6. Optimizing a Neural SDF 15 points

Implementation of MLP for neural SDF with eikonal regularization.
Input Point Cloud

Optimized Neural SDF

Write-up: I implemented a neural SDF using an MLP with skip connections and ReLU activations. The network takes 3D coordinates as input and outputs signed distance values. To enforce the SDF property, I used eikonal regularization that constrains the gradient norm to be 1 almost everywhere, ensuring valid distance fields. For hyperparameters, I used 6 hidden layers with 128 neurons and tuned the eikonal weight to 0.1 for stable optimization.
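
The eikonal term is computed with autograd on sampled points; a minimal sketch, assuming the model maps (N, 3) points to (N, 1) signed distances.

    import torch

    def eikonal_loss(model, points):
        # Penalize deviation of the SDF gradient norm from 1 (eikonal regularization)
        points = points.clone().requires_grad_(True)
        sdf = model(points)
        grad = torch.autograd.grad(sdf, points, grad_outputs=torch.ones_like(sdf),
                                   create_graph=True)[0]
        return ((grad.norm(dim=-1) - 1.0) ** 2).mean()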

7. VolSDF 15 points

Implementation of VolSDF with SDF to density conversion and color prediction.
VolSDF Geometry

VolSDF Color Rendering

Analysis Questions

1. Alpha scales density values while beta controls transition sharpness; high beta creates smooth, blurry surfaces, whereas low beta produces sharp, precise surfaces.

2. Training is easier with high beta because its wider density transition provides more stable gradients throughout the optimization.

3. A more accurate surface is learned with low beta because it concentrates density at the surface boundary, eliminating geometric blur.
Write-up: I selected a low beta (0.05) to achieve sharp surface reconstruction, paired with an eikonal weight of 0.02 to maintain SDF validity and a moderate inter weight (0.1) to balance the termination loss without over-constraining the optimization. This combination enabled detailed geometry recovery while maintaining training stability on the lego bulldozer model.
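
The conversion itself uses the VolSDF density, alpha times the Laplace CDF of the negated SDF with scale beta. A minimal sketch (clamping for numerical safety omitted):

    import torch

    def sdf_to_density(sdf, alpha, beta):
        # alpha scales the overall density; beta controls how sharply it falls off at the surface
        return alpha * torch.where(
            sdf > 0,
            0.5 * torch.exp(-sdf / beta),
            1.0 - 0.5 * torch.exp(sdf / beta),
        )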

8. Neural Surface Extras

8.3 Alternate SDF-to-Density

Implementation and comparison of alternate SDF to density conversion methods.
Alternate SDF to Density Conversion Results
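
As a hedged example of one common alternative (illustrative only, and not necessarily the exact variant compared here), a NeuS-style logistic density peaks at the zero level set and uses a single sharpness parameter s:

    import torch

    def sdf_to_density_logistic(sdf, s):
        # Logistic density centered on the surface: s * e^{-s*sdf} / (1 + e^{-s*sdf})^2,
        # written with sigmoids for numerical stability; larger s gives a sharper surface
        return s * torch.sigmoid(s * sdf) * torch.sigmoid(-s * sdf)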