We are the NeRFs

We are the NeRFs!
We are the NeRFs, we are the NeRFs!
We are the NeRFs.
We model scenes implicitly,
we make new views for you to see.
We are the NeRFs.

A NeRF is a neural radiance field,
new views of a scene can be revealed.
From images, and each camera's pose,
train a multi-layer perceptron so it knows
the radiance from each location
and density information.
Let's sort of explain it in reverse
and talk about rendering first.
You don't need a grid or mesh or surface.
The way to visualise a NeRF is
volume rendering. Through each pixel's place
project a ray from the camera centre into space.
March along the ray, sample it in chunks, and
pass to the rendering function:
each chunk's radiance and density
from the MLP, for the pixel's RGB.
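
(For the technically curious, the rendering verse above in code: a minimal NumPy sketch, where the function and variable names are mine and a real NeRF would query the MLP for rgb and sigma at each sample rather than take them as arguments.)

```python
import numpy as np

def render_ray(rgb, sigma, deltas):
    """Composite one ray's samples into a pixel colour.

    rgb:    (N, 3) radiance at each sample along the ray (from the MLP)
    sigma:  (N,)   volume density at each sample (from the MLP)
    deltas: (N,)   lengths of the "chunks" between adjacent samples
    """
    # Opacity of each chunk: how much of the light it stops.
    alpha = 1.0 - np.exp(-sigma * deltas)
    # Transmittance: fraction of light surviving to reach each chunk.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    # Weighted sum of radiances gives the pixel's RGB.
    return (weights[:, None] * rgb).sum(axis=0)
```

(Rendering a whole image is just this, once per pixel's ray.)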

We are the NeRFs!
We are the NeRFs, we are the NeRFs!
We are the NeRFs.
Radiance is apparent light emission,
density controls its transmission
in a NeRF.

To render more realistic views,
your lighting model shouldn't be totally diffuse.
To handle scenes with specular reflectance,
use ray directions. That's view dependence.
The sum of the squares of (the rendered RGBs
take the ground truth images) is the loss the MLP
backpropagates, and you may appreciate
this is trivial to differentiate.
For greater resolution, make training chunks random.
And a tip for speed: train two NeRFs in tandem:
coarse and fine levels, and sample stratified,
the coarse NeRF's density's used as a guide
to occupancy. The fine samples are biased
toward points on the ray where density is highest.
And if you wanna see more detail in your scene,
map x and d to higher frequencies.
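
(That last couplet and the sum-of-squares loss, sketched in NumPy. Names are mine; L = 10 for positions x and L = 4 for directions d are the values from the original paper.)

```python
import numpy as np

def positional_encoding(v, num_freqs):
    """Map each coordinate to sines/cosines at frequencies 2^0 .. 2^(L-1)."""
    freqs = 2.0 ** np.arange(num_freqs)    # 1, 2, 4, 8, ...
    angles = np.pi * v[..., None] * freqs  # shape (..., dim, L)
    enc = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return enc.reshape(*v.shape[:-1], -1)  # shape (..., dim * 2L)

def loss(rendered, ground_truth):
    """Sum of squared differences between rendered and true pixels."""
    return np.sum((rendered - ground_truth) ** 2)
```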

We are the NeRFs!
We are the NeRFs, we are the NeRFs!
We are the NeRFs.
If your positions aren't encoded,
high frequencies will be eroded
in the NeRF.
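
(And the coarse-to-fine trick from the last verse: treat the coarse NeRF's compositing weights as a histogram over depth and draw the fine samples by inverting its CDF. Another NumPy sketch, names mine.)

```python
import numpy as np

def sample_fine(bin_edges, coarse_weights, num_fine, seed=0):
    """Place fine samples where the coarse pass saw the most density.

    bin_edges:      (N+1,) depths bounding the coarse stratified bins
    coarse_weights: (N,)   compositing weights from the coarse pass
    """
    rng = np.random.default_rng(seed)
    # Normalise the coarse weights into a PDF over the depth bins.
    pdf = coarse_weights / (coarse_weights.sum() + 1e-8)
    cdf = np.concatenate([[0.0], np.cumsum(pdf)])
    # Inverse-transform sampling: uniform draws mapped through the CDF.
    u = rng.uniform(size=num_fine)
    return np.sort(np.interp(u, cdf, bin_edges))
```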

A NeRF is a neural radiance field,
a NeRF is a neural radiance field,
a NeRF is a neural radiance field is a NeRF.

A NeRF is a neural radiance field,
and this NeRF has a wooden keel,
and this NeRF is a healthy meal is a NeRF.

We are the NeRFs!
We are the NeRFs, we are the NeRFs!
We are the NeRFs.
We've only talked about NeRF vanilla,
nowadays there's a whole flotilla
of new NeRFs.

Sampling one ray per pixel can look blocky and historical.
Instead sample frustums, exact or conical.
Make unbounded scenes effectively small
by contracting coordinates to inside a ball.
To disentangle object and lighting interference
separate static and transient appearance.
And for greater speed, use many tiny MLPs
or use none and model radiance harmonically.
If your camera poses can't be trusted,
train the NeRF while they're bundle adjusted.
When inputs are few, seed the NeRF with priors,
or relight, interpolate motion, or deform as required.
Generate NeRFs through a text prompt construction
or edit them by giving instructions.
With the trained objects, export them as a mesh
or use them directly in effects.
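
(The "contracting coordinates to inside a ball" line describes mappings in the style of Mip-NeRF 360; below is a sketch under that assumption, not any codebase's exact implementation.)

```python
import numpy as np

def contract(x):
    """Squash an unbounded 3D point into a ball of radius 2.

    Inside the unit ball, points are left alone; outside, they are
    pulled in so that infinity lands on the radius-2 sphere.
    """
    n = np.linalg.norm(x)
    return x if n <= 1.0 else (2.0 - 1.0 / n) * (x / n)
```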

We are the NeRFs!
We are the NeRFs, we are the NeRFs!
We are the NeRFs.
An MLP learns density
and view dependent RGB.
We make new views for you to see!
We are the NeRFs.

You may also be interested in The Fundamental Matrix Song and The RANSAC Song.

Yale Song - guitar
Daniel Wedge - vocals, violin, lyrics, video.

Thanks to Cyrus Vachha for the AR VFX: see his video Creating VFX with NeRFs - Nerfstudio Blender Add-On Tutorial

NeRFs rendered from datasets by Mildenhall et al. (CC BY) available at https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1

Original Blender models for the synthetic scenes by:
Ship by Greg Zaal and Chris Kuhn (CC BY-SA): https://www.blendswap.com/blend/8167
Drums by bryanajones (CC BY): https://www.blendswap.com/blend/13383
Hotdog by erickfree (CC0): https://www.blendswap.com/blend/23962
Microphone by UP3D (CC0): https://www.blendswap.com/blend/23295
Ficus by Herberhold (CC0): https://www.blendswap.com/blend/23125

Horns and T-Rex captures by Mildenhall et al. (CC BY): https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1

Licence: Creative Commons Attribution-ShareAlike (CC BY-SA) 4.0
So feel free to play this in lectures etc. (though I'd be interested to hear from you if you do!)

Email: fmatrix at danielwedge dot com
29 January 2024
