EVER: Exact Volumetric Ellipsoid Rendering for Real-time View Synthesis



Alexander Mai, Peter Hedman, George Kopanas, Dor Verbin, David Futschik, Qiangeng Xu, Falko Kuester, Jonathan T. Barron, Yinda Zhang


Figure 1: An overview of the quality benefits of our EVER technique. Left: On the Zip-NeRF dataset, our model produces sharper and more accurate renderings than 3DGS and successor splatting-based techniques. Middle: Because our method correctly blends primitive colors according to the physics of volume rendering, it produces fewer “foggy” artifacts than splatting. Right: Our method correctly blends primitive colors, which is not possible with splatting regardless of how primitives are sorted (globally or per-ray).

Abstract

We present Exact Volumetric Ellipsoid Rendering (EVER), a method for real-time differentiable emission-only volume rendering. Unlike the recent rasterization-based approach of 3D Gaussian Splatting (3DGS), our primitive-based representation allows for exact volume rendering rather than alpha compositing 3D Gaussian billboards. As such, unlike 3DGS, our formulation does not suffer from popping artifacts or view-dependent density, yet it still achieves frame rates of $\sim\!30$ FPS at 720p on an NVIDIA RTX 4090. Because our approach is built upon ray tracing, it supports rendering techniques such as defocus blur and camera distortion (e.g., from fisheye cameras), which are difficult to achieve with rasterization. We show that our method achieves higher accuracy and fewer blending issues than 3DGS and other subsequent works, especially on the challenging large-scale scenes from the Zip-NeRF dataset, where it achieves state-of-the-art results among real-time techniques.
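For context, the two rendering models being contrasted can be written explicitly (standard volume rendering notation, not copied from the paper). Exact emission-only volume rendering computes the color of a ray as

$$C = \int_0^\infty c(t)\,\sigma(t)\,T(t)\,dt, \qquad T(t) = \exp\!\left(-\int_0^t \sigma(s)\,ds\right),$$

where $\sigma$ is density and $c$ is emitted color, whereas splatting approximates this integral by alpha-compositing sorted primitives:

$$C \approx \sum_{i=1}^{N} c_i\,\alpha_i \prod_{j=1}^{i-1} (1 - \alpha_j).$$

Because our primitives have constant density, $\sigma(t)$ is piecewise constant along each ray, so the integral admits a closed-form solution (see Figure 3 below).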

Method

Figure 2: (a) Here we show a toy “flatland” scene containing two primitives (one red, one blue) with a camera orbiting them, viewed from above. We render this orbit using three different techniques, where each camera position yields a one-dimensional “image” (a scanline); stacking these scanlines vertically produces the epipolar plane image (EPI) visualizations shown here. (b, c) The approximations made by splatting-based techniques result in improper blending and discontinuities, which are visible as horizontal lines across the EPI. In contrast, (d) our method’s exact rendering yields a smooth EPI, with bands of purple from color blending.
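As a rough illustration of how such an EPI is assembled, here is a minimal Python sketch (our own illustration, not code from the paper; `render_scanline` is a hypothetical renderer that produces the 1D image for a camera at a given orbit angle):

```python
import numpy as np

def epipolar_plane_image(render_scanline, num_views=256):
    """Stack 1D renderings from an orbiting camera into an EPI.

    render_scanline: hypothetical callable mapping an orbit angle (radians)
        to a (width, 3) RGB scanline rendered from that camera position.
    Returns a (num_views, width, 3) array with one row per camera position.
    """
    rows = [render_scanline(2.0 * np.pi * k / num_views)
            for k in range(num_views)]
    return np.stack(rows, axis=0)
```

Because adjacent rows correspond to nearby camera positions, any popping changes a scanline abruptly between rows and appears as the horizontal lines in panels (b) and (c); exact rendering varies continuously down the image, as in panel (d).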

Figure 3: A visualization of our rendering procedure. Top: We cast a ray through a field of constant-density ellipsoids and compute each ray-ellipsoid intersection distance to get the endpoints of each step function. When the ray enters a primitive, the density along the ray steps up; when it exits, the density steps back down by the same amount. Bottom: This lets us analytically integrate the volume rendering equation through the field.
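To make the integration concrete, here is a minimal NumPy sketch of this procedure for a single ray (a simplified illustration under our own naming and array-layout assumptions, not the paper's real-time implementation). It sorts the entry/exit events, holds the density constant between consecutive events, and applies the closed-form solution of the volume rendering integral on each interval:

```python
import numpy as np

def render_ray(entries, exits, densities, colors):
    """Exactly integrate emission-only volume rendering along one ray.

    entries, exits: (N,) ray distances at which each constant-density
        primitive is entered / exited (from ray-ellipsoid intersections).
    densities: (N,) per-primitive densities.
    colors: (N, 3) per-primitive RGB emission.
    """
    # Density steps up by `density` at each entry and down at each exit.
    t = np.concatenate([entries, exits])
    d_sigma = np.concatenate([densities, -densities])
    # Track density-weighted color so overlapping primitives blend.
    d_color = np.concatenate([densities[:, None] * colors,
                              -densities[:, None] * colors])
    order = np.argsort(t)
    t, d_sigma, d_color = t[order], d_sigma[order], d_color[order]

    color = np.zeros(3)
    transmittance = 1.0
    sigma = 0.0                   # constant density of the current interval
    weighted_color = np.zeros(3)  # sum of sigma_i * c_i over active primitives
    for i in range(len(t) - 1):
        sigma += d_sigma[i]
        weighted_color += d_color[i]
        if sigma > 1e-9:
            # Closed-form volume rendering over a constant-density interval.
            alpha = 1.0 - np.exp(-sigma * (t[i + 1] - t[i]))
            color += transmittance * alpha * (weighted_color / sigma)
            transmittance *= 1.0 - alpha
    return color
```

Within each interval, the emitted color is the density-weighted average of all overlapping primitives, which is the physically correct blending that produces the purple bands in Figure 2(d).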

Qualitative Examples

EPI Videos

Extra Datasets

US National Park Service 2023: Vought F4U Corsair Plane Wreck - Midway Island - Photogrammetry - Terrestrial. Collected by the US National Park Service Submerged Resources Center. Distributed by Open Heritage 3D. https://doi.org/10.26301/8tt8-9f41

BibTeX