# Speeding Up the Learning of 3D Gaussians with Much Shorter Gaussian Lists

CVPR 2026

Machine Perception Lab, Wayne State University

Paper (arXiv 2603.09277) | Project Page
- Method Overview
- Results
- Installation
- Dataset Preparation
- Evaluation
- Code Changes
- Acknowledgements
- BibTeX
## Method Overview

Overview of our method and effects. (a) The Gaussian list of 3DGS. (b) Distribution of Gaussian list length showing our method produces significantly shorter lists. (c) Gaussian list reduction after applying scale reset. (d, e) Scale and opacity distributions comparing 3DGS and 3DGS with scale reset, showing scale reset produces smaller Gaussians with higher opacities. (f) Gaussian list reduction after applying entropy regularization. (g, h) Scale and opacity distributions comparing 3DGS and 3DGS with entropy regularization, demonstrating the entropy constraint produces smaller Gaussians and more polarized opacities. "3DGS" results are produced with LiteGS.
## Results

Results are obtained on Ubuntu 24.04.2 LTS with an NVIDIA GeForce RTX 5090 D (32 GB VRAM), CUDA 12.8, Python 3.10.16, and PyTorch 2.8.0.
## Installation

This repo is built upon LiteGS (stable branch). For installation, please follow the instructions in 3D Gaussian Splatting and LiteGS (stable). Prerequisites include a CUDA-capable GPU with the CUDA toolkit installed, and a conda environment with Python and PyTorch. After that, the submodules need to be compiled and installed.
```shell
git clone --recursive git@github.com:MachinePerceptionLab/ShorterSplatting.git
```

Then install the submodules:

```shell
cd litegs/submodules/fused_ssim && pip install . && cd -
cd litegs/submodules/simple-knn && pip install . && cd -
cd litegs/submodules/gaussian_raster && pip install . && cd -
cd litegs/submodules/lanczos-resampling && pip install . && cd -
```

## Dataset Preparation

We evaluate on Mip-NeRF 360, Tanks and Temples, and Deep Blending. Please follow their official instructions to download and organize the data.
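For orientation, the scripts expect the COLMAP-style layout used by the original 3DGS codebase. The tree below is an illustrative sketch (scene and folder names should be verified against each dataset's release):

```
mipnerf360/
└── garden/
    ├── images/        # full-resolution input images
    ├── images_2/      # pre-downsampled copies (Mip-NeRF 360)
    ├── images_4/
    ├── images_8/
    └── sparse/0/      # COLMAP camera poses and sparse point cloud
```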
## Evaluation

To reproduce the paper's quantitative results:

```shell
python ./full_eval.py --save_images \
    --mipnerf360 /path/to/mipnerf360 \
    --tanksandtemples /path/to/tanksandtemples \
    --deepblending /path/to/deepblending \
    --enable_dash --lambda_entropy 0.015 --scale_reset_factor 0.2
```

Full evaluation with baseline LiteGS:
```shell
python ./full_eval.py --save_images \
    --mipnerf360 /path/to/mipnerf360 \
    --tanksandtemples /path/to/tanksandtemples \
    --deepblending /path/to/deepblending
```

Print and compare stats:
```shell
python litegs/spreading/misc/print_stats.py --no_training \
    output-m360-litegs \
    output-m360-litegs+dash+reset.0.2+entropy.0.015
python litegs/spreading/misc/print_stats.py --no_training \
    output-db-litegs \
    output-db-litegs+dash+reset.0.2+entropy.0.015
python litegs/spreading/misc/print_stats.py --no_training \
    output-tat-litegs \
    output-tat-litegs+dash+reset.0.2+entropy.0.015
```

## Code Changes

This code is built upon LiteGS (stable branch). To see the changes made on top of LiteGS stable, run:
```shell
git diff --compact-summary litegs_stable HEAD -- . ':(exclude)*lanczos*' ':(exclude).vscode' ':(exclude)assets' ':(exclude)doc_img'
```

Key changes include:
- Scale reset — scheduling policy and reset operation
- Entropy regularization — scheduling policy, loss term, and custom CUDA backward pass
- DashGaussian integration — incorporating DashGaussian's training-acceleration scheduling
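For intuition, the two regularizers listed above can be sketched as follows. This is an illustrative re-implementation in plain PyTorch, not the repo's scheduled, CUDA-fused code; the function names, the scale threshold, and the exact loss form are assumptions based on the described effects (smaller Gaussians, more polarized opacities):

```python
import torch

def opacity_entropy_loss(opacity: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Mean binary entropy of per-Gaussian opacities. Minimizing this term
    pushes opacities toward 0 or 1, i.e. the polarization shown in panel (h)."""
    o = opacity.clamp(eps, 1.0 - eps)
    return -(o * o.log() + (1.0 - o) * (1.0 - o).log()).mean()

def reset_large_scales(scale: torch.Tensor, threshold: float,
                       factor: float = 0.2) -> torch.Tensor:
    """Shrink every Gaussian whose largest axis exceeds `threshold` by `factor`
    (cf. --scale_reset_factor 0.2); returns a new (N, 3) scale tensor."""
    out = scale.clone()
    out[out.max(dim=-1).values > threshold] *= factor
    return out

# Polarized opacities incur a lower entropy penalty than ambiguous ones.
assert opacity_entropy_loss(torch.tensor([0.01, 0.99])) \
     < opacity_entropy_loss(torch.tensor([0.5, 0.5]))
```

In the actual code the entropy term is weighted by `--lambda_entropy` and its backward pass is implemented in CUDA; the sketch only conveys the shape of the objective.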
## Acknowledgements

This project was partially supported by an NVIDIA academic award and a Richard Barber research award.
We sincerely thank the authors of LiteGS, DashGaussian, and 3D Gaussian Splatting for their excellent open-source work, which forms the foundation of this project.
## BibTeX

```bibtex
@InProceedings{Liu2026shortersplatting,
    title     = {Speeding Up the Learning of 3D Gaussians with Much Shorter Gaussian Lists},
    author    = {Liu, Jiaqi and Han, Zhizhong},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
    year      = {2026}
}
```

- 2026-03-18: First release.


