Open-Set LiDAR Panoptic Segmentation Guided by Uncertainty-Aware Learning

This repository contains the PyTorch re-implementation of our IROS'2025 paper Open-Set LiDAR Panoptic Segmentation Guided by Uncertainty-Aware Learning. The repository builds on MMDetection3D.

1. What is Open-Set LiDAR Panoptic Segmentation?

Standard LiDAR panoptic segmentation models operate under a closed-set assumption, meaning they are trained on a fixed set of classes.
This poses a serious limitation for real-world robotics and autonomous driving, where models may confidently misclassify novel objects as a known class, creating significant safety risks.

Open-Set LiDAR Panoptic Segmentation (OSLPS) addresses this challenge with two key goals:

  1. Correctly segment all known classes, including both “stuff” (e.g., roads) and “things” (e.g., cars).
  2. Reliably detect and cluster unknown or novel objects that the model has never seen during training.

Teaser


2. Our Approach: ULOPS

ULOPS (Uncertainty-Guided Open-Set LiDAR Panoptic Segmentation) takes a principled approach to the OSLPS problem: it learns to distinguish known from unknown regions using supervised predictive uncertainty.

Our method is built on two core contributions:

1. Principled Uncertainty

We use Dirichlet-based evidential learning to generate robust uncertainty estimates for each point in the LiDAR point cloud.
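
As a rough illustration of how such estimates are commonly computed (a minimal PyTorch sketch, not the exact ULOPS implementation): an evidential head maps per-point logits to non-negative evidence, forms Dirichlet concentration parameters, and reads out uncertainty from the total evidence.

import torch
import torch.nn.functional as F

def dirichlet_uncertainty(logits):
    # logits: (N, K) per-point scores over the K known classes.
    evidence = F.softplus(logits)             # non-negative evidence (one common choice)
    alpha = evidence + 1.0                    # Dirichlet concentration parameters
    strength = alpha.sum(dim=-1, keepdim=True)
    probs = alpha / strength                  # expected class probabilities
    uncertainty = logits.shape[-1] / strength.squeeze(-1)  # vacuity: high when total evidence is low
    return probs, uncertainty

Points that accumulate little evidence for any known class receive high uncertainty, which is the per-point signal used to separate known from unknown regions.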

2. Uncertainty-Driven Losses

We introduce three novel loss functions:

  • Uniform Evidence Loss
  • Adaptive Separation Loss
  • Contrastive Uncertainty Loss

These losses leverage a small set of “unknown” examples during training to actively teach the model to draw a sharp boundary between known and unknown regions.
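
As a purely illustrative sketch of this idea (not the loss definitions from the paper), one common way to enforce high uncertainty on auxiliary “unknown” points is a KL penalty that pulls their Dirichlet toward the uniform Dirichlet, i.e., toward zero evidence:

import math
import torch

def uniform_dirichlet_penalty(alpha, unknown_mask):
    # alpha: (N, K) Dirichlet parameters; unknown_mask: (N,) bool marking auxiliary unknown points.
    # KL( Dir(alpha) || Dir(1) ) is minimized when alpha = 1, i.e., when the model assigns
    # no evidence to any known class on these points (hypothetical stand-in, not the paper's losses).
    a = alpha[unknown_mask]
    if a.numel() == 0:
        return alpha.new_zeros(())
    k = a.shape[-1]
    s = a.sum(dim=-1)
    kl = (torch.lgamma(s) - math.lgamma(k) - torch.lgamma(a).sum(dim=-1)
          + ((a - 1.0) * (torch.digamma(a) - torch.digamma(s).unsqueeze(-1))).sum(dim=-1))
    return kl.mean()

Please refer to the paper for the precise formulations of the three ULOPS losses.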

Overview of ULOPS Architecture

If you find this code useful for your research, we kindly ask you to consider citing our paper:

@article{mohan2025open,
  title={Open-Set LiDAR Panoptic Segmentation Guided by Uncertainty-Aware Learning},
  shorttitle={ULOPS},
  author={Mohan, Rohit and Hindel, Julia and Drews, Florian and Gl{\"a}ser, Claudius and Cattaneo, Daniele and Valada, Abhinav},
  journal={arXiv preprint arXiv:2506.13265},
  year={2025}
}

Project Status

Work in Progress
This version provides the foundational codebase for ULOPS, including the main framework structure and placeholders for upcoming modules.
Expect incremental updates with full training, evaluation, and benchmarking support.

Usage

# Training
bash tools/dist_train.sh /path/to/your/config 8

# Inference
bash tools/dist_test.sh /path/to/your/config /path/to/your/checkpoint.pth 8 --eval bbox
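
For single-GPU runs, MMDetection3D-based codebases typically also ship a non-distributed entry point; assuming this repository follows the same convention, a command along these lines should work:

# Single-GPU training (standard MMDetection3D-style script; the exact name may differ in this repo)
python tools/train.py /path/to/your/config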

Pre-Trained Models

Pre-trained models can be found in the model zoo.

Acknowledgements

We have used utility functions from other open-source projects. We especially thank the authors of:

Contacts
