Starred repositories
COLMAP - Structure-from-Motion and Multi-View Stereo
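For context, COLMAP ships Python bindings (pycolmap) alongside its CLI. A minimal sparse-reconstruction sketch, assuming a recent pycolmap release — paths are placeholders and exact function names may vary between versions:

```python
# Sparse SfM with COLMAP's pycolmap bindings (pip install pycolmap).
# Paths below are placeholders; API follows recent pycolmap releases.
import os
import pycolmap

database_path = "workspace/database.db"
image_path = "workspace/images"
output_path = "workspace/sparse"
os.makedirs(output_path, exist_ok=True)

pycolmap.extract_features(database_path, image_path)  # detect SIFT features
pycolmap.match_exhaustive(database_path)              # match all image pairs
reconstructions = pycolmap.incremental_mapping(
    database_path, image_path, output_path            # incremental SfM
)
reconstructions[0].write(output_path)                 # save the first model
```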
This repository contains a multi-fisheye camera SLAM system. The underlying SLAM system is based on ORB-SLAM.
MonSter++: A Unified Geometric Foundation Model for Stereo and Multi-View Depth Estimation via the Unleashing of Monodepth Priors
OKVIS2-X: Open Keyframe-based Visual-Inertial SLAM Configurable with Dense Depth or LiDAR, and GNSS
ViPE: Video Pose Engine for Geometric 3D Perception
This repository provides a synchronized stereo matching pipeline using OAK cameras, generating RGB-D images with disparity-based depth and integrated IMU data. The output is fully compatible with R…
TAPIP3D: Tracking Any Point in Persistent 3D Geometry
A precise, low-drift visual-inertial-leg odometry system for legged robots
Visual-Inertial-Leg Odometry for Legged Robots
[ICRA 2025 Best Paper] MAC-VO: Metrics-aware Covariance for Learning-based Stereo Visual Odometry
📌 PINGS: Gaussian Splatting Meets Distance Fields within a Point-Based Implicit Neural Map [RSS '25]
[CVPR 2025] MAGiC-SLAM: Multi-Agent Gaussian Globally Consistent SLAM
[CVPR 2025] A unified framework for Scene Coordinate Regression-based visual localization
A large-scale non-linear optimization library
LightGlue: Local Feature Matching at Light Speed (ICCV 2023)
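LightGlue's README documents a compact PyTorch API; a minimal matching sketch along those lines — image paths are placeholders, and a CUDA device is assumed:

```python
# Feature matching with LightGlue + SuperPoint, following the repo's README.
# Image paths are placeholders; drop .cuda() to run on CPU.
from lightglue import LightGlue, SuperPoint
from lightglue.utils import load_image, rbd

extractor = SuperPoint(max_num_keypoints=2048).eval().cuda()  # local features
matcher = LightGlue(features="superpoint").eval().cuda()      # learned matcher

image0 = load_image("path/to/image0.jpg").cuda()
image1 = load_image("path/to/image1.jpg").cuda()

feats0 = extractor.extract(image0)
feats1 = extractor.extract(image1)
matches01 = matcher({"image0": feats0, "image1": feats1})
feats0, feats1, matches01 = [rbd(x) for x in (feats0, feats1, matches01)]

matches = matches01["matches"]                # (K, 2) index pairs
points0 = feats0["keypoints"][matches[:, 0]]  # matched keypoints in image 0
points1 = feats1["keypoints"][matches[:, 1]]  # matched keypoints in image 1
```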
Ground-Fusion: A Low-cost Ground SLAM System Robust to Corner Cases (ICRA 2024)
Repository to generate Situational Graphs (S-Graphs) in real time for robot pose and map optimization using 3D LiDAR data
AnyCalib: On-Manifold Learning for Model-Agnostic Single-View Camera Calibration (ICCV 2025)
Affordance-based Robot Manipulation with Flow Matching
Release repo for our SLAM Handbook
A Comprehensive Framework for Visual SLAM Systems and Datasets
[IEEE T-RO 2025] iKalibr: Multi-Sensor Calibration (Extrinsics & Time Offsets)
A comprehensive list of implicit representation and NeRF papers relating to the Robotics/RL domain, including papers, code, and related websites
[CVPR'24] Group Anything with Radiance Fields
Code for "Robot See Robot Do" presented at CoRL 2024!
Ultra-fast 3D reconstruction and novel view synthesis.