
IRCVLab/LiDAR_CAMERA_Calibration


LiDAR-Camera Calibration

The LiDAR-Camera calibration code is based on:

Zhou, Lipu, Zimo Li, and Michael Kaess.
"Automatic extrinsic calibration of a camera and a 3D LiDAR using line and plane correspondences."
2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018.

While the original method required extracting corner lines, which could lead to limited accuracy, this implementation modifies the approach to use only normal vectors and plane constraints for calibration, thus improving robustness and reliability.
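The plane-based idea can be sketched as follows: each checkerboard pose yields the target plane's normal and offset in both the camera and LiDAR frames; aligning the normals fixes the rotation, and the plane-offset constraints fix the translation. Below is a minimal numpy sketch of that idea (the function name and the closed-form solver are illustrative assumptions, not this repo's actual implementation, which may use a different optimization):

```python
import numpy as np

def calibrate_from_planes(normals_cam, normals_lidar, d_cam, d_lidar):
    """Estimate LiDAR-to-camera extrinsics (R, t) from plane correspondences.

    normals_cam / normals_lidar: (N, 3) unit normals of the calibration
    plane in each frame; d_cam / d_lidar: (N,) offsets with the plane
    convention n . x = d. Needs >= 3 non-parallel plane observations.
    (Illustrative helper, not part of this repository's API.)
    """
    # Rotation: align LiDAR normals to camera normals (Kabsch / SVD).
    H = normals_lidar.T @ normals_cam            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T

    # Translation: with x_cam = R x_lidar + t, each plane gives
    # n_cam . t = d_cam - d_lidar (since n_cam = R n_lidar).
    t, *_ = np.linalg.lstsq(normals_cam, d_cam - d_lidar, rcond=None)
    return R, t
```

With exact, noise-free plane observations this recovers the ground-truth pose; with real data, the least-squares steps average out per-plane noise, which is why multiple checkerboard poses are collected.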

Setup

1. Install Conda Environment

conda env create -f environment.yaml
conda activate calibration

After running the notebook and executing the "Save the data" section, the folder structure should be as follows:

⚠️ Important:
Make sure to detect the calibration plane only within the ROI (Region of Interest).
Plane extraction must be performed using the ROI, as it is specifically defined to isolate the calibration target from irrelevant LiDAR points.

data/
├── Image/*.png
├── PCD/*.pcd
└── RoIPCD/*.pcd
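If you need to produce the RoIPCD files yourself, cropping each raw scan to an axis-aligned box around the calibration target is one simple approach. A minimal numpy sketch (the function name and box bounds are illustrative, not this repo's API):

```python
import numpy as np

def crop_to_roi(points, roi_min, roi_max):
    """Keep only the LiDAR points inside an axis-aligned ROI box.

    points: (N, 3) array of x, y, z coordinates.
    roi_min / roi_max: (3,) opposite corners of the box.
    """
    mask = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    return points[mask]

# Hypothetical bounds: a box 1-4 m ahead of the sensor around the target.
# roi = crop_to_roi(points, np.array([1.0, -1.0, -0.5]), np.array([4.0, 1.0, 1.0]))
```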

Calibration

Run run_gui.py; the calibration GUI shown below should appear.

1. Intrinsic Calibration

Click the Run Intrinsic Calibration button.

After running the code, a file named configs/example.yaml will be created. This file follows the standard ROS camera_info format.

configs/*.yaml
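For reference, a camera_info-style YAML file generally looks like the following (the values below are placeholders for illustration, not output from this repo):

```yaml
image_width: 1280
image_height: 720
camera_name: camera
camera_matrix:
  rows: 3
  cols: 3
  data: [800.0, 0.0, 640.0, 0.0, 800.0, 360.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [-0.1, 0.05, 0.0, 0.0, 0.0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
projection_matrix:
  rows: 3
  cols: 4
  data: [800.0, 0.0, 640.0, 0.0, 0.0, 800.0, 360.0, 0.0, 0.0, 0.0, 1.0, 0.0]
```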

2. LiDAR-Camera Calibration

Next, click the LiDAR-Camera Calibration button. While holding the Shift key, click and drag the red LiDAR bounding box to adjust its position; you only need to roughly align the box with the checkerboard region.

Result

A sample result after a successful calibration is shown below. To verify the result by overlaying the projected point clouds onto the images, run the following:

jupyter notebook scripts/cam_lidar_visualize.ipynb

In the notebook, navigate to the section "Mapping the point clouds to image", and update the result_path variable to match your saved output directory:

result_path = 'results/28-04-2025-20-44-40'
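The overlay amounts to transforming LiDAR points into the camera frame with the estimated extrinsics and then applying the intrinsics. A minimal numpy sketch of that projection (function and variable names are illustrative, not the notebook's actual code):

```python
import numpy as np

def project_lidar_to_image(points, K, R, t):
    """Project (N, 3) LiDAR points into pixel coordinates.

    K: 3x3 camera intrinsic matrix; R, t: LiDAR-to-camera extrinsics.
    Returns (M, 2) pixel coordinates and the mask of points kept
    (only points in front of the camera are projected).
    """
    cam = points @ R.T + t          # transform into the camera frame
    in_front = cam[:, 2] > 0        # discard points behind the camera
    cam = cam[in_front]
    uv = cam @ K.T                  # apply intrinsics
    uv = uv[:, :2] / uv[:, 2:3]     # perspective divide
    return uv, in_front
```

The resulting pixel coordinates can then be drawn onto the image (e.g. colored by depth) to judge the calibration quality visually.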

Notes

  • We used a Velodyne LiDAR sensor for this calibration setup.
  • Ensure your LiDAR point clouds and camera images are time-synchronized and spatially aligned for optimal results.
