RealSense D455 camera intrinsic and extrinsic parameter calibration + IMU joint calibration

IMU calibration

<launch>

<node pkg="imu_utils" type="imu_an" name="imu_an" output="screen">

<!--imu_topic must match the IMU topic published by the camera driver (check with rostopic list)-->

<param name="imu_topic" type="string" value= "/camera/imu"/>

<!--imu_name can be set to anything-->

<param name="imu_name" type="string" value= "d435i"/>

<!--Storage path of calibration results-->

<param name="data_save_path" type="string" value= "$(find imu_utils)/data/"/>

<!--Data recording duration, in minutes-->

<param name="max_time_min" type="int" value= "120"/>

<!--Sampling frequency, i.e. the IMU frequency; check it with rostopic hz /camera/imu. Set to 400 here to match the rosbag play playback rate used below-->

<param name="max_cluster" type="int" value= "400"/>

</node>

</launch>
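For reference, a minimal sketch of the usual imu_utils workflow with this launch file (the bag name, the launch-file name d455_imu_calibration.launch, and the -r playback speed-up are assumptions; adjust them to your setup). The camera must stay completely stationary during the recording.

roslaunch realsense2_camera rs_camera.launch unite_imu_method:=linear_interpolation   # publishes /camera/imu
rosbag record -O imu_calibration.bag /camera/imu   # record the stationary IMU for about 2 hours (>= max_time_min)
roslaunch imu_utils d455_imu_calibration.launch    # the launch file above, saved under imu_utils/launch/
rosbag play -r 200 imu_calibration.bag             # then play the bag back at an accelerated rate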

Original link: https://blog.csdn.net/xiaoxiaoyikesu/article/details/105646064

Refer to a senior labmate's article series, "Calibration of internal and external parameters of the rs camera + imu" (parts 1-11)

Monocular calibration

Calibration procedure:

Keep the motion steady and move slowly

Tilt up and down three times, turn left and right three times, roll three times, translate up and down three times, translate left and right three times, translate forward and back three times, then move freely for a while

1. Generate the calibration target with the kalibr tool (if you need to generate one yourself):

kalibr_create_target_pdf --type 'apriltag' --nx 6 --ny 6 --tsize 0.08 --tspace 0.3

2. Download & print calibration board

https://github.com/ethz-asl/kalibr/wiki/downloads

https://github.com/ethz-asl/kalibr/wiki/calibration-targets (three target types are supported: checkerboard, circlegrid and aprilgrid; aprilgrid is used here)

target_type: 'aprilgrid' #gridtype

tagCols: 6 #number of apriltags

tagRows: 6 #number of apriltags

tagSize: 0.0244 #size of apriltag, edge to edge [m]

tagSpacing: 0.00286 #ratio of space between tags to tagSize

#example: tagSize=2m, spacing=0.5m --> tagSpacing=0.25[-]

3. Determine suitable distance

roslaunch realsense2_camera rs_camera.launch

rviz

In the Fixed Frame dropdown on the left, select camera_link

In the lower left corner click Add -> By topic -> /camera/color/image_raw -> Camera, then move the camera until you find a suitable distance at which the target is fully captured

Then close rviz

4. Reduce the camera frame rate

rosrun topic_tools throttle messages /camera/color/image_raw 4.0 /color

This throttles the camera images to 4 Hz and republishes them on a new topic, /color; step 5 records this /color topic.
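Before recording, you can optionally confirm the throttled rate on the new topic (it will only be approximately 4 Hz):

rostopic hz /color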

5. Record a ROS bag

rosbag record -O camd455i /color

Recording for about one minute is enough. While recording, move the camera in front of the calibration target; press Ctrl+C to stop, and camd455i.bag will appear in the current folder.

6. Calibrate with Kalibr (the previous steps run on the TX2; this step runs on a laptop)

In the newly created Kalibr ROS workspace, run source devel/setup.bash, then execute:

kalibr_calibrate_cameras --target aprilgrid.yaml --bag camd455i.bag --models pinhole-radtan --topics /color --show-extraction

pinhole-radtan specifies the camera model and the distortion model.

After calibration, you can find three files in the current directory, which are the results of calibration:

camchain-...camd435i.yaml

results-cam-...camd435i.txt

report-cam-...camd435i.pdf

When recording, do not change the viewing angle too much, otherwise initialization will fail and the optimization will diverge ([ERROR] Did not converge in maxIterations... restarting...). A good estimate must be obtained during initialization; images taken at overly large angles easily give a wrong initial value, and the subsequent optimization then fails.

Calibration parameters are:

cam0:

cam_overlaps: []

camera_model: pinhole

distortion_coeffs: [-0.13104112755856126, 0.020606715483751877, -0.002771690056311741,

0.001163213320750032]

distortion_model: radtan

intrinsics: [260.9278068132713, 263.94361004788766, 301.7212486187008, 199.01710004608793]

resolution: [640, 480]

rostopic: /color

IMU + monocular camera calibration

Official tutorial: https://github.com/ethz-asl/kalibr/wiki/camera-imu-calibration

After calibrating the camera and the IMU, you have the corresponding yaml files (the camera parameters and the IMU parameters respectively)

Calibration procedure:

Tilt up and down three times, turn left and right three times, roll three times, translate up and down three times, translate left and right three times, translate forward and back three times, then move freely for a while

1. Adjust the frame rate

The camera is throttled to 20 Hz and the IMU to 200 Hz, republished under the topic names /color and /imu respectively:

rosrun topic_tools throttle messages /camera/color/image_raw 20.0 /color

rosrun topic_tools throttle messages /camera/imu 200.0 /imu

2. Record a bag

First cd into the newly created Kalibr workspace directory.

Then run source devel/setup.bash

rosbag record -b 4096 -O dynamic /color /imu
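After stopping the recording, it may help to check the bag before calibrating; the message counts should be roughly consistent with 20 Hz for /color and 200 Hz for /imu over the recording duration:

rosbag info dynamic.bag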

3. Rewrite yaml file

Calibration requires three files, one is the camera calibration file, one is the IMU calibration file, and the other is the recorded data packet


The camera was also calibrated with Kalibr (camchain-...camd435i.yaml), so that camera yaml file can be used directly:

(1) yaml file for camera calibration

cam0:

cam_overlaps: []

camera_model: pinhole

distortion_coeffs: [-0.14627678771168612, 0.031132819617662677, -0.0016199154527738965,

-0.01257776985511912]

distortion_model: radtan

intrinsics: [259.05353479443266, 256.15264741602005, 290.7955146414971, 234.21114661849504]

resolution: [640, 480]

rostopic: /color

(2) New imu.yaml

Create a new imu.yaml file in the Kalibr workspace directory:

#Accelerometers

accelerometer_noise_density: 2.0477290485501922e-02 #Noise density (continuous-time)

accelerometer_random_walk: 4.2308969579290693e-04 #Bias random walk

#Gyroscopes

gyroscope_noise_density: 2.2488785808195085e-03 #Noise density (continuous-time)

gyroscope_random_walk: 1.5385085422768701e-05 #Bias random walk

rostopic: /imu #the IMU ROS topic

update_rate: 200.0 #Hz (for discretization of the values above)

The following IMU calibration parameters are already known (from the imu_utils result); fill them into the imu.yaml above accordingly:

Gyr:

avg-axis:

gyr_n: 2.2488785808195085e-03

gyr_w: 1.5385085422768701e-05

Acc:

avg-axis:

acc_n: 2.0477290485501922e-02

acc_w: 4.2308969579290693e-04
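In other words, the avg-axis values map into the imu.yaml fields as follows (a plain restatement of the numbers above):

# imu_utils avg-axis value  ->  kalibr imu.yaml field
#   acc_n  ->  accelerometer_noise_density
#   acc_w  ->  accelerometer_random_walk
#   gyr_n  ->  gyroscope_noise_density
#   gyr_w  ->  gyroscope_random_walk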

4. Calibrate with Kalibr

Run source devel/setup.bash in the Kalibr workspace directory.

Then execute:

kalibr_calibrate_imu_camera --target checkerboard.yaml --cam camd455i.yaml --imu imu.yaml --bag dynamic.bag --show-extraction

kalibr_calibrate_imu_camera \

--target aprilgrid.yaml \

--cam camchain-camd455i.yaml \

--imu imu.yaml \

--bag imu_cam.bag \

--show-extraction

Recording for about 2 minutes is enough; the generated file is named:

camchain-imucam-dynamic.yaml

T_cam_imu in this file is the extrinsic parameter we need.

Error:

ImportError: No module named scipy.optimize

Solution:

sudo apt-get install python-scipy

After this, you can wait patiently for the results...

Transformation (cam0):

-----------------------

T_ci: (imu0 to cam0):

[[ 0.99579733  0.0648917   0.06462771 -0.01794528]
 [-0.06212431  0.99710033 -0.04394886  0.01608132]
 [-0.06729222  0.03974921  0.9969412  -0.07784444]
 [ 0.          0.          0.          1.        ]]

T_ic: (cam0 to imu0):

[[ 0.99579733 -0.06212431 -0.06729222  0.01363058]
 [ 0.0648917   0.99710033  0.03974921 -0.01177593]
 [ 0.06462771 -0.04394886  0.9969412   0.07947285]
 [ 0.          0.          0.          1.        ]]
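(Note: T_ic is simply the inverse of T_ci, i.e. R_ic = R_ci^T and t_ic = -R_ci^T * t_ci, so the two matrices above carry the same extrinsic information.)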

Binocular calibration

Step 1: Download and print calibration board or purchase calibration board

Go to https://github.com/ethz-asl/kalibr/wiki/downloads and choose a target.

Download it, scale it to 40%, and print it on A4 paper.

The grid parameters of the original pdf are:

6 x 6 grid

Side length of the large squares: 5.5 cm

Side length of the small squares: 1.65 cm

Ratio of small-square to large-square side length: 0.3

The grid parameters after scaling are:

Side length of the large squares: 2.2 cm

Side length of the small squares: 0.66 cm

Ratio of small-square to large-square side length: 0.3

But these are only nominal values; the actual printed sizes must be measured.

Create a new aprilgrid.yaml; its format follows the yaml example above, with content as follows:

target_type: 'aprilgrid' #gridtype

tagCols: 6 #number of apriltags

tagRows: 6 #number of apriltags

tagSize: 0.0244 #size of apriltag, edge to edge [m]

tagSpacing: 0.00244 #ratio of space between tags to tagSize

#example: tagSize=2m, spacing=0.5m --> tagSpacing=0.25[-]

You must measure the side length of the large squares yourself; this is tagSize. Mine actually measured 0.024, so the tagSize in my file is 0.024; change it to your own measurement.

Step 2: turn off the structured light

With the structured-light emitter on (the default), the stereo images are covered with projected dots, which may affect the calibration, so turn it off for this step.

First start the camera:

roslaunch realsense2_camera rs_camera.launch

Open a new terminal and run

rosrun rqt_reconfigure rqt_reconfigure

In rqt_reconfigure open camera -> stereo_module and set emitter_enabled to off (0).
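If you prefer the command line, the same switch can usually be flipped with dynamic_reconfigure; the node and parameter names below are assumptions based on a typical realsense2_camera setup, so check rqt_reconfigure for the exact names on your driver version:

rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0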

Step 3: determine the appropriate location of realsense

Open a new terminal and run rviz

rviz

Then add the rgb topic and the two infrared (stereo) topics:

/camera/color/image_raw,

/camera/infra1/image_rect_raw

/camera/infra2/image_rect_raw

Then aim at the calibration target and move the RealSense around, making sure the target stays inside all three images the whole time. For the recording motion, see https://www.youtube.com/watch?v=puNXsnrYWTY&app=desktop

Step 4: reduce the camera frame rate (the official recommendation is 4 Hz; even though the actual frequency is not perfectly accurate, this does not affect the result)

Kalibr requires that the image frequency not be too high when processing calibration data, usually 4 Hz. The following commands change the topic frequency by republishing each original topic to a new topic at the new rate. In practice, infra1 corresponds to the left infrared camera.

 

rosrun topic_tools throttle messages /camera/color/image_raw 4.0 /color

rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 4.0 /infra_left

rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 4.0 /infra_right

Note: this throttling may leave the different cameras slightly out of sync. If that problem occurs, you can try skipping this step; not throttling only means longer processing time, and the topic names used in the later commands must then be changed accordingly.

Step 5: record a ROS bag

 

rosbag record -O multicameras_calibration /infra_left /infra_right /color

The three topics recorded here are the frequency-converted topics from the previous step.

Step 6: calibrate with Kalibr

First source the workspace environment

 

source devel/setup.bash

then

 

kalibr_calibrate_cameras \

--target aprilgrid.yaml \

--bag multicameras_calibration.bag \

--models pinhole-equi pinhole-equi pinhole-equi \

--topics /infra_left /infra_right /color \

--bag-from-to 10 80 \

--show-extraction

Where:

--target aprilgrid.yaml is the calibration target configuration file. Note that if you use a checkerboard instead, targetCols and targetRows are the numbers of inner corners, not the numbers of squares.

--bag multicameras_calibration.bag is the recorded data bag.

--models pinhole-equi pinhole-equi pinhole-equi gives the camera model and distortion model for each of the three cameras (see https://github.com/ethz-asl/kalibr/wiki/supported-models and choose as needed).

--topics /infra_left /infra_right /color are the image topics of the three cameras.

--bag-from-to 10 80 means only the data between 10 s and 80 s of the bag is processed.

--show-extraction displays the feature-extraction process.

All of these parameters can be adjusted as needed.

The following errors may occur:

1. Error: Cameras are not connected through mutual observations, please check the dataset or adjust the approx. sync. tolerance.

Solution: the cause is usually that the data from the cameras are not synchronized. Add --approx-sync 0.04 to the calibration command; the modified command is:

 

kalibr_calibrate_cameras \

--target aprilgrid.yaml \

--bag multicameras_calibration.bag \

--models pinhole-equi pinhole-equi pinhole-equi \

--topics /infra_left /infra_right /color \

--bag-from-to 10 80 \

--show-extraction \

--approx-sync 0.04

The value 0.04 can be increased, e.g. to 0.1, as needed. If it still fails, the recorded data may be the problem; re-record, making sure the calibration target stays in the images and the motion is not too fast. If it still does not work, try skipping the topic-frequency throttling.

Reference: https://blog.csdn.net/HUST_lc/article/details/96144499 and its comments

3. The following error occurs when using the checkerboard calibration board

 

Did not converge in maxIterations... restarting...

Wait more than ten minutes......

The final results are three files: a camchain yaml, a results txt, and a report pdf (as in the monocular case).

Binocular + IMU calibration (similar to monocular)

Step 1: write camchain.yaml

For the format, refer to the camchain.yaml section of the official Kalibr tutorial https://github.com/ethz-asl/kalibr/wiki/yaml-formats; take the specific parameter values from the yaml file obtained above, and do not delete any parameter. T_cn_cnm1 is the transform from the previous camera (cam0) to this camera (cam1). A final example:

cam0:

camera_model: pinhole

intrinsics: [425.9022457154398, 425.51203910141476, 320.7722078245714, 234.52032563515024]

distortion_model: equidistant

distortion_coeffs: [0.28012961504219924, 0.776522889083524, -3.555039585283302, 7.122425751506347]

rostopic: /infra_left

resolution: [640, 480]

cam1:

camera_model: pinhole

intrinsics: [430.5354779160687, 429.81266841318336, 318.5487581769465, 232.5397023354142]

distortion_model: equidistant

distortion_coeffs: [0.2913961623966211, 0.7172454259365787, -3.7209659434658295, 7.615803448415106]

T_cn_cnm1:
- [0.9999886252230528, 7.228145915638569e-05, -0.004769087951971224, 0.054624373296219914]
- [-6.78405772070143e-05, 0.9999995640055901, 0.0009313357504042449, 0.002552511727883378]
- [0.004769153190982554, -0.0009310016189884111, 0.9999881941372211, 0.004273862007202206]
- [0.0, 0.0, 0.0, 1.0]

rostopic: /infra_right

resolution: [640, 480]

Step 2: write imu.yaml

For the format, refer to the imu.yaml section of https://github.com/ethz-asl/kalibr/wiki/yaml-formats; the specific values are the IMU parameters calibrated earlier, for example:

#Accelerometers

accelerometer_noise_density: 2.8250053766610776e-02 #Noise density (continuous-time)

accelerometer_random_walk: 7.8925155899657628e-04 #Bias random walk

#Gyroscopes

gyroscope_noise_density: 2.3539521240749008e-03 #Noise density (continuous-time)

gyroscope_random_walk: 2.2003805724014335e-05 #Bias random walk

rostopic: /imu #the IMU ROS topic

update_rate: 200.0 #Hz (for discretization of the values above)

Step 3: prepare the aprilgrid.yaml from before, for example:

target_type: 'aprilgrid' #gridtype

tagCols: 6 #number of apriltags

tagRows: 6 #number of apriltags

tagSize: 0.0244 #size of apriltag, edge to edge [m]

tagSpacing: 0.00299 #ratio of space between tags to tagSize

#example: tagSize=2m, spacing=0.5m --> tagSpacing=0.25[-]

Step 4: in the realsense-ros package, copy rs_camera.launch, rename it rs_imu_stereo.launch, and make the following changes.

Find the line

<arg name="enable_sync" default="false"/>

Replace with:

<arg name="enable_sync" default="true"/>

This aligns the imu and binocular data in time

Also find the line

<arg name="unite_imu_method" default=""/>

and replace it with:

<arg name="unite_imu_method" default="linear_interpolation"/>

This ensures that a single merged IMU topic (/camera/imu) is published.

Step 5: start realsense

roslaunch realsense2_camera rs_imu_stereo.launch
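Once the camera is running with this launch file, you can optionally confirm that the merged IMU topic is actually being published:

rostopic list | grep imu   # /camera/imu should appear
rostopic hz /camera/imu    # should report roughly the IMU rate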

Step 6: turn off IR structured light, refer to the above

rosrun rqt_reconfigure rqt_reconfigure 

In rqt_reconfigure open camera -> stereo_module and set emitter_enabled to off (0).

Step 7: open rviz, add imu topic, infra1 topic and infra2 topic, and adjust the realsense position at the same time,

Ensure that the binocular image data always contains all the contents of the calibration board

Step 8: adjust the publishing frequency of imu and binocular topic and publish them with a new topic name

The stereo image publishing rate is changed to 20 Hz and the IMU rate to 200 Hz:

rosrun topic_tools throttle messages /camera/infra1/image_rect_raw 20.0 /infra_left

rosrun topic_tools throttle messages /camera/infra2/image_rect_raw 20.0 /infra_right

rosrun topic_tools throttle messages /camera/imu 200.0 /imu

Note: the throttled frequency is only nominal. You can check the actual frequency with rostopic hz <topic name>; it will not necessarily match the set value exactly, but this is good enough to start with. If you know how to adjust it more precisely, please let me know.

Step 9: record the data bag

For the recording motion, see https://www.youtube.com/watch?v=puNXsnrYWTY&app=desktop. As before, keep the whole calibration target inside both stereo images throughout, and do not move too fast or the images will blur. Move back and forth along the forward/backward, left/right, and up/down directions. Recording for about 90 seconds is enough; the first 15 seconds or so can be discarded later. The recording command is:

rosbag record -O imu_stereo.bag /infra_left /infra_right /imu

Step 10: start calibration

The calibration command is

kalibr_calibrate_imu_camera --bag [filename.bag] --cam [camchain.yaml] --imu [imu.yaml] --target [target.yaml] --bag-from-to 15 75 --show-extraction

kalibr_calibrate_imu_camera --bag imu_stereo.bag --cam camchain.yaml --imu imu.yaml --target aprilgrid.yaml --bag-from-to 15 75 --show-extraction

Change the corresponding parameters to match your own files; target.yaml here corresponds to the april_6x6_A4.yaml file.

The final output is again yaml, txt, and pdf files.

Flight control IMU+D455 calibration

D455 camera IMU data:

T_ic: (cam0 to imu0):

[[-0.07683649 -0.05913403  0.99528856 -0.00011369]
 [-0.98555081  0.15563515 -0.06683784 -0.00001289]
 [-0.1509495  -0.98604303 -0.07023805  0.00004632]
 [ 0.          0.          0.          1.        ]]

Record and calibrate rgb camera with realsense viewer

Step 1: open realsense-viewer and set the bag saving path in the settings. After setting the resolution and frame rate of the rgb camera, press Record and capture data for about 90 seconds.
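Bags recorded by realsense-viewer use device/sensor-indexed topic names, and the indices can differ between devices, so before calibrating it is worth listing the topics inside the bag (the file name below is just the example used in the next step):

rosbag info 20200526_131810.bag   # look for a topic like /device_0/sensor_1/Color_0/image/data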

Step 2: calibrate with Kalibr

First source the workspace environment

source devel/setup.bash

then

kalibr_calibrate_cameras --target ~/project/calibration/rgb/april_6x6_A4.yaml --bag ~/disk/datasets/realsense-dataset/calibration/rgb/20200526_131810.bag --models pinhole-equi --topics /device_0/sensor_1/Color_0/image/data --bag-from-to 20 80 --show-extraction

Where:

--target ~/project/calibration/rgb/april_6x6_A4.yaml is the calibration target configuration file (again, for a checkerboard, targetCols and targetRows are the numbers of inner corners, not the numbers of squares).

--bag ~/disk/datasets/realsense-dataset/calibration/rgb/20200526_131810.bag is the bag recorded with realsense-viewer.

--models pinhole-equi is the camera model and distortion model of the rgb camera (see https://github.com/ethz-asl/kalibr/wiki/supported-models and choose as needed).

--topics /device_0/sensor_1/Color_0/image/data is the rgb image topic inside the bag.

--bag-from-to 20 80 means only the data between 20 s and 80 s of the bag is processed.

--show-extraction displays the feature-extraction process.

These parameters can be adjusted as needed.

Note:

pinhole-radtan: the most common model, a pinhole camera plus the Brown (radtan) distortion model, suitable for most cameras with a field of view under about 120 degrees. The distortion parameters are the radial terms k1, k2 and the tangential terms p1, p2. If the camera distortion is not severe, this model is generally sufficient; for example, my DOV 150 camera can also use it, and the undistortion result is very good.

pinhole-equi: a pinhole camera plus the equidistant distortion model, i.e. the type to choose for the Kannala-Brandt (KB) model. It has a wide range of application, and most fisheye lenses can use it as well. Note that the distortion parameters of the 8-parameter KB model are k1, k2, k3, k4; although these are also four numbers, unlike the previous model they are all radial distortion parameters, with no tangential distortion, and the corresponding formulas are different. This is also the model used by cv::fisheye in OpenCV.
