0. Preface
I have had some free time recently and was thinking of doing some work. I happened to come across the series of articles by the author yuanguobin01 on ideas for improving LeGO-LOAM. After reading them, it is a pity that the author only gave some preliminary ideas without working through the code systematically. This article therefore presents my own implementation strategy for the several improvement points the author proposed.
1. 2D wheel odometry + IMU = 3D odometry, replacing the original 3D laser front-end odometry
In this part, the author's idea is that the displacement provided by the 2D wheel odometer, combined with the heading and attitude from the IMU module, can be projected into a 3D wheel-IMU odometry. The computational cost is very low and the implementation is very convenient, so it is well suited as a 3D wheel odometry. This article therefore directly gives the geometry_msgs::TwistStamped handler; personally I feel that combining it with an IMU median filter makes the whole structure more compact.
```cpp
void wheelHandler(const geometry_msgs::TwistStampedConstPtr &wheel_msg)
{
    using Eigen::Vector3d;
    // reject out-of-order messages
    if (wheel_msg->header.stamp.toSec() <= last_wheel_t)
    {
        ROS_WARN("wheel message in disorder!");
        return;
    }
    double t = wheel_msg->header.stamp.toSec();
    last_wheel_t = t;

    // linear velocity and angular rate reported by the wheel odometer
    double vx = wheel_msg->twist.linear.x;
    double vy = wheel_msg->twist.linear.y;
    double vz = wheel_msg->twist.linear.z;
    double rx = wheel_msg->twist.angular.x;
    double ry = wheel_msg->twist.angular.y;
    double rz = wheel_msg->twist.angular.z;
    Vector3d vel(vx, vy, vz);
    Vector3d gyr(rx, ry, rz);
    inputWheel(t, vel, gyr);

    if (init_wheel)
    {
        latest_time = t;
        init_wheel = 0;
        return;
    }

    // dead-reckoning propagation: integrate the angular rate into the attitude,
    // then integrate the velocity (rotated into the odometry frame) into the position
    double dt = t - latest_time;
    latest_time = t;
    tmp_Q = tmp_Q * Utility::deltaQ(gyr * dt);
    tmp_P = tmp_P + tmp_Q.toRotationMatrix() * vel * dt;
    tmp_V = vel;

    // publish the wheel odometry
    nav_msgs::Odometry wheelOdometry;
    wheelOdometry.header.frame_id = "/camera_init";
    wheelOdometry.child_frame_id = "/laser_odom";
    wheelOdometry.header.stamp = ros::Time().fromSec(t);
    wheelOdometry.pose.pose.orientation.x = tmp_Q.x();
    wheelOdometry.pose.pose.orientation.y = tmp_Q.y();
    wheelOdometry.pose.pose.orientation.z = tmp_Q.z();
    wheelOdometry.pose.pose.orientation.w = tmp_Q.w();
    wheelOdometry.pose.pose.position.x = tmp_P.x();
    wheelOdometry.pose.pose.position.y = tmp_P.y();
    wheelOdometry.pose.pose.position.z = tmp_P.z();
    pubWheelOdometry.publish(wheelOdometry);

    // publish the wheel odometry path for visualization
    geometry_msgs::PoseStamped wheelPose;
    wheelPose.header = wheelOdometry.header;
    wheelPose.pose = wheelOdometry.pose.pose;
    wheelPath.header.stamp = wheelOdometry.header.stamp;
    wheelPath.poses.push_back(wheelPose);
    wheelPath.header.frame_id = "/camera_init";
    pubWheelPath.publish(wheelPath);

    // optionally dump the trajectory to a text file for evaluation
    if (saveWheelOdo)
    {
        std::ofstream founW("xxx/results/wheel_odo.txt", std::ios::app);
        founW.setf(std::ios::fixed, std::ios::floatfield);
        founW.precision(5);
        founW << wheelOdometry.header.stamp.toSec() << " ";
        founW.precision(5);
        founW << wheelOdometry.pose.pose.position.x << " "
              << wheelOdometry.pose.pose.position.y << " "
              << wheelOdometry.pose.pose.position.z << " "
              << wheelOdometry.pose.pose.orientation.x << " "
              << wheelOdometry.pose.pose.orientation.y << " "
              << wheelOdometry.pose.pose.orientation.z << " "
              << wheelOdometry.pose.pose.orientation.w << std::endl;
        founW.close();
    }
    return;
}
```
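If you want to follow the "IMU median filter" route mentioned above instead of integrating the wheel message's angular rate, a minimal sketch could look like the following. It assumes the same globals as the handler above (tmp_P, tmp_Q, latest_time); imu_buf, medianFilterQuat and wheelImuPropagate are hypothetical names introduced here for illustration, not part of the original code.

```cpp
#include <deque>
#include <Eigen/Geometry>
#include <sensor_msgs/Imu.h>

// assumed to exist as in the handler above
extern Eigen::Quaterniond tmp_Q;
extern Eigen::Vector3d tmp_P;
extern double latest_time;

std::deque<sensor_msgs::ImuConstPtr> imu_buf;  // IMU messages near the current wheel timestamp

// hypothetical helper: take the middle element of the time-sorted window as a crude median filter
Eigen::Quaterniond medianFilterQuat(const std::deque<sensor_msgs::ImuConstPtr>& buf)
{
    const sensor_msgs::ImuConstPtr& m = buf[buf.size() / 2];
    return Eigen::Quaterniond(m->orientation.w, m->orientation.x,
                              m->orientation.y, m->orientation.z);
}

// propagate the 3D wheel odometry: attitude from the IMU, translation from the wheel velocity
void wheelImuPropagate(double t, const Eigen::Vector3d& vel)
{
    if (imu_buf.empty())
        return;
    double dt = t - latest_time;
    latest_time = t;
    tmp_Q = medianFilterQuat(imu_buf);                    // attitude taken from the filtered IMU orientation
    tmp_P = tmp_P + tmp_Q.toRotationMatrix() * vel * dt;  // integrate only the linear velocity
}
```

Taking the attitude directly from the IMU avoids accumulating the wheel gyro's integration drift, which is what makes the structure feel more compact.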
2. Add GPS module
In this part, the author points out that adding a GPS factor mainly improves mapping accuracy and long-term robustness, and it can also serve as a form of loop detection. This can indeed bring real benefits to the LeGO-LOAM system. Let us first look at the coordinate-system conversion part.
```cpp
void laserOdometryHandler(const nav_msgs::Odometry::ConstPtr& laserOdometry)
{
    // ************************ Laser odometry --> global pose ************************
    // 0. The Beidou Xingtong INS reports in the NED frame, while the geodetic conversion
    //    works in ENU, so the INS output is first converted: the xyz of the geodetic
    //    coordinates and the rpy of the INS output are both expressed in the ENU frame.
    // 1. Laser frame --> body frame: a pure translation (the INS mounting is ignored),
    //    because the transform from the first body frame to ENU is known directly from
    //    the first RTK message. Following the ROS convention, x points forward, z up.
    // 2. Body frame --> ENU: a rigid transform obtained from the xyz/rpy of the first frame.
    // *********************************************************************************
    Eigen::Isometry3d mLidar = Eigen::Isometry3d::Identity();
    // assign the current laser pose to mLidar (note: Eigen::Quaterniond takes w first)
    mLidar.rotate(Eigen::Quaterniond(
        laserOdometry->pose.pose.orientation.w,
        laserOdometry->pose.pose.orientation.x,
        laserOdometry->pose.pose.orientation.y,
        laserOdometry->pose.pose.orientation.z));
    // pretranslate so the position is applied in the world frame, not the rotated frame
    mLidar.pretranslate(Eigen::Vector3d(
        laserOdometry->pose.pose.position.x,
        laserOdometry->pose.pose.position.y,
        laserOdometry->pose.pose.position.z));

    // 1. mLidar --> mBody: a pure translation (lever arm between lidar and body)
    Eigen::Isometry3d tLidar2Body = Eigen::Isometry3d::Identity();
    tLidar2Body.translate(Eigen::Vector3d(1.83, 0, 0));
    Eigen::Isometry3d mBody = tLidar2Body * mLidar;

    // 2. mBody --> ENU
    Eigen::Isometry3d mENU = tBody2ENU * mBody;

    // ************************ Global pose --> laser odometry ************************
    // The inverse direction: build the body pose in ENU from the most recent RTK message
    // (cached elsewhere as RTK), then go back through the body frame to the laser frame.
    Eigen::Isometry3d mENUFromRTK = Eigen::Isometry3d::Identity();
    // position of the body relative to the ENU origin
    mENUFromRTK.translate(Eigen::Vector3d(RTK->x, RTK->y, RTK->Ati));
    // RPY of the RTK measures the rotation of the vehicle body relative to the ENU frame
    mENUFromRTK.rotate(Eigen::Quaterniond(
        Eigen::AngleAxisd(RTK->yaw   * PI / 180.0, Eigen::Vector3d::UnitZ()) *
        Eigen::AngleAxisd(RTK->pitch * PI / 180.0, Eigen::Vector3d::UnitY()) *
        Eigen::AngleAxisd(RTK->roll  * PI / 180.0, Eigen::Vector3d::UnitX())));  // Euler angles -> quaternion

    // 1. ENU --> mBody
    Eigen::Isometry3d mBodyFromRTK = tBody2ENU.inverse() * mENUFromRTK;
    // 2. mBody --> mLidar
    Eigen::Isometry3d mLidarFromRTK = tLidar2Body.inverse() * mBodyFromRTK;
}

// Three coordinate frames are involved: GPS (ENU), body and lidar.
// Body and lidar differ only by a translation.
void rtkHandler(const bdxt::rtk::ConstPtr& RTK)
{
    if (!is_gps_init)  // record the initial GPS value
    {
        if (RTK->navStatus == 4 && RTK->rtkStatus == 5)
        {
            // qBody2ENU is the rotation from the vehicle body frame to the GPS (ENU) frame.
            // The available IMU reading is expressed in the INS frame (NED), so it has to be
            // transformed body --> NED --> ENU, which involves two quaternions:
            // qBody2NED and qNED2ENU.
            Eigen::Vector3d vBody2ENU;     // initial GPS pose: x, y, z, pitch, yaw, roll.
                                           // This is also the static transform between the laser
                                           // origin frame and the geodetic frame.
            Eigen::Quaterniond qBody2ENU;  // rotation from body to ENU
            // RPY of the RTK measures the rotation of the vehicle body relative to the ENU frame
            Eigen::Quaterniond qBodyNED_;  // rotation of the body relative to NED, i.e. NED2BODY
            qBodyNED_ = Eigen::AngleAxisd(RTK->yaw   * PI / 180.0, Eigen::Vector3d::UnitZ()) *
                        Eigen::AngleAxisd(RTK->pitch * PI / 180.0, Eigen::Vector3d::UnitY()) *
                        Eigen::AngleAxisd(RTK->roll  * PI / 180.0, Eigen::Vector3d::UnitX());  // Euler angles -> quaternion
            Eigen::Quaterniond qBody2NED = qBodyNED_.conjugate();  // inverse quaternion

            // ENU <-> NED swap: rotate PI/2 about z first, then PI about x; mind the order
            Eigen::Quaterniond qNED2ENU;
            qNED2ENU = Eigen::AngleAxisd(PI / 2, Eigen::Vector3d::UnitZ()) *
                       Eigen::AngleAxisd(PI,     Eigen::Vector3d::UnitX());
            // mind the order: transforms compose from right to left
            qBody2ENU = qNED2ENU * qBody2NED;

            // translation from the body to the ENU origin, hence the negative sign
            vBody2ENU[0] = -RTK->x;    // set to 0 if the first frame is used as the coordinate origin
            vBody2ENU[1] = -RTK->y;    // set to 0 if the first frame is used as the coordinate origin
            vBody2ENU[2] = -RTK->Ati;  // set to 0 if the first frame is used as the coordinate origin

            // assemble the transform
            tBody2ENU.rotate(qBody2ENU);
            tBody2ENU.translate(vBody2ENU);
            is_gps_init = true;
        }
        else
        {
            ROS_INFO("bad RTK status\n");
        }
    }
}
```
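Note that RTK->x / RTK->y / RTK->Ati above are assumed to already be local Cartesian coordinates. If your receiver only outputs latitude/longitude/altitude, you need the geodetic conversion the comments refer to. A minimal sketch of that conversion (WGS-84 LLA --> ECEF --> local ENU anchored at the first fix) is given below; the function names are mine, not part of the original code.

```cpp
#include <cmath>
#include <Eigen/Dense>

// WGS-84 latitude/longitude/altitude --> Earth-centered Earth-fixed (ECEF) coordinates
static Eigen::Vector3d llaToEcef(double lat_deg, double lon_deg, double alt)
{
    const double a = 6378137.0, e2 = 6.69437999014e-3;  // WGS-84 semi-major axis and e^2
    double lat = lat_deg * M_PI / 180.0, lon = lon_deg * M_PI / 180.0;
    double N = a / std::sqrt(1.0 - e2 * std::sin(lat) * std::sin(lat));
    return Eigen::Vector3d((N + alt) * std::cos(lat) * std::cos(lon),
                           (N + alt) * std::cos(lat) * std::sin(lon),
                           (N * (1.0 - e2) + alt) * std::sin(lat));
}

// ENU coordinates of (lat, lon, alt) relative to the anchor (lat0, lon0, alt0), e.g. the first RTK fix
static Eigen::Vector3d llaToEnu(double lat, double lon, double alt,
                                double lat0, double lon0, double alt0)
{
    Eigen::Vector3d d = llaToEcef(lat, lon, alt) - llaToEcef(lat0, lon0, alt0);
    double latr = lat0 * M_PI / 180.0, lonr = lon0 * M_PI / 180.0;
    Eigen::Matrix3d R;  // ECEF --> ENU rotation at the anchor point (rows: east, north, up)
    R << -std::sin(lonr),                   std::cos(lonr),                  0,
         -std::sin(latr) * std::cos(lonr), -std::sin(latr) * std::sin(lonr), std::cos(latr),
          std::cos(latr) * std::cos(lonr),  std::cos(latr) * std::sin(lonr), std::sin(latr);
    return R * d;
}
```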
After the conversion, the result can be brought into the front end and fused there through an EKF or Ceres (a Ceres sketch of such a front-end GPS residual is given after the GTSAM snippet below). For the back-end optimization mentioned by the author, refer to the following code, which essentially adds a GPS factor to GTSAM.
```cpp
isamCurrentEstimate = isam->calculateEstimate();  // this part can be replaced by the RTK pose directly
                                                  // (without real-time fusion) if the RTK accuracy is sufficient
double recentOptimizedX = lastOptimizedPose.translation().x();
double recentOptimizedY = lastOptimizedPose.translation().y();

double bigNoiseTolerentToXY = 1000000000.0;  // 1e9
double gpsAltitudeNoiseScore = 250.0;        // if the height is misaligned after loop closing, increase this value
gtsam::Vector robustNoiseVector3(3);         // the GPS factor has 3 elements (xyz)
robustNoiseVector3 << bigNoiseTolerentToXY, bigNoiseTolerentToXY, gpsAltitudeNoiseScore;
// i.e. only the altitude is really constrained here (because LOAM-like methods tend to drift upwards over time)
robustGPSNoise = gtsam::noiseModel::Robust::Create(
    gtsam::noiseModel::mEstimator::Cauchy::Create(1),  // optional: DCS or GemanMcClure also work, but Cauchy is empirically good
    gtsam::noiseModel::Diagonal::Variances(robustNoiseVector3));

// Find the most appropriate GPS message; anything farther than eps in time is treated as misaligned
double eps = 0.1;  // look for a GPS message within eps seconds
while (!gpsBuf.empty())
{
    auto thisGPS = gpsBuf.front();
    auto thisGPSTime = thisGPS->header.stamp.toSec();
    if (abs(thisGPSTime - timeLaserOdometry) < eps)
    {
        currGPS = thisGPS;
        hasGPSforThisKF = true;
        break;
    }
    else
    {
        hasGPSforThisKF = false;
    }
    gpsBuf.pop();
}
mBuf.unlock();

// gps factor
const int curr_node_idx = keyframePoses.size() - 1;  // because the index starts from 0
// (in fact this index can be any number, but for a simple implementation we follow the sequential index)
if (hasGPSforThisKF)
{
    double curr_altitude_offseted = currGPS->altitude - gpsAltitudeInitOffset;
    mtxRecentPose.lock();
    // in this example only the height is adjusted (x and y are given a huge noise)
    gtsam::Point3 gpsConstraint(recentOptimizedX, recentOptimizedY, curr_altitude_offseted);
    mtxRecentPose.unlock();
    gtSAMgraph.add(gtsam::GPSFactor(curr_node_idx, gpsConstraint, robustGPSNoise));
    cout << "GPS factor added at node " << curr_node_idx << endl;
}
initialEstimate.insert(curr_node_idx, poseTo);
```
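For the front-end fusion via Ceres mentioned above, a minimal sketch of a GPS position residual could look like this. GpsPositionFactor and the parameter layout (a 3-element translation block) are illustrative assumptions, not the original code; the factor simply pulls the optimized translation toward the RTK measurement in ENU, weighted by an assumed per-axis sigma.

```cpp
#include <ceres/ceres.h>
#include <Eigen/Dense>

// Hypothetical residual: pulls the optimized translation toward the GPS/RTK measurement (in ENU)
struct GpsPositionFactor
{
    GpsPositionFactor(const Eigen::Vector3d& gps_enu, const Eigen::Vector3d& sigma)
        : gps_enu_(gps_enu), sigma_(sigma) {}

    template <typename T>
    bool operator()(const T* const t_w_body, T* residuals) const
    {
        // whitened position error in ENU
        residuals[0] = (t_w_body[0] - T(gps_enu_.x())) / T(sigma_.x());
        residuals[1] = (t_w_body[1] - T(gps_enu_.y())) / T(sigma_.y());
        residuals[2] = (t_w_body[2] - T(gps_enu_.z())) / T(sigma_.z());
        return true;
    }

    static ceres::CostFunction* Create(const Eigen::Vector3d& gps_enu,
                                       const Eigen::Vector3d& sigma)
    {
        // 3 residuals, one 3-dimensional parameter block (the translation)
        return new ceres::AutoDiffCostFunction<GpsPositionFactor, 3, 3>(
            new GpsPositionFactor(gps_enu, sigma));
    }

    Eigen::Vector3d gps_enu_, sigma_;
};

// Usage sketch inside the front-end optimization, alongside the lidar residuals:
//   problem.AddResidualBlock(GpsPositionFactor::Create(gps_enu, sigma),
//                            new ceres::HuberLoss(0.1),  // robust loss against GPS outliers
//                            t_w_curr);                  // the 3-DoF translation block being optimized
```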
3. Add scan context
For this part, you can refer to the series of articles I wrote earlier.
This article also lists the four main loop-detection approaches used in the LOAM family (a sketch of the first one is given after the list):
- Traditional neighborhood (radius) distance search + ICP matching
- Scan Context-based coarse matching + ICP fine matching
- Loop detection based on Scan Context alone
- Loop detection based on Intensity Scan Context + ICP
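As a reference for the first approach in the list, here is a minimal sketch of radius-search loop detection with ICP verification, roughly in the spirit of LeGO-LOAM's loop-closure thread. The function and variable names (detectLoopByRadius, cloudKeyPoses3D, latestKeyFrameCloud, nearHistoryKeyFrameCloud) are assumptions for illustration, not the actual LeGO-LOAM code.

```cpp
#include <vector>
#include <Eigen/Dense>
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>

bool detectLoopByRadius(const pcl::PointCloud<pcl::PointXYZI>::Ptr& cloudKeyPoses3D,
                        const pcl::PointCloud<pcl::PointXYZI>::Ptr& latestKeyFrameCloud,
                        const pcl::PointCloud<pcl::PointXYZI>::Ptr& nearHistoryKeyFrameCloud,
                        Eigen::Matrix4f& correctionTransform)
{
    // 1. Radius search around the current key pose for historical key poses within 7 m.
    //    A real implementation would also require the candidate to be sufficiently old in time.
    pcl::KdTreeFLANN<pcl::PointXYZI> kdtree;
    kdtree.setInputCloud(cloudKeyPoses3D);
    std::vector<int> indices;
    std::vector<float> sqDists;
    const pcl::PointXYZI& currPose = cloudKeyPoses3D->points.back();
    if (kdtree.radiusSearch(currPose, 7.0, indices, sqDists) == 0)
        return false;

    // 2. Verify the candidate with ICP between the current scan and the historical submap
    pcl::IterativeClosestPoint<pcl::PointXYZI, pcl::PointXYZI> icp;
    icp.setMaxCorrespondenceDistance(100);
    icp.setMaximumIterations(100);
    icp.setInputSource(latestKeyFrameCloud);
    icp.setInputTarget(nearHistoryKeyFrameCloud);
    pcl::PointCloud<pcl::PointXYZI> aligned;
    icp.align(aligned);

    if (!icp.hasConverged() || icp.getFitnessScore() > 0.3)
        return false;                                // reject weak matches
    correctionTransform = icp.getFinalTransformation();
    return true;                                     // feed this as a loop factor to the back end
}
```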