Pose estimation for feedback control in a snake robot.
Thesis Discipline: Mechanical Engineering
Degree Grantor: University of Canterbury
Degree Name: Master of Engineering
This thesis presents a pose estimation algorithm for a snake robot as a step towards a closed-loop controlled snake robot. The University of Canterbury snake robot is being developed for urban search and rescue, where snake robots have an advantage over wheeled robots in that they can navigate through complex environments. For the snake robot to make intelligent control decisions, sensor feedback from the robot is required. Sensor data from an IMU and a motor encoder in each module is collected over a data bus by a base station connected to a PC. The PC uses the Robot Operating System framework to run the pose estimation software: the sensor data passes through a serial node running on the base station and is then sent to the pose estimation node, which runs the pose estimation algorithm.
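The per-module packet layout on the data bus is not specified in this summary. Purely as an illustration of the kind of parsing the serial node would perform, a module's encoder and IMU readings might be unpacked like this (the field order, sizes, and units here are assumptions, not the actual bus protocol):

```python
import struct

# Hypothetical packet layout for one module's sensor report:
#   <B  : module id
#   <f  : motor encoder angle (degrees)
#   <6f : IMU accelerometer x,y,z and gyroscope x,y,z
PACKET_FMT = "<Bf6f"
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def parse_module_packet(raw: bytes) -> dict:
    """Unpack one module's sensor packet into a dictionary."""
    assert len(raw) == PACKET_SIZE, "truncated packet"
    fields = struct.unpack(PACKET_FMT, raw)
    return {
        "module_id": fields[0],
        "encoder_deg": fields[1],
        "accel": fields[2:5],
        "gyro": fields[5:8],
    }

# Round-trip example: pack a reading for module 4, then parse it back.
raw = struct.pack(PACKET_FMT, 4, 30.0, 0.0, 0.0, 9.81, 1.0, 2.0, 3.0)
reading = parse_module_packet(raw)
```

On the ROS side, a parsed reading like this would typically be published as a message for the pose estimation node to subscribe to.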
The pose estimation algorithm predicts the pose of each module of the snake robot and the orientation of the snake robot as a whole. A square-root spherical simplex unscented Kalman filter was used because of the highly non-linear measurement model. A virtual chassis was used to abstract away the robot's internal shape, allowing it to be treated as though it were a wheeled robot. The linear progression, turning, rolling and rotating gaits were all used to test the pose estimation algorithm, with a gait-based process model for each. An additional joint angles model, which does not require prior knowledge of the gait being performed, was created to compare the two approaches. The motion gaits were tested using the Vicon motion tracking system, comparing the predicted angle of each module against the ground truth from the Vicon system.
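The spherical simplex construction is what keeps the sigma-point count low: n + 2 points for an n-dimensional state, versus 2n + 1 for the standard unscented transform. A minimal sketch of the point generation for a zero-mean, unit-covariance state, following Julier's construction (the choice w0 = 0.5 is an arbitrary illustration, not a value from the thesis):

```python
import numpy as np

def spherical_simplex_points(n: int, w0: float = 0.5):
    """Generate n + 2 spherical simplex sigma points for an n-dimensional
    unit-Gaussian state. Returns (points, weights), points shaped (n+2, n).
    The weighted mean is 0 and the weighted covariance is the identity."""
    w = np.full(n + 2, (1.0 - w0) / (n + 1))
    w[0] = w0
    pts = np.zeros((n + 2, n))
    # Dimension 1: two symmetric points about the centre point.
    pts[1, 0] = -1.0 / np.sqrt(2.0 * w[1])
    pts[2, 0] = 1.0 / np.sqrt(2.0 * w[1])
    # Extend the point set one dimension at a time.
    for j in range(2, n + 1):
        c = 1.0 / np.sqrt(j * (j + 1) * w[1])
        pts[1:j + 1, j - 1] = -c      # existing points shift down
        pts[j + 1, j - 1] = j * c     # one new point balances them
    return pts, w
```

In the filter itself these unit points would be shifted by the state mean and scaled by the square-root covariance factor that the square-root formulation propagates directly, which is what avoids re-factorising the covariance each step.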
The inchworm gait using the gait parameters model had a mean roll error of 1.65 degrees and a mean pitch error of 1.82 degrees, compared with the joint angles model's mean roll error of 1.84 degrees and mean pitch error of 2.12 degrees. The rotating gait using the gait parameters model had a mean roll error of 2.34 degrees and a mean pitch error of 2.59 degrees, compared with the joint angles model's mean roll error of 2.58 degrees and mean pitch error of 2.51 degrees. The yaw prediction was inaccurate because it relied on a magnetometer in the head module, which was adversely affected by magnetic interference from the building.
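Mean errors of this kind are per-sample absolute differences between the predicted and Vicon angles; one subtlety worth making explicit is the wraparound at ±180 degrees, so that a prediction of 359° against a ground truth of 1° counts as a 2° error. A sketch of such a metric (an assumption about the form of the metric, not necessarily the exact error definition used in the thesis):

```python
import numpy as np

def mean_angle_error_deg(predicted, truth):
    """Mean absolute angular error in degrees, wrapping each
    difference into [-180, 180) before taking the magnitude."""
    diff = (np.asarray(predicted) - np.asarray(truth) + 180.0) % 360.0 - 180.0
    return float(np.mean(np.abs(diff)))
```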
The pose estimation algorithm was designed to be redundant, so that the estimate remains accurate if a module fails. To test this, the sensors on module 4 were configured to return no sensor data. The mean roll and pitch errors for module 4 using the rotating gait with the gait parameters model were 3.02 degrees and 2.62 degrees respectively, only slightly worse than with no sensor failure. To simulate a hardware failure, the motor in module 4 was held at a constant 0 degrees while the rotating gait was performed. The mean prediction error for module 4 was slightly higher, with a mean roll error of 4.42 degrees and a mean pitch error of 4.33 degrees. The head and tail modules were similarly affected by the hardware failure, with slightly increased mean errors.
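One simple way to achieve this kind of graceful degradation is to skip the measurement update for any module whose sensors return nothing, so the filter coasts on its process model for that module. A scalar Kalman-filter sketch of the pattern (illustrative only; the actual filter is the multi-module square-root unscented filter described above, and the noise values here are made up):

```python
def kf_step(x, p, u, z, q=0.01, r=0.1):
    """One predict/update cycle of a scalar Kalman filter.
    If the sensor returned nothing (z is None), skip the update and
    carry the prediction forward with grown uncertainty, so a dead
    sensor degrades accuracy gracefully instead of breaking the
    estimate."""
    # Predict: the commanded joint motion u drives the state forward.
    x_pred = x + u
    p_pred = p + q
    if z is None:
        return x_pred, p_pred  # sensor failure: prediction only
    # Update: blend in the measurement via the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

With a measurement, the uncertainty p shrinks; without one, it grows by the process noise, mirroring the slightly increased errors observed for the failed module.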
It was found that the accuracy of the joint angles model is very similar to that of the gait parameters model. Because the joint angles model can perform all of the motion gaits tested with the same process model, without relying on prior knowledge of which gait pattern is being used, it is the more practical choice of model as future research is conducted.