Mobile Robot Localization using Extended Kalman Filter with Kinematic Model
1. Dynamics and Control Laboratory, Department of Industrial and Mechanical Engineering, Institute of Technology of Cambodia, Russian Federation Blvd., P.O. Box 86, Phnom Penh, Cambodia.
Received: July 19, 2021 / Revised: / Accepted: November 19, 2021 / Published: December 30, 2021
In the field of mobile robotics, one of the most important tasks in navigation is robot localization. Localization is the task of determining the location of the robot within its environment at a specific time step. To obtain the position and orientation of the robot, various sensors have been utilized together with numerous algorithms. In this paper, a sensor fusion approach based on the Extended Kalman Filter (EKF) algorithm is proposed. The kinematic model of a differential-drive wheeled mobile robot is derived. The robot is equipped with an inertial measurement unit (IMU), wheel encoders, and a light detection and ranging (Lidar) sensor. The sensor noise is assumed to be Gaussian white noise. In the prediction step, the robot's linear and angular velocities are determined from the wheel encoder data and subsequently used to compute the robot pose. In the correction step, the robot pose is updated using information from the IMU and Lidar scan matching. To control the robot's movement, a trajectory tracking algorithm based on a backstepping controller is used. The robot is controlled to follow two trajectories: a circle and a figure-eight. The robot position and orientation are represented in a two-dimensional Cartesian coordinate system. The numerical experiment is conducted in the Gazebo simulation software within the Robot Operating System (ROS) framework. Finally, the numerical results show that the sensor fusion algorithm is effective in estimating the robot pose along the desired trajectory.
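The prediction/correction cycle described above can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the pose state [x, y, θ], the identity measurement model, and the covariance matrices Q and R are assumptions chosen for clarity.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Prediction step: propagate the pose [x, y, theta] with the
    differential-drive kinematic model, using encoder-derived
    linear velocity v and angular velocity w."""
    theta = x[2]
    # Kinematic motion model
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_correct(x, P, z, R):
    """Correction step: update the pose with a direct pose
    measurement z (e.g., from IMU heading and Lidar scan
    matching), assuming a linear measurement model H = I."""
    H = np.eye(3)
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

In practice the correction shrinks the pose covariance whenever a measurement arrives, while the prediction step grows it according to the motion noise Q.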