Mobile Robot Localization using Extended Kalman Filter with Kinematic Model
1. Dynamics and Control Laboratory, Department of Industrial and Mechanical Engineering, Institute of Technology of Cambodia, Russian Federation Blvd., P.O. Box 86, Phnom Penh, Cambodia.
Received: July 19, 2021 / Accepted: November 19, 2021 / Published: December 30, 2021
In the field of mobile robotics, one of the most important tasks in navigation is robot localization: determining the position and orientation of the robot within its environment at a given time step. To estimate the robot pose, various sensors have been utilized together with numerous kinds of algorithms. In this paper, a sensor fusion scheme based on the Extended Kalman Filter (EKF) is proposed. The kinematic model of a differential-drive wheeled mobile robot is derived. The robot is equipped with an inertial measurement unit (IMU), wheel encoders, and a light detection and ranging (LiDAR) sensor. The sensor noise is assumed to be Gaussian white noise. In the prediction step, the robot's linear and angular velocities are determined from the wheel-encoder readings and are subsequently used to compute the robot pose. In the correction step, the pose is updated using measurements from the IMU and from LiDAR scan matching. To control the robot's movement, a trajectory-tracking algorithm based on a backstepping controller is used. The robot is commanded to follow two trajectories: a circle and a figure-eight ("8") shape. The robot's position and orientation are expressed in a two-dimensional Cartesian coordinate system. Numerical experiments are conducted in the Gazebo simulation software within the Robot Operating System (ROS) framework. Finally, the numerical results show that the sensor fusion algorithm is effective in estimating the robot pose relative to the desired trajectory.
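The EKF prediction/correction cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: it assumes a state vector x = [px, py, θ], the standard differential-drive kinematic model for the motion update, and a full-pose measurement (as LiDAR scan matching would provide) so that the measurement matrix H is the identity. The function names and the noise covariances Q and R are placeholders chosen for the example.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Prediction step: propagate the pose with the differential-drive
    kinematic model using encoder-derived velocities v (linear) and w (angular)."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_correct(x, P, z, R):
    """Correction step: update the pose with a full-pose measurement z
    (e.g. from LiDAR scan matching), so H is the identity matrix."""
    H = np.eye(3)
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_upd = x + K @ y
    P_upd = (np.eye(3) - K @ H) @ P
    return x_upd, P_upd
```

In a full localization loop these two functions would be called once per control cycle, with the correction step applied whenever an IMU or scan-matching measurement arrives; the covariance P quantifies the remaining pose uncertainty after fusion.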