Published: August 30,2025Simultaneous Localization and Mapping Using Intel RealSense Camera
1. ITC
Academic Editor:
Received: January 22, 2024 / Revised: / Accepted: January 22, 2024 / Available online: June 01, 2020
This paper presents Simultaneous Localization and Mapping (SLAM) for generating a 3D map of an environment. It takes a graph-based SLAM approach with loop closure detection, using an RGB-D (Red, Green, Blue and Depth) camera to map an unknown environment. Real-time localization and mapping require both accuracy and robustness; for this reason, the lightweight Intel RealSense D435i RGB-D camera (with built-in IMU) is chosen as the sensor. Data from the D435i are processed on a Jetson Nano, a single-board computer running the Robot Operating System (ROS), and passed to the RTAB-Map (Real-Time Appearance-Based Mapping) node in the ROS environment to perform SLAM. RTAB-Map uses the RGB data together with depth and IMU information to build the 3D map of the environment. The RGB images, depth data, and IMU measurements are processed to extract key features such as the dense point cloud and the depth of RGB pixels. These features are matched across scenes to compute the motion of the camera, from which the camera odometry is estimated and the 3D map of the surrounding environment is constructed. Experiments were conducted in both indoor and outdoor environments, and the results were compared to observe the difference in map quality between the two settings, using the ROS-based Rviz visualizer to display the 3D map in real time.
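The pipeline described above (RealSense D435i driver feeding RTAB-Map under ROS) is typically wired together with launch files. The commands below are a sketch based on the publicly documented `realsense2_camera` and `rtabmap_ros` packages, not the authors' exact configuration; topic names and arguments are assumptions that may differ by package version.

```shell
# Start the D435i driver, aligning depth to the color frame and
# publishing IMU data (gyro + accel merged into one topic).
roslaunch realsense2_camera rs_camera.launch \
    align_depth:=true \
    enable_gyro:=true enable_accel:=true \
    unite_imu_method:=linear_interpolation

# Start RTAB-Map, subscribing to the camera's RGB, depth, and
# camera-info topics; a fresh map database is created on start.
roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start" \
    rgb_topic:=/camera/color/image_raw \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    camera_info_topic:=/camera/color/camera_info
```

Rviz (or RTAB-Map's own viewer) can then subscribe to the published map and odometry topics to visualize the 3D reconstruction in real time, as described in the paper.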
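To make the graph-based SLAM idea concrete, the toy example below shows pose-graph optimization on a 1D trajectory: odometry edges accumulate drift, and a single loop-closure edge back to the start pulls the estimates toward consistency. This is an illustrative sketch only (simple gradient descent on squared edge residuals), not RTAB-Map's actual optimizer; all names and numbers are hypothetical.

```python
# Toy 1D pose-graph optimization: poses linked by odometry edges plus
# one loop-closure edge, refined by gradient descent on the sum of
# squared residuals. Pose 0 is anchored at the origin.

def optimize(poses, edges, iters=500, lr=0.1):
    poses = poses[:]
    for _ in range(iters):
        grad = [0.0] * len(poses)
        for i, j, meas in edges:
            r = (poses[j] - poses[i]) - meas  # residual of edge i -> j
            grad[j] += 2 * r
            grad[i] -= 2 * r
        for k in range(1, len(poses)):        # skip the anchored pose 0
            poses[k] -= lr * grad[k]
    return poses

# Drifting odometry reports each step as 1.1 m, but a loop closure
# back to the start measures the true 3.0 m loop length.
initial = [0.0, 1.1, 2.2, 3.3]
edges = [(0, 1, 1.1), (1, 2, 1.1), (2, 3, 1.1), (0, 3, 3.0)]
refined = optimize(initial, edges)
```

After optimization the final pose is pulled from the drifted 3.3 m back toward the loop-closure measurement, with the correction distributed across all edges — the same mechanism, in higher dimensions, that lets loop closure detection correct accumulated odometry drift in the 3D map.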