Simultaneous Localization and Mapping Using Intel RealSense Camera

Received: January 22, 2024 / Revised: / Accepted: January 22, 2024 / Published: June 01, 2020


This paper presents Simultaneous Localization and Mapping (SLAM) for generating a 3D map of an environment. It takes a graph-based SLAM approach with loop closure detection, using an RGB-D (Red, Green, Blue and Depth) camera to map an unknown environment. In real-time applications, localization and mapping require both accuracy and robustness. Thus, in this paper, the lightweight Intel RealSense D435i RGB-D camera (with built-in IMU) is chosen as the sensor. The data from the Intel RealSense D435i camera is processed by a Jetson Nano, a single-board computer running the Robot Operating System (ROS), and passed to the RTAB-Map (Real-Time Appearance-Based Mapping) node in the ROS environment to perform SLAM. RTAB-Map uses the RGB data together with the depth and IMU information to build the 3D map of the environment. The RGB images, depth data, and IMU data are processed to extract key features such as the dense point cloud and the depth of RGB pixels. These features are matched across different scenes to estimate the motion of the camera, which allows the camera odometry to be computed and the 3D map of the surrounding environment to be constructed. The experiments in this paper were conducted in indoor and outdoor environments, and the results were compared to observe the difference in map quality between the two environments, using the RViz visualizer based on ROS to display the 3D map in real time.
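
For readers unfamiliar with the sensor interface, the sketch below illustrates how aligned RGB and depth frames might be read from the D435i using Intel's pyrealsense2 Python library. The stream settings and the per-frame loop are illustrative assumptions rather than the configuration used in this work, where the realsense2_camera ROS driver publishes the camera and IMU data as topics consumed by the RTAB-Map node.

```python
# Minimal sketch: grabbing aligned RGB and depth frames from a RealSense D435i
# with pyrealsense2. Resolutions, frame rates, and the processing hook are
# illustrative assumptions; in the paper's setup the realsense2_camera ROS
# driver publishes these streams as topics for the rtabmap_ros node.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)   # depth stream
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)  # RGB stream

profile = pipeline.start(config)
align = rs.align(rs.stream.color)  # align depth pixels to the color frame

try:
    for _ in range(300):  # ~10 s of frames at 30 fps
        frames = pipeline.wait_for_frames()
        aligned = align.process(frames)
        depth_frame = aligned.get_depth_frame()
        color_frame = aligned.get_color_frame()
        if not depth_frame or not color_frame:
            continue
        depth = np.asanyarray(depth_frame.get_data())  # 16-bit depth values
        color = np.asanyarray(color_frame.get_data())  # 8-bit BGR image
        # A SLAM front end would extract and match visual features here,
        # using the aligned depth values to recover the 3D position of each pixel.
finally:
    pipeline.stop()
```

In the actual pipeline described above, this frame acquisition happens inside the ROS driver on the Jetson Nano, and RTAB-Map subscribes to the published RGB, depth, and IMU topics to perform feature matching, odometry estimation, and loop closure detection.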