Autonomous Machine SLAM and YOLO with Intel RealSense cameras and RPLidar

The presented rover is based on Pioneer AT9 hardware (differential drive, with motor drivers controlled by a Teensy 4.0). It was upgraded for autonomous-systems experiments with Intel RealSense cameras: a D435 (stereo RGB-D, 30 FPS) and a T265 (tracking, streaming 6-DOF pose estimates at 200 Hz), plus an RPLidar 3 360-degree laser scanner (10 Hz).
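As a sketch of how the T265's pose stream can be consumed, assuming the `pyrealsense2` Python bindings (the quaternion-to-heading helper is my own illustration, not code from the video, and assumes a z-up convention; the T265's native frame is y-up):

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Heading (rotation about the vertical z axis) of a unit quaternion, in radians."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def stream_t265_pose():
    """Print 6-DOF poses from a T265 (requires pyrealsense2 and the camera attached)."""
    import pyrealsense2 as rs  # hardware-dependent import kept local
    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)
    pipe.start(cfg)
    try:
        while True:
            frames = pipe.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                d = pose.get_pose_data()
                yaw = quaternion_to_yaw(d.rotation.x, d.rotation.y,
                                        d.rotation.z, d.rotation.w)
                print(f"x={d.translation.x:.3f} y={d.translation.y:.3f} yaw={yaw:.3f} rad")
    finally:
        pipe.stop()
```

The heading helper is pure math, so it can be checked without the camera attached.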
All computation runs on an Intel NUC (i7, 64 GB RAM, integrated Intel graphics). A future upgrade would add a dedicated GPU.
SLAM (Simultaneous Localization and Mapping) is done with RTAB-Map in ROS. Object detection uses Darknet (the reference implementation of YOLOv3), trained on the COCO dataset. Planned additions include 3D visualization in ROS and RViz, and high-frequency estimation of object positions in space.
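A minimal sketch of running the Darknet YOLOv3 weights for detection, here through OpenCV's DNN module rather than the Darknet binary (the file names, input size, and confidence threshold are assumptions; the score-filtering helper is pure Python so it can be checked without the model files):

```python
def filter_detections(rows, conf_threshold=0.5):
    """Keep (class_id, score, box) rows whose score clears the threshold,
    sorted by descending score."""
    kept = [r for r in rows if r[1] >= conf_threshold]
    return sorted(kept, key=lambda r: r[1], reverse=True)

def detect_yolov3(image_bgr, cfg="yolov3.cfg", weights="yolov3.weights"):
    """Run YOLOv3 on one BGR frame via OpenCV DNN (requires cv2, numpy,
    and the COCO-trained weights on disk; paths above are placeholders)."""
    import cv2
    import numpy as np
    net = cv2.dnn.readNetFromDarknet(cfg, weights)
    blob = cv2.dnn.blobFromImage(image_bgr, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outs = net.forward(net.getUnconnectedOutLayersNames())
    h, w = image_bgr.shape[:2]
    rows = []
    for out in outs:
        for det in out:  # det = [cx, cy, bw, bh, objectness, class scores...]
            scores = det[5:]
            cls = int(np.argmax(scores))
            score = float(scores[cls])
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            rows.append((cls, score,
                         (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))))
    return filter_detections(rows)
```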
The rover is controlled via waypoints in RViz or manually with a Logitech F710 gamepad, and is capable of visual servoing for autonomous navigation without waypoints. Details on setting up the hardware and software for such an autonomous rover will follow in future videos.
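Waypoints can also be sent to the navigation stack programmatically rather than through RViz. The sketch below assumes a ROS 1 setup with `move_base` running (the action name and the `map` frame are assumptions about this rover's configuration); the yaw-to-quaternion helper is plain math that works anywhere:

```python
import math

def yaw_to_quaternion(yaw):
    """Unit quaternion (x, y, z, w) for a rotation of `yaw` radians about the z axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def send_waypoint(x, y, yaw):
    """Send one (x, y, yaw) goal in the 'map' frame to move_base
    (requires a running ROS 1 stack; imports kept local for that reason)."""
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    qx, qy, qz, qw = yaw_to_quaternion(yaw)
    goal.target_pose.pose.orientation.x = qx
    goal.target_pose.pose.orientation.y = qy
    goal.target_pose.pose.orientation.z = qz
    goal.target_pose.pose.orientation.w = qw

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()
```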

Comments: 4

  • @paulraj8216 (a year ago)

    Can you provide the SLAM implementation process on GitHub?

  • @justtestingonce (4 months ago)

    Like how many cameras do you need, crazy😂

  • @TULEYDIGITAL (a month ago)

    One camera handles SLAM all on its own hardware (T265). The other camera creates a point cloud via depth (D435i)

  • @usamazaheer3109 (7 months ago)

    Please provide the code.