In the summer of 2019, I interned at Inspired Automation Future Technologies, an Ahmedabad (India) based robotics startup that is working towards building India's first robotics museum at Science City, Ahmedabad. I was involved in two projects; the first was the development of a 1/10th-scale autonomous car.
RACECAR stands for Rapid Autonomous Complex-Environment Competing Ackermann-steering Robot. This project is a 1/10th-scale version of a real self-driving car. Moreover, its Ackermann steering, unlike the differential drive used in most small robots, makes it a realistic robotic platform to build upon. The whole robot runs on ROS (Robot Operating System) hosted on a Jetson TX2. Many autonomous-vehicle researchers use platforms like this to test their algorithms, especially since MIT open-sourced the original RACECAR project.
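The practical difference Ackermann steering makes shows up in the geometry: on a turn, the inner front wheel must steer at a sharper angle than the outer one so that both wheels trace circles about the same turning center. A minimal sketch of this relation (the wheelbase and track width below are placeholder values, not measurements from the car):

```python
import math

def ackermann_angles(wheelbase, track_width, turn_radius):
    """Ideal Ackermann steering: the inner wheel turns sharper than the
    outer wheel so both follow circles around the same turning center.
    turn_radius is measured to the center of the rear axle."""
    inner = math.atan(wheelbase / (turn_radius - track_width / 2))
    outer = math.atan(wheelbase / (turn_radius + track_width / 2))
    return inner, outer

# Hypothetical 1/10th-scale dimensions in meters -- not measured from the car.
inner, outer = ackermann_angles(wheelbase=0.32, track_width=0.20, turn_radius=1.0)
```

A differential-drive robot has no such constraint, which is exactly why an Ackermann platform is a more faithful stand-in for a full-size car.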
My task during the internship was to develop the platform from scratch and make it navigate autonomously in an indoor mapped environment. The vehicle will be used for demonstrations in the upcoming robotics museum. All the hardware mounts for the sensors were designed and 3D printed. This was my first experience with a Jetson development board. I have documented my work on the GitHub Wiki; the introductory content is included here, and a more detailed tutorial can be found on my GitHub page along with the codebase.
Here is the list of major components used in this project, along with their links. In addition, a number of 3D-printed parts were used to make mounts for the different sensors:
- Traxxas Ford Fiesta Rally Car (RC car): This 1/10th-scale model car was opened up and modified with 3D-printed parts for proper mounting of the sensors and the computation unit. A BLDC motor was ordered but has not yet been installed; in the future, a VESC controller will drive this motor directly from the Jetson TX2, with the major advantage that the VESC can report the current motor RPM as feedback. For now, the car's stock DC motor is used with an ESC that provides no feedback. The ESC for longitudinal velocity and the servo for steering control come with the car; their connections to the radio receiver were removed and rerouted to an Arduino Nano. The following guide by JetsonHacks is helpful for setting up the car: Jetson RACECAR
- Jetson TX2 (Onboard Computation Unit): The Jetson TX2 is used as the onboard computation device. The following page has the guidelines for installing all the necessary software on the Jetson TX2.
- RPLidar A2 (2D Lidar): This ROS-compatible 2D lidar provides a 360-degree laser scan of the surroundings and publishes data at 12 Hz. In the future, a faster lidar such as the Hokuyo UST-10LX can be used if the RACECAR needs to run autonomously at higher speeds.
- ZED Cam (Stereo Camera): All the vision work, including lane detection, is done with this camera. In the future, SLAM (Simultaneous Localization and Mapping) can be implemented to generate 3D maps of the environment.
- MPU 6050 IMU (6-Axis IMU): The MPU 6050 combines a 3-axis accelerometer and a 3-axis gyroscope and is used to provide the orientation of the robot. Currently, the data is processed by an Arduino Nano and then sent to the Jetson for sensor fusion. The data obtained is not very accurate, so it is strongly recommended to use a Razor 9DOF IMU instead, which can be integrated directly with ROS using the Razor IMU ROS package. An FTDI Basic Breakout Board 3.3V will be needed to connect that IMU over USB.
- Arduino Nano (Microcontroller): Currently, the Arduino Nano generates the PWM signals for the ESC and the servo motor, controlling longitudinal velocity and steering respectively. A second Nano processes the MPU-6050 data.
- USB Hub (7-port Hub): Since the Jetson TX2 has only one USB port, a USB hub is needed for connecting multiple devices and sensors.
- Rotary Encoder (Encoder): The optical rotary encoder provides the wheel odometry of the RACECAR. An Arduino Nano processes the encoder data and sends the velocity of the car to the Jetson.
- Li-Polymer Battery 8000 mAh (Battery): An 8000 mAh lithium-polymer battery powers the Jetson TX2 as well as the USB hub. This battery easily lasts more than 3 hours.
- Redgear Joystick (Joy Controller): The wireless joystick is used to drive the car manually and to toggle between manual and autonomous mode.
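The ESC and the steering servo are both driven with standard hobby-RC PWM pulses, typically around 1000–2000 µs with 1500 µs as neutral. A sketch of the normalized-command-to-pulse-width mapping the Arduino performs (the pulse limits here are the common RC convention, not values read from the actual firmware):

```python
def command_to_pulse_us(command, neutral_us=1500, span_us=500):
    """Map a normalized command in [-1, 1] to an RC PWM pulse width in
    microseconds: -1 -> 1000 us, 0 -> 1500 us (neutral), +1 -> 2000 us.
    The limits follow the usual RC-servo convention, not measured values."""
    command = max(-1.0, min(1.0, command))  # clamp out-of-range commands
    return int(round(neutral_us + command * span_us))
```

The same mapping serves both channels: for the ESC the command is throttle, for the servo it is steering angle.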
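For the MPU-6050, a common way to turn the raw readings into a usable tilt estimate is a complementary filter: the integrated gyro rate is smooth but drifts, while the accelerometer tilt estimate is noisy but drift-free, so the two are blended. A sketch of the idea (the blend factor and sensor values are illustrative, not those used on the car):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer tilt estimate (noisy but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulated: IMU held at a 10-degree tilt, gyro reporting a small bias.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=10.0, dt=0.01)
```

After a few seconds of simulated samples the estimate settles near the true 10-degree tilt despite the gyro bias, which is exactly the behavior raw gyro integration alone cannot give.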
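The encoder-to-velocity computation on the Nano amounts to counting ticks over a fixed interval and scaling by the wheel geometry. A sketch, with the ticks-per-revolution and wheel diameter as placeholder values rather than the actual specs:

```python
import math

def wheel_velocity(tick_delta, dt, ticks_per_rev=600, wheel_diameter=0.1):
    """Linear velocity in m/s from encoder ticks counted over interval dt.
    ticks_per_rev and wheel_diameter are placeholders, not the actual
    specs of the encoder or wheels on the car."""
    revolutions = tick_delta / ticks_per_rev
    return revolutions * math.pi * wheel_diameter / dt
```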
The whole software stack is based on the **Robot Operating System** (ROS) running on the Jetson TX2. Various packages have been used for integrating and processing the sensor data. A brief overview of all the packages used is given below:
- Rosserial: The rosserial_arduino package provides the interface between the Arduinos and the Jetson TX2. Data from the IMU and encoders is transferred to the Jetson, while commands for the ESC and steering are sent from the Jetson to the Nano.
- RPLidar_Hector_SLAM: This package contains two main sub-packages. The first, rplidar, publishes the data obtained from the RPLidar A2 on the scan topic. The second, hector_slam, generates a map of the environment without the help of odometry.
- Rf2o_Laser_Odometry: This package uses the scan data from the lidar to generate odometry. This odometry is not very accurate, so it cannot be used in its raw form.
- Robot_localization: The robot_localization package fuses data from multiple sensors. It uses an Extended Kalman Filter to produce filtered odometry from the lidar-based odometry and the encoder, and additionally takes orientation data from the IMU.
- Tf: It is necessary to define the static transforms between the base_link frame and the various sensors. Currently, the base_link-to-lidar and base_link-to-IMU transforms are published.
- Teleop: This package is used to control the car manually as well as toggle between the manual and the autonomous mode.
- Navigation: Two major packages are used. The first is AMCL (Adaptive Monte Carlo Localization), which uses a particle filter to localize the robot in a pre-mapped environment. The second is move_base, which incorporates a global planner and a local planner: given a goal in the world, it attempts to reach it with the mobile base.
- Vision: The Vision package contains the code for lane detection and tracking.
The work on vision was done by my co-intern.
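The fusion performed by robot_localization can be illustrated with a toy scalar Kalman update (the package itself runs a full multi-state EKF; this only shows the core idea): two noisy estimates of the same quantity are combined, each weighted by the inverse of its variance.

```python
def kalman_fuse(est, est_var, meas, meas_var):
    """One scalar Kalman update: combine a prior estimate with a new
    measurement, each weighted by the inverse of its variance."""
    gain = est_var / (est_var + meas_var)   # trust the measurement more
    fused = est + gain * (meas - est)       # when the prior is uncertain
    fused_var = (1 - gain) * est_var
    return fused, fused_var

# Noisy lidar-based velocity (variance 0.5) corrected by a more trusted
# encoder reading (variance 0.1) -- both variances are made up here.
v, v_var = kalman_fuse(est=1.0, est_var=0.5, meas=1.2, meas_var=0.1)
```

Note that the fused variance is smaller than either input variance, which is why fusing the weak rf2o odometry with the encoder and IMU still improves the overall estimate.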
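The static transforms published via tf are what let a point measured by the lidar be expressed in the base_link frame. In 2D this reduces to a rotation by the sensor's mounting yaw followed by a translation by its mounting offset; the offsets below are invented for illustration, not the car's real mounting values:

```python
import math

def lidar_to_base_link(x_l, y_l, tx=0.2, ty=0.0, yaw=0.0):
    """Express a point from the lidar frame in the base_link frame:
    rotate by the sensor's mounting yaw, then add its mounting offset.
    (tx, ty, yaw are invented mounting values, not the real ones.)"""
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x_l - s * y_l + tx, s * x_l + c * y_l + ty)
```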
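AMCL's particle filter can be sketched in one dimension: each particle is a pose hypothesis, it is weighted by how well it explains the current measurement, and the set is resampled in proportion to those weights. This toy version assumes a Gaussian measurement model and omits the motion-update step that AMCL performs between scans:

```python
import math
import random

def particle_filter_step(particles, measurement, noise_std=0.2):
    """One AMCL-style update in 1D: weight each pose hypothesis by how
    well it explains the measurement, then resample proportionally."""
    weights = [math.exp(-((p - measurement) ** 2) / (2 * noise_std ** 2))
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)  # deterministic for the example
particles = [i * 0.1 for i in range(100)]   # hypotheses spread over 0..9.9 m
particles = particle_filter_step(particles, measurement=5.0)
estimate = sum(particles) / len(particles)
```

After one update the surviving particles cluster around the measurement, and their mean serves as the pose estimate; repeating this with real scans against the hector_slam map is what concentrates the particle cloud on the car's true pose.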