Deploying mmWave distance sensing on the Jackal UGV for more accurate navigation in hazardous environments
Abstract
The development of autonomous machines has been revolutionary in minimizing human risk in defense and rescue. Most Unmanned Ground Vehicles (UGVs) currently use vision systems such as LiDAR or stereo cameras. However, these systems are easily compromised in low-light and low-visibility conditions such as fog, smoke, and rain. This research explores millimeter wave (mmWave) radar as an alternative sensing system due to its resilience to such occlusions. We present a proof-of-concept mmWave radar-based navigation system on the Jackal UGV. In this approach, the robot eliminates intermediary visualization steps, directly processing the raw data captured by the radar into information used for movement. Interfacing both the Jackal UGV and the mmWave radar through the Robot Operating System (ROS) enables real-time autonomous and semi-autonomous control. The robot proved functional in semi-autonomous operation using goal-based Simultaneous Localization and Mapping (SLAM) driven through the RViz software. Point clouds and range-angle diagrams generated from the mmWave radar data provide high-accuracy environment mapping. Combining mmWave radars with existing UGVs in this way allows them to function in most environments with minimal loss of accuracy compared with other sensing methods.
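To illustrate the direct radar-to-motion pipeline the abstract describes, below is a minimal ROS sketch, not the paper's actual implementation. It assumes the radar driver publishes a sensor_msgs/PointCloud2 on /ti_mmwave/radar_scan_pcl (the topic used by TI's mmWave ROS package; the paper's topic name may differ) and that the Jackal accepts geometry_msgs/Twist commands on /cmd_vel, which is its standard interface. The distance thresholds and speeds are illustrative placeholders.

#!/usr/bin/env python
# Minimal sketch: reactive navigation directly from mmWave radar point clouds
# on a Jackal UGV. Topic names and thresholds are assumptions, not taken from
# the paper.
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2
from geometry_msgs.msg import Twist

STOP_DIST = 0.8      # meters; turn away if a return is closer than this
CRUISE_SPEED = 0.3   # m/s forward speed in open space
TURN_RATE = 0.5      # rad/s in-place turn when blocked

def radar_callback(cloud):
    # Find the nearest radar return roughly ahead of the robot (|y| < 0.5 m).
    nearest = float('inf')
    for x, y, z in pc2.read_points(cloud, field_names=('x', 'y', 'z'),
                                   skip_nans=True):
        if x > 0.0 and abs(y) < 0.5:
            nearest = min(nearest, x)
    cmd = Twist()
    if nearest < STOP_DIST:
        cmd.angular.z = TURN_RATE   # obstacle ahead: stop and turn in place
    else:
        cmd.linear.x = CRUISE_SPEED # path clear: drive forward
    cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('mmwave_reactive_nav')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/ti_mmwave/radar_scan_pcl', PointCloud2, radar_callback)
    rospy.spin()

Because the radar points are consumed as-is, no occupancy grid or rendering step sits between sensing and actuation; a full SLAM pipeline, as used in the paper's goal-based navigation, would instead feed these clouds to a mapping node.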
License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.