Intelligent Navigation of Autonomous Maritime Robots (Work in Progress)


Supervisors: Prof. Katherine A. Skinner and Dr. Elena Shrestha



Brief: While reinforcement learning offers the potential for continual learning and adaptability in complex scenarios, its application to real-world robotics faces significant challenges. Unlike in simulation, physical platforms struggle to collect a diverse corpus of training data due to critical safety risks and the inherent constraints of operating in a dynamic, partially observable environment. Our work draws inspiration from the human capability to fuse and exploit multiple sensing modalities, construct comprehensive models of how the world operates, and then leverage those models to navigate challenging and often unpredictable environments. This project provides an overview of how unmanned vehicles (ground, air, and surface) can exploit a world model constructed through multimodal perception to learn near-optimal policies for guidance and control. A key aspect of the approach is learning from imagination, in which the world model simulates imagined future trajectories, enabling the vehicle to anticipate potential risks before encountering them in the real world. Our ongoing work and long-term vision is to evolve the traditional sense-plan-act framework into a more intuitive, cognitively inspired sense-imagine-act model. See Dr. Elena Shrestha's presentation of our research.

Role: Graduate Research Assistant, Field Robotics Group

Contribution:

- Built the URDF model for Clearpath Robotics' Heron Unmanned Surface Vehicle (USV) and developed a comprehensive "world" model of the University of Michigan's Marine Hydrodynamics Lab using ROS2 and Gazebo Garden. This setup enabled thorough testing of autonomous control systems and supported reinforcement learning research, with an emphasis on system dynamics and mechanical response in marine environments.
- Developed and implemented an object avoidance algorithm using Python and C++, integrating it with the mechanical control systems of the Heron USV. The algorithm’s performance was rigorously tested in both real-world and simulated environments, where I refined tuning parameters to enhance accuracy and robustness.
- Prepared the Heron USV for real-world testing, including setting up the battery, electrical systems, and mechanical components. I customized and 3D-printed parts to accommodate sensors and connected LIDAR, camera, IMU, and odometry sensors to the robot via ROS, enabling smooth data acquisition and sensor integration.
- Engineered a solution to fuse sensor data by aligning the laser /scan frame with the /odometry/filtered transform frame, compensating for the absence of the /tf topic across multiple runs. This ensured accurate transformations between key frames (base_link, velodyne_base_link, and velodyne) and correct timestamp synchronization, which was crucial for running SLAM with the hector_slam package and analyzing trajectories in RViz.
- Analyzed autonomous trajectories by extracting and plotting data from LIDAR scans, evaluating the vehicle’s navigation performance. I assessed its ability to maintain stable motion during “straightline” tests and to successfully maneuver through obstacles in “2 buoys” tests under varying wave conditions, focusing on the mechanical and control aspects of the system.
- Led real-world testing to gather critical sensor data, including LIDAR, camera, odometry, IMU, and velocity measurements, focusing on the mechanical response and navigation control of the Heron USV. I replicated this data in simulation environments, utilizing advanced post-processing techniques in Python to evaluate and optimize system performance. Additionally, I used the collected data to train a reinforcement learning model, both offline and in simulation, to improve the USV’s autonomous decision-making capabilities.
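The core idea behind the object avoidance algorithm described above can be sketched in pure Python, without ROS dependencies. This is a minimal reactive scheme: each nearby LIDAR return pushes the commanded yaw rate away from the obstacle, with closer returns pushing harder. The function name, gain, and safety distance here are illustrative assumptions, not the tuned parameters used on the Heron.

```python
import math

def avoidance_yaw_rate(ranges, angle_min, angle_increment,
                       safe_dist=2.0, gain=0.8):
    """Compute a yaw-rate command that steers away from nearby obstacles.

    ranges: LIDAR range readings in meters
    angle_min / angle_increment: beam geometry, as in a ROS LaserScan
    safe_dist, gain: illustrative values; the real system tunes these
    """
    yaw_rate = 0.0
    for i, r in enumerate(ranges):
        if not (0.0 < r < safe_dist):
            continue  # ignore invalid or distant returns
        angle = angle_min + i * angle_increment
        # An obstacle to port (positive angle) pushes the heading to
        # starboard (negative yaw rate) and vice versa; the repulsion
        # scales linearly as the obstacle gets closer.
        yaw_rate -= gain * math.sin(angle) * (safe_dist - r) / safe_dist
    return yaw_rate
```

A single return at +45° and 1 m produces a negative (starboard) yaw command, while returns beyond `safe_dist` contribute nothing.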
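The timestamp-synchronization step behind the sensor-fusion work can also be illustrated without ROS: pair each laser scan with the nearest odometry sample in time and discard scans with no sufficiently close match, which mirrors keeping frames time-consistent when the /tf topic is unavailable. The function and the tolerance value are illustrative, not the exact pipeline used on the vehicle.

```python
import bisect

def pair_scans_with_odometry(scan_stamps, odom_stamps, tol=0.05):
    """Match each scan timestamp to the nearest odometry timestamp.

    Both inputs are sorted lists of times in seconds. Returns a list of
    (scan_index, odom_index) pairs; scans with no odometry sample within
    `tol` seconds (an illustrative tolerance) are dropped.
    """
    pairs = []
    for i, t in enumerate(scan_stamps):
        j = bisect.bisect_left(odom_stamps, t)
        # Candidate neighbors: the sample at/after t and the one before it.
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(odom_stamps):
                if best is None or abs(odom_stamps[k] - t) < abs(odom_stamps[best] - t):
                    best = k
        if best is not None and abs(odom_stamps[best] - t) <= tol:
            pairs.append((i, best))
    return pairs
```

With scans at 0.00 s, 0.10 s, and 0.30 s against odometry at 0.01 s, 0.12 s, and 0.50 s, the first two scans pair up and the third is dropped as too far from any odometry sample.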
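One way to quantify the "straightline" tests mentioned above is the RMS cross-track error: the perpendicular deviation of the estimated trajectory from the commanded line of travel. This metric is a plausible sketch of the kind of analysis run on the SLAM output, not the exact evaluation code.

```python
import math

def rms_cross_track_error(xs, ys, x0, y0, heading):
    """RMS perpendicular deviation of a trajectory from a commanded line.

    The line is defined by a start point (x0, y0) and a heading in
    radians; xs, ys are estimated positions (e.g., SLAM trajectory output).
    """
    # Unit normal to the commanded direction of travel.
    nx, ny = -math.sin(heading), math.cos(heading)
    # Signed perpendicular distance of each pose from the line.
    errs = [((x - x0) * nx + (y - y0) * ny) for x, y in zip(xs, ys)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

For a run along the x-axis (heading 0 from the origin), the error reduces to the RMS of the y-coordinates, so a perfectly straight run scores zero.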

Simulation: Autonomous Object Avoidance

Real-world test: Autonomous Object Avoidance

[GitHub][Publication]

Skills: ROS, ROS2, Simulation (Rviz, Gazebo), Sensor Integration (LIDAR, Camera, IMU), SLAM, Machine Learning (PyTorch, TensorFlow), Control Systems (PID), Computer Vision (OpenCV), Python, C++, Linux, Bash/Shell Scripting, Git, Debugger, Microcontroller, PWM, SolidWorks