RobotX Challenge 2022
Thank you to our team sponsors: RoboNation, Beck Foundation, Gibbs & Cox, US Government, Department of Electrical and Computer Engineering
The goal of this project is to develop an autonomous system to complete the 2022 RobotX competition objectives employing a wave adaptive modular vessel (WAM-V) guided by a sensing suite and assisted by an unmanned aerial vehicle (UAV). This project is a collaboration across the mechanical engineering, electrical and computer engineering, and computer science departments and includes three teams of seniors and juniors. Team Athena is leading the tasks associated with sensing and object detection, Team Hermes is responsible for the development of the UAV, and Team Poseidon is overseeing the WAM-V. The team was organized around capstone design projects during the 2021-2022 academic year, with a number of students and alumni continuing to support the project during the summer of 2022 in the lead-up to the November 2022 competition.
Team Athena—Sensing System
The primary function of the sensing system is to detect objects and obstacles during the competition using a combination of sensors, including LiDAR, proximity sensors, conventional cameras, and a hydrophone. Furthermore, the team is tasked with developing a communication network between the UAV, WAM-V, Team Computer, and the Judges Table to coordinate data transfer and ensure all parts of the two-vehicle system are working safely and effectively.
Team Lead: Reginald Lockhart
Team Members: Roosevelt Vidal Silva, Gopal Uprety (Electrical Engineering), Tahmid Wasif (Computer Engineering), John Kliem (Computer Science)
Faculty Advisors: Leigh McCue, Greg Stein (Computer Science), Jana Košecká (Computer Science)
Acknowledgements: William “Aldo” K., US Government; Dave Edwards
Team Athena is honored to take part in the annual RobotX Competition. This competition, which will be held in Australia, will pit teams from across the world against each other to design and test unmanned, dual-vehicle systems. Our team’s responsibility will focus on object and obstacle detection, as well as communication for data and telemetry. Using a suite of different sensors, the system will complete the tasks set forth by RoboNation and be capable of acting autonomously during the competition.
LiDAR
In this project the purpose of the LiDAR is to map the environment and find the distance between the WAM-V and obstacles. The sensor works by emitting pulses of non-visible light and recording the time each pulse takes to reflect back to the sensor, which gives both the direction of and the distance to an object.
RoboNation provided the team with a Velodyne VLP-16 LiDAR, a reliable, power-efficient LiDAR designed for low-speed autonomy applications such as those required for this competition. The LiDAR will be mounted high on the WAM-V platform so that its view is not obstructed.
Camera Systems
The camera system is designed to classify obstacles and detect the color sequences used for entering and exiting gates during the competition. An onboard controller processes the images and shares the results between the sensing system and the navigation team through a shared framework, the Robot Operating System (ROS).
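As a rough illustration of the color-sequence step, the sketch below reads camera frames with OpenCV and records which of three colors dominates each frame. The HSV thresholds, camera index, and three-color sequence length are illustrative assumptions, not the team’s actual calibration.

```python
# Hedged sketch: HSV thresholding to read a red/green/blue light sequence.
# Threshold ranges, camera index, and sequence length are assumptions.
import cv2
import numpy as np

# Approximate HSV ranges for the three tower colors (assumed values).
COLOR_RANGES = {
    "red":   (np.array([0, 120, 120]), np.array([10, 255, 255])),
    "green": (np.array([45, 80, 80]),  np.array([85, 255, 255])),
    "blue":  (np.array([100, 80, 80]), np.array([130, 255, 255])),
}

def dominant_color(frame, min_pixels=500):
    """Return the color with the largest mask area, or None if nothing stands out."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    best, best_count = None, min_pixels
    for name, (lo, hi) in COLOR_RANGES.items():
        count = int(cv2.countNonZero(cv2.inRange(hsv, lo, hi)))
        if count > best_count:
            best, best_count = name, count
    return best

cap = cv2.VideoCapture(0)      # camera index is an assumption
sequence = []
while len(sequence) < 3:       # assume a three-color sequence
    ok, frame = cap.read()
    if not ok:
        break
    color = dominant_color(frame)
    if color and (not sequence or sequence[-1] != color):
        sequence.append(color)
print("Detected sequence:", sequence)
```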
Proximity Sensors
The ultrasonic proximity sensor will be used as a secondary form of object detection to assist at close range, less than 20 feet. In the event the LiDAR sensor fails, the ultrasonic proximity sensor will take over the role of providing the critical data needed to return the WAM-V to safety. These proximity sensors work in a similar way to the LiDAR, except they use sound waves rather than light waves to determine distance.
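Both sensors rely on the same time-of-flight arithmetic: a pulse goes out, reflects, and comes back, and half the round-trip travel distance is the range. A quick sketch of that calculation, with made-up echo times for illustration:

```python
# Time-of-flight range calculation shared by the LiDAR and ultrasonic sensors.
# The example echo times below are illustrative only.
SPEED_OF_LIGHT = 299_792_458.0   # m/s, used by the LiDAR
SPEED_OF_SOUND = 343.0           # m/s in air at about 20 C, used by the ultrasonic sensor

def range_from_echo(round_trip_time_s, wave_speed):
    """Distance to the target: the pulse travels out and back, so divide by 2."""
    return wave_speed * round_trip_time_s / 2.0

# A LiDAR return after 66.7 ns corresponds to roughly 10 m.
print(range_from_echo(66.7e-9, SPEED_OF_LIGHT))   # ~10.0 m
# An ultrasonic echo after 35 ms corresponds to roughly 6 m (about 20 ft).
print(range_from_echo(35e-3, SPEED_OF_SOUND))     # ~6.0 m
```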
Mapping
The raw LiDAR data are transformed into 2-D laser scan data, from which the horizontal distance between the WAM-V and obstacles is calculated. A 2-D map of the environment, along with bounding boxes for the classified objects and obstacles, is created using a Simultaneous Localization and Mapping (SLAM) algorithm combined with object detection in the ROS environment.
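A minimal ROS sketch of the laser-scan step is shown below: it subscribes to a 2-D scan and reports the range and bearing of the nearest return. The /scan topic and node name are assumptions, and the real pipeline feeds this data into SLAM rather than simply logging it.

```python
#!/usr/bin/env python
# Hedged sketch: read 2-D laser scan data and report the range and bearing of
# the nearest obstacle. Topic name and node name are assumptions.
import math
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    # Keep only finite returns within the sensor's valid range.
    valid = [(r, scan.angle_min + i * scan.angle_increment)
             for i, r in enumerate(scan.ranges)
             if math.isfinite(r) and scan.range_min < r < scan.range_max]
    if not valid:
        return
    rng, bearing = min(valid)
    rospy.loginfo("Nearest obstacle: %.2f m at %.1f deg", rng, math.degrees(bearing))

rospy.init_node("nearest_obstacle_monitor")
rospy.Subscriber("/scan", LaserScan, on_scan)
rospy.spin()
```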
Team Hermes—UAV
The primary functions of the UAV are to provide navigational support to the WAM-V, collect hyperspectral images to identify “animals” for the boat, and perform object retrieval. The UAV will be adapted into an autonomous system with the addition of a companion computer and will be outfitted for resilience in a maritime environment.
Team Lead: Yasmin Alamin
Team Members: Nick Denton, Danial Hernandez, Dahway Lin, Aaron Logan (Electrical and Computer Engineering), Emina Sinaovic (Computer Science)
Faculty Advisors: Daigo Shishika, Cameron Nowzari (Electrical and Computer Engineering)
Acknowledgements: Dave Edwards
Team Hermes elected to modify a commercial unmanned aerial vehicle (UAV) with a large payload capacity, long flight time, and open-source software dedicated to autonomous operation. These capabilities were all found in the UAVSI Tarot 650, which has a 1.5 kilogram payload capacity, up to 25 minutes of flight time, and Ardupilot compatibility. After purchasing the UAV, the team waterproofed the electronics and equipped it with a flotation device, multiple cameras, and an electromagnet for package delivery and retrieval. One of these cameras is a hyperspectral camera, which enables the UAV to analyze an object’s spectral signature. Sitting in the “pilot” seat is the Jetson Nano, a powerful single-board computer that makes autonomy possible.
Development and Fabrication
The UAV was waterproofed with watertight enclosures for sensitive electronics; devices were additionally treated with silicone gel. The flotation system operates with a CO2 canister device that inflates a balloon on water contact. Device mounts were 3-D printed, which allowed for rapid prototyping along with lightweight designs; with an electric quadcopter, ounces are pounds! Software development for the UAV consisted of leveraging open-source resources to consolidate the programs needed. The UAV flight controller operates on Ardupilot software, and the Jetson Nano utilizes the Robot Operating System, OpenCV, and other libraries. Altogether, this software suite can provide autonomy and object recognition for our UAV in a compact Python program.
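To give a sense of how a companion computer can command an ArduPilot flight controller, here is a minimal sketch using pymavlink. The connection string and target altitude are placeholder assumptions, and the team’s actual scripts work through MAVROS within ROS rather than this bare-bones link.

```python
# Hedged sketch: companion-computer-to-flight-controller link over MAVLink
# using pymavlink. Connection string and altitude are assumptions; the team's
# actual stack goes through MAVROS/ROS.
from pymavlink import mavutil

# Serial or UDP link from the Jetson Nano to the ArduPilot flight controller.
master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()
print("Heartbeat from system %d" % master.target_system)

# Switch to GUIDED mode, arm, and request a takeoff to 5 m.
master.set_mode("GUIDED")
master.arducopter_arm()
master.motors_armed_wait()
master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0,
    0, 0, 0, 0, 0, 0, 5.0)   # final parameter: target altitude in meters
```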
Testing
The first phase of testing gathered flight time data and examined the maneuverability of the UAV; to reduce the chance of equipment damage, the payload was replaced with lead weights. This phase revealed that the UAV, fully loaded, could remain airborne for approximately 20 minutes with a 15 percent battery margin for safety. Additionally, the UAV remained stable and effectively compensated for wind drift. Flotation was tested with a spare frame and weights; the device comfortably floated the test frame in water. Finally, a telemetry connection was established between the onboard flight controller and the Jetson Nano, enabling remote control of the UAV through the MAVROS command line.
Future Work
With the competition six months away, Team Hermes has fine-tuning objectives to complete. The most important aspect is autonomy; the team will create increasingly complex scripts for the UAV, ranging from simple “search and rescue” flight patterns to a competition-level program with full integration into the WAM-V and land systems. More intensive flight testing will also examine the resiliency of hardware mounts, as well as the capabilities of the equipment attached to the UAV.
Team Poseidon—WAM-V
The primary functions of the boat team are to develop the propulsion and navigation systems. Additionally, this team is responsible for the integration, power management, and mass balance of all teams’ components.
Team Lead: Jafar Hussainy
Team Members: Aditya Pulipaka, Jacob Sweeney, Enea Didi (Computer Engineering)
Faculty Advisors: Leigh McCue, Erion Plaku (Computer Science)
Acknowledgements: Dave Edwards, Johnnie Hall, IV
Team Poseidon was tasked with developing the propulsion and navigation systems that allow the boat to navigate the water autonomously, efficiently, and safely in order to complete the competition tasks.
Design Concept
Propulsion was designed using differential thrust from a pair of trolling motors attached to the stern at a fixed angle. The motors are powered by a pair of 24-volt waterproof batteries. The mounts are made from welded aluminum and bolted to the buoyancy pods on the wave adaptive modular vessel (WAM-V), where the motors can be attached and removed easily.
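Because the motors are fixed in place, all steering comes from running them at different thrusts. The sketch below shows the kind of mixing involved; ArduRover performs this onboard, and the scaling, sign convention, and limits here are illustrative assumptions only.

```python
# Hedged sketch of differential-thrust mixing for two fixed trolling motors.
# ArduRover does this mixing onboard; the clamping and sign convention here
# are illustrative assumptions.
def mix_differential_thrust(throttle, steering):
    """Map forward throttle and steering commands (each in [-1, 1])
    to left/right motor outputs in [-1, 1]."""
    left = throttle + steering
    right = throttle - steering
    # Scale both outputs together if either exceeds full throttle,
    # so the commanded turn is preserved.
    peak = max(1.0, abs(left), abs(right))
    return left / peak, right / peak

# (1.0, 0.33): more thrust on the left motor turns the vessel to starboard
# under this assumed sign convention.
print(mix_differential_thrust(0.8, 0.4))
```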
For transportation of the WAM-V, we were given a small boat trailer to modify. After developing and evaluating three designs, a fully assembled open-trailer design was chosen based on size, safety, and ease of transport. The trailer was fitted with marine-grade bunk boards for the inflatable pontoons to sit on, which can be secured using ratchet straps. A wooden panel is also placed at the center of the trailer to hold the aluminum pods, motors, and any other propulsion accessories.
The batteries, being the heaviest component, were strapped into a battery box screwed into each pontoon, keeping the platform free for the other sub-teams’ components and the center of mass as low as possible. The platform was fitted with a t-slot superstructure to support modular attachment for the other sub-teams’ drone platform and sensors.
Navigation programming was developed using a Pixhawk 4 flight controller with ArduRover firmware. GPS, LiDAR, and accelerometer data are used to navigate between waypoints with our differential-thrust motor layout. Navigation to the waypoints is done using Dijkstra’s shortest-path algorithm, while obstacles are avoided using a variation of the “bendy ruler” algorithm. Custom Lua scripts are used for other RobotX tasks that require specific avoidance patterns.
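For readers unfamiliar with the shortest-path step, here is a compact sketch of Dijkstra’s algorithm over a small waypoint graph. The waypoint names and distances are made up for illustration; the actual planning and bendy-ruler-style avoidance run inside ArduRover.

```python
# Hedged sketch: Dijkstra's shortest path over a small waypoint graph.
# The graph, node names, and edge costs are made up for illustration.
import heapq

def dijkstra(graph, start, goal):
    """Return (cost, path) for the cheapest route from start to goal."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Distances in meters between hypothetical course waypoints.
waypoints = {
    "dock":       {"gate_1": 40.0, "gate_2": 55.0},
    "gate_1":     {"buoy_field": 30.0},
    "gate_2":     {"buoy_field": 20.0},
    "buoy_field": {"finish": 25.0},
}
# (95.0, ['dock', 'gate_1', 'buoy_field', 'finish'])
print(dijkstra(waypoints, "dock", "finish"))
```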
Testing and Fabrication
Cutting, welding, and drilling of the aluminum plates and tubes were completed with the help of Johnnie Hall. The WAM-V’s engine pods had to be modified with a thick aluminum tube for secure attachment of our Newport trolling motors. Eight-foot-long aluminum square tubes were bolted onto the old boat trailer to support the weight and width of the WAM-V. Long, carpet-covered 2x4 wood planks were used to avoid damaging the inflatable rubber pontoons. The Gazebo simulation software was used to create an environment for testing the navigation code on a virtual pontoon boat under the stresses of simulated wind and waves.
Future Work Recommendations
The team suggests developing systems with trolling motors that can rotate, as well as propulsion systems with multiple motors, for greater maneuverability. This year was Mason’s first year participating, so value was placed on function rather than on the most optimal or elegant solutions.
Interested in learning more about the team? Check out this episode of The Mason Mechanical Engineer podcast from Capstone Day 2022, which features interviews with team members. Or get to know the travel team, with our team video embedded below!
Other posts to this website about the Mason RobotX team can be found by searching for the tag RobotX. To contact Mason RobotX, please email robotx@gmu.edu.