Show simple item record

dc.contributor.authorAndigani, Razan
dc.date.accessioned2021-08-25T11:23:02Z
dc.date.available2021-08-25T11:23:02Z
dc.date.issued2021-08-19
dc.identifier.urihttp://hdl.handle.net/10754/670774
dc.description.abstractOver 70% of the Earth's surface is covered by oceans, yet less than 5% has been explored because the marine environment is dangerous and hard to access. Remotely operated vehicles (ROVs) allow marine scientists to explore the ocean without having to enter it. The goal of this project is to enable the robot to localize itself within its environment, map its surroundings, and follow a desired path (trajectory planning).
Vision System: To process images and record captures:
1. An image of Ubuntu MATE 18.04 was flashed to the ROV's Raspberry Pi 3.
2. An Ethernet connection between the topside computer (Ubuntu 18.04) and the Raspberry Pi was established.
3. ROS and the usb_cam package were installed.
TagSLAM: To map the robot's surrounding environment and determine its position:
1. The tagslam_root packages were installed.
2. Extrinsic camera calibration was performed.
3. The april_tag_detector and tagslam nodes were launched.
4. AprilTags of tag family 36h11 were printed.
5. Odometry and camera images were published into TagSLAM.
ROV Thrusters: To control the robot underwater and test TagSLAM:
1. A connection between the autopilot (Pixhawk) and the companion computer (Raspberry Pi 3) was established.
2. MAVROS was installed and run on the companion computer.
3. The vehicle's mode was set to MANUAL and the failsafes in QGC were disabled.
4. Parameters were sent to actuate the thrusters using the OverrideRCIn topic.
Pose Estimation: To get the distance between the camera frame and an AprilTag:
1. A Python script that reads the sensor data was written.
2. The node was run to output the ROV's position and orientation in x, y, z.
Results: To install ROS on the companion computer, different versions of Ubuntu were flashed to the Pi; Ubuntu MATE 18.04 was found to be the most compatible. The CSI camera on the ROV was replaced with a fisheye USB camera, which was successfully calibrated. Captured images and recorded videos are saved in a rosbag for later use. AprilTags were detected by the ROV and mapped in RViz as a body_rig. Localization was achieved using the camera's perception frame view. The ROV's six thrusters were successfully controlled using MAVROS nodes and topics, and the ROV is able to move and dive beneath the water surface.
Future Work: Implement a PID controller for the ROV to keep a desired distance and overcome the pose offset error; apply a fractional-order control algorithm to run the robot more robustly; and automate the ROV to perform trajectory planning.
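The two sketches below are illustrative only and are not part of the deposited poster; any topic name, node name, or channel index not stated in the abstract is an assumption.

A minimal sketch of the thruster-actuation step over MAVROS, assuming MAVROS is connected to the Pixhawk, the vehicle is armed in MANUAL mode, and the QGC failsafes are disabled as described above; the channel index used for forward motion is an assumed example:

#!/usr/bin/env python
# Illustrative sketch (not the poster's actual script): refreshes RC override values
# so the autopilot drives the thrusters from the companion computer.
import rospy
from mavros_msgs.msg import OverrideRCIn

def main():
    rospy.init_node('thruster_override_sketch')
    pub = rospy.Publisher('/mavros/rc/override', OverrideRCIn, queue_size=10)
    rate = rospy.Rate(10)  # overrides must be republished continuously
    while not rospy.is_shutdown():
        msg = OverrideRCIn()
        # 1500 us is neutral; 1600 us is a gentle positive command.
        # The channel array length (8 or 18) depends on the MAVROS version,
        # and the forward-motion index depends on the vehicle's channel mapping.
        msg.channels = [1500] * len(msg.channels)
        msg.channels[4] = 1600  # assumed forward channel
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    main()

A minimal sketch of the pose-reading node, assuming TagSLAM publishes the body odometry for a body named "rig" on /tagslam/odom/body_rig as nav_msgs/Odometry; it prints the ROV's position and orientation:

#!/usr/bin/env python
# Illustrative sketch: subscribes to the TagSLAM body odometry and logs the pose.
import rospy
from nav_msgs.msg import Odometry

def callback(msg):
    p = msg.pose.pose.position
    q = msg.pose.pose.orientation
    rospy.loginfo('position x=%.3f y=%.3f z=%.3f  orientation x=%.3f y=%.3f z=%.3f w=%.3f',
                  p.x, p.y, p.z, q.x, q.y, q.z, q.w)

def main():
    rospy.init_node('pose_reader_sketch')
    rospy.Subscriber('/tagslam/odom/body_rig', Odometry, callback)
    rospy.spin()

if __name__ == '__main__':
    main()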
dc.relation.urlhttps://epostersonline.com//ssi2021/node/39
dc.titleSimultaneous Multi-Camera Localization and Mapping with AprilTags (TagSLAM) and Trajectory Planning for Underwater ROV
dc.typePoster
dc.conference.dateAugust 19, 2021
dc.conference.nameSaudi Summer Internship Program (SSI) 2021
dc.conference.locationVirtual Event
refterms.dateFOA2021-08-25T11:23:02Z


Files in this item

Name: ssi2021.00c0017.NORMAL.pdf
Size: 1.083 MB
Format: PDF
