Demand for autonomous landing of aerial vehicles is increasing. Applications of this ability range from drone delivery to unmanned military missions. The ability to land at a spot identified by local information, such as a visual marker, provides an efficient and versatile solution and makes the device more user/consumer friendly overall. To achieve this goal, the use of computer vision and an array of ranging sensors is explored. In our approach, we utilized an AprilTag as our location identifier and point of reference. The MATLAB/Simulink interface was used to develop the platform environment.
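The core idea of marker-guided landing can be sketched as a proportional controller that converts the detected marker's pixel offset from the image center into horizontal velocity commands while descending. The following is a minimal illustrative sketch only; the function name, gains, and camera geometry are assumptions, not the controller from the paper:

```python
# Hypothetical sketch: convert an AprilTag's detected pixel position
# into horizontal velocity commands for a descending vehicle.
# The gain kp and descend_rate are illustrative assumptions,
# not parameters from the paper.

def landing_velocity_command(tag_center_px, image_size_px,
                             kp=0.4, descend_rate=0.3):
    """Proportional controller: steer toward the tag while descending.

    tag_center_px: (u, v) pixel coordinates of the detected tag center.
    image_size_px: (width, height) of the camera image.
    Returns (vx, vy, vz) normalized velocity commands.
    """
    u, v = tag_center_px
    w, h = image_size_px
    # Normalized offset of the tag from the image center, in [-1, 1].
    ex = (u - w / 2) / (w / 2)
    ey = (v - h / 2) / (h / 2)
    # Proportional horizontal correction; constant descent while tracking.
    vx = kp * ex
    vy = kp * ey
    vz = -descend_rate
    return vx, vy, vz

# Example: tag detected slightly right of center in a 640x480 frame.
vx, vy, vz = landing_velocity_command((400, 240), (640, 480))
```

In practice, the tag center would come from an AprilTag detector running on the camera feed, and the commands would feed the vehicle's velocity controller.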
"Walmart has since upgraded its experimentation to delivering COVID-19 tests in the area around its store location in North Las Vegas"--p.1.
Bitencourt, G., Brown, E. J., Bleimling, C., Lai, G., Molki, A., & Kaya, T. (2021, May). Autonomous aerial vehicle vision and sensor guided landing. IEEE International Conference on Electro/Information Technology (EIT), Central Michigan University, Mount Pleasant, MI.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Artificial Intelligence and Robotics Commons, Computer Engineering Commons, Navigation, Guidance, Control, and Dynamics Commons
This research was a collaboration between SHU Engineering and Quanser on a drone system. It is the first international conference paper by undergraduate students Bitencourt and Brown.
This project was also presented at the 2021 Academic Festival, where it received an Honorable Mention for the Dean's Prize: Welch College of Business and Technology.