AI Robotic Arm with Hand Tracking
Mentor/s
Tolga Kaya
Participation Type
Poster
Abstract
This project aims to build a robotic arm controlled by artificial intelligence. Implementation involved three parts: building a 3D-printed forearm; writing a Python program that uses OpenCV and MediaPipe to read webcam footage and send the Arduino Uno a command sequence of five binary values, where '1' indicates a finger was detected as open and '0' that it was closed; and writing a C++ program that reads those binary values from the Python program and drives each finger motor to open or close the corresponding finger. Although the arm is built, its webcam turns off at random, preventing the arm from continuously taking in data; when this happens, the Python program must be restarted manually. There are also limitations in the hand gestures the arm can recreate: the fingers cannot cross over each other or move horizontally, the thumb must stay close to the palm rather than in its natural position, and the wrist does not rotate. Current outcomes include running the system for a few minutes without a webcam interruption. To address the restricted movement of each finger, a more articulated version of the hand is being designed. Overall, this project seeks to provide a comprehensive understanding of how to build an artificial-intelligence-driven arm with OpenCV and MediaPipe.
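The detection-and-command pipeline described above can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not the project's actual source code: the serial port name ('COM3'), baud rate (9600), newline terminator, and the fingertip-versus-joint landmark comparison used to decide open ('1') versus closed ('0') are hypothetical choices; only the overall flow (OpenCV webcam capture, MediaPipe hand landmarks, a five-value binary command sent to the Arduino Uno) comes from the abstract.

# Minimal sketch: webcam -> MediaPipe hand landmarks ->
# five binary finger states -> serial command to the Arduino Uno.
# Port name, baud rate, and the open/closed heuristic are assumptions.
import cv2
import mediapipe as mp
import serial

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
arduino = serial.Serial('COM3', 9600)   # hypothetical port and baud rate
cap = cv2.VideoCapture(0)               # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break                           # webcam dropped a frame (or turned off)
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        # Thumb: tip (landmark 4) past the IP joint (3) horizontally counts as open.
        states = ['1' if lm[4].x < lm[3].x else '0']
        # Other fingers: tip above the PIP joint (smaller y) counts as open.
        for tip in (8, 12, 16, 20):
            states.append('1' if lm[tip].y < lm[tip - 2].y else '0')
        # Send the five binary values, e.g. "11001", to the Arduino.
        arduino.write((''.join(states) + '\n').encode())
    cv2.imshow('Hand tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
arduino.close()
cv2.destroyAllWindows()

On the Arduino side, the abstract describes a C++ program that reads these five values over serial and opens or closes each finger motor accordingly; that half of the system is not sketched here.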
College and Major available
Computer Engineering BS, Computer Science BS
Location
Digital Commons & West Campus West Building University Commons
Start Day/Time
4-26-2024 12:00 PM
End Day/Time
4-26-2024 2:00 PM
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial 4.0 License
Prize Categories
Best Visuals, Most Creative, Best Technology Prototype
Students' Information
Julia Piascik, Honors Computer Science and Computer Engineering Student, graduating May 2026.
Honorable Mention, Best Technology Prototype 2024 Award
Honorable Mention, Dean's Prize: Welch College of Business & Technology 2024 Award