Start Date
8-5-2024 10:25 AM
End Date
8-5-2024 10:10 AM
Document Type
Full Paper
Keywords
IMU (Inertial Measurement Units), Gesture Recognition, Assistive Robots, Real-Time Interaction, Human-Robot Interaction
Description
This paper presents a motion capture system designed to enhance the control of assistive robots, with a particular focus on improving the lives of individuals with disabilities. By using low-cost Inertial Measurement Units (IMUs) to accurately capture and translate human motions into robotic actions, the system bridges the gap between human intent and robot execution. Through the application of machine learning algorithms for real-time gesture recognition, the research demonstrates a significant advancement in intuitive human-robot interaction.
Key findings from the study include a gesture-recognition accuracy of 93.8% achieved by the motion capture system. The integration of IMU technology enabled a nuanced understanding of human movement, allowing the system to respond to a wide array of gestures with high precision. Moreover, the research showed that the system's adaptability and responsiveness were greatly enhanced by the machine learning models, with the convolutional neural network (CNN) model in particular proving highly effective for real-time gesture classification.
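The CNN-based real-time classifier described above can be illustrated with a minimal forward-pass sketch. The architecture, window length, channel count, and gesture classes below are illustrative assumptions (the paper's actual network parameters are not given here), and the weights are random placeholders standing in for trained parameters:

```python
import numpy as np

def conv1d(x, w, b):
    """Valid 1-D convolution over time: x is (T, C_in), w is (K, C_in, C_out)."""
    T, _ = x.shape
    K, _, C_out = w.shape
    out = np.empty((T - K + 1, C_out))
    for t in range(T - K + 1):
        # Correlate each length-K slice of the window with every filter.
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_window(window, w, b, head_w, head_b):
    """CNN forward pass: conv -> ReLU -> global average pool -> linear -> softmax."""
    h = np.maximum(conv1d(window, w, b), 0.0)   # (T-K+1, C_out) feature map
    pooled = h.mean(axis=0)                     # global average pooling over time
    return softmax(pooled @ head_w + head_b)    # per-gesture class probabilities

# Illustrative shapes: a 50-sample window of 6 IMU channels (3-axis accelerometer
# + 3-axis gyroscope), 16 conv filters of width 5, 8 hypothetical gesture classes.
rng = np.random.default_rng(0)
window = rng.standard_normal((50, 6))           # one buffered IMU window
w = rng.standard_normal((5, 6, 16)) * 0.1       # placeholder conv weights
b = np.zeros(16)
head_w = rng.standard_normal((16, 8)) * 0.1     # placeholder classifier head
head_b = np.zeros(8)
probs = classify_window(window, w, b, head_w, head_b)
print(probs.argmax())                            # index of most likely gesture
```

In a real-time setting this forward pass would run on each sliding window of the IMU stream, with the arg-max class (above a confidence threshold) mapped to a robot command.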
DOI
https://doi.org/10.5038/PEPL8824
Real-Time IMU-Based Gesture Recognition for Assistive Robots