dc.description.abstract | This thesis presents a customizable hand gesture recognition system designed for natural human-machine interaction. The system is based on the recognition of ASL (American Sign Language) letters combined with the tracking of the user’s hand movement. It can detect static signs (a single ASL letter), composed gestures (a sequence of ASL letters), and dynamic gestures (an ASL letter combined with a hand movement path). It is also designed to handle the actions associated with each gesture and to provide feedback to the user. A key feature of this system is its flexibility: users can add new gestures and easily modify existing ones by associating a sign with a movement path, or by defining a sequence of static signs. The study begins with the motivations for developing a customizable gesture recognition system and outlines the limitations of existing systems that lack adaptability. It then details the design and implementation of the system, which leverages machine learning techniques including Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Dynamic Time Warping (DTW). These techniques are integrated into a modular framework capable of distinguishing and recognizing static, composed, and dynamic gestures. The implementation phase covers data collection and dataset creation, as well as the preprocessing pipeline, including the extraction of hand landmarks and their transformation into data suitable for training the models. The evaluation phase demonstrates the system’s high accuracy and robustness across several metrics, including accuracy, loss, and F1-score. Finally, gesture customization and the different human-machine interactions are addressed, demonstrating the system’s ease of use and its real-world applicability. | en_US |