Please use this identifier to cite or link to this item: http://dspace.ensta.edu.dz/jspui/handle/123456789/108
Title: Contribution to Gestures Recognition Using Multimodal Signals through Deep Learning
Authors: SERAICHE, Oubada
ZITOUNI, Nour El Islam
REBAI, Karima (Supervisor)
Keywords: Gesture recognition
multimodal signals
sEMG
accelerometer
eye tracking
deep learning
Convolutional neural networks (CNN)
Long Short-Term Memory (LSTM)
Issue Date: 2023
Abstract: This work aims to advance gesture recognition by combining multimodal signals, including surface electromyography (sEMG), accelerometer, and eye-tracking data, using deep learning techniques. The proposed hybrid architecture combines Convolutional Neural Networks (CNNs) for feature extraction with Long Short-Term Memory (LSTM) networks for time-series processing, with the goal of improving gesture recognition accuracy. The evaluation proceeds in two stages: first, performance is assessed with sEMG data alone; second, the model is evaluated on a multimodal dataset that adds accelerometer and eye-tracking signals. The MeganePro dataset is used to train the deep learning models, improve hand gesture recognition, and develop intuitive human-computer interaction control mechanisms. This research contributes to the field of gesture recognition.
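
The CNN + LSTM pipeline described in the abstract can be illustrated with a minimal Keras-style sketch. All quantities below are illustrative assumptions rather than the thesis configuration: the window length, the per-modality channel counts (sEMG, accelerometer, eye tracking), the layer sizes, and the number of gesture classes are placeholders.

# Minimal sketch of a hybrid CNN + LSTM classifier for windowed multimodal gesture signals.
# Assumed (hypothetical) setup: 12 sEMG channels + 3 accelerometer axes + 2 eye-tracking
# coordinates, 400-sample windows, and 10 gesture classes.
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 400               # samples per sliding window (assumed)
N_CHANNELS = 12 + 3 + 2    # sEMG + accelerometer + eye tracking (assumed)
N_CLASSES = 10             # number of gesture classes (assumed)

inputs = layers.Input(shape=(WINDOW, N_CHANNELS))
# 1-D convolutions extract local features from the raw multimodal signals.
x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
x = layers.MaxPooling1D(pool_size=2)(x)
x = layers.Conv1D(128, kernel_size=5, activation="relu", padding="same")(x)
x = layers.MaxPooling1D(pool_size=2)(x)
# The LSTM models the temporal evolution of the convolutional feature sequence.
x = layers.LSTM(64)(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(N_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

In this sketch the modalities are simply concatenated along the channel axis of each window; per-modality branches that are fused later would be an equally plausible design for the two-stage (sEMG-only, then multimodal) evaluation described above.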
Description: Engineering degree final-year project (Projet de fin d'études d'ingéniorat): Algiers: Ecole Nationale Supérieure des Technologies Avancées (ex ENST): 2023
URI: http://dspace.edu.enst.dz/jspui/handle/123456789/108
Appears in Collections: ING - Automatique et Informatique Industrielle

Files in This Item:
File: PFE.2023.AII.SERAICHE.Oubada_ZITOUNI.Nour.Elislam - SERAICHE Oubada.pdf
Description: PFE ING
Size: 5.47 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.