
A Real-Time Dynamic Gesture Variability Recognition Method Based on Convolutional Neural Networks


dc.contributor.author Amangeldy, Nurzada
dc.contributor.author Milosz, Marek
dc.contributor.author Kudubayeva, Saule
dc.contributor.author Kassymova, Akmaral
dc.contributor.author Kalakova, Gulsim
dc.contributor.author Zhetkenbay, Lena
dc.date.accessioned 2024-09-12T11:48:36Z
dc.date.available 2024-09-12T11:48:36Z
dc.date.issued 2023
dc.identifier.citation Amangeldy, N.; Milosz, M.; Kudubayeva, S.; Kassymova, A.; Kalakova, G.; Zhetkenbay, L. A Real-Time Dynamic Gesture Variability Recognition Method Based on Convolutional Neural Networks. Appl. Sci. 2023, 13, 10799. https://doi.org/10.3390/app131910799
dc.identifier.issn 2076-3417
dc.identifier.other https://doi.org/10.3390/app131910799
dc.identifier.uri http://rep.enu.kz/handle/enu/16281
dc.description.abstract Among the many problems in machine learning, one of the most critical is improving the prediction rate of categorical responses from extracted features. Yet most of the time in the full cycle of multiclass modeling for sign language recognition is spent on data preparation: collecting, filtering, analyzing, and visualizing the data. To address this problem, this paper proposes a methodology that automatically collects the spatiotemporal features of gestures by calculating the coordinates of the detected pose and hand regions, normalizing them, and constructing an optimal multilayer perceptron for multiclass classification. By extracting and analyzing spatiotemporal data, the proposed method identifies not only static features but also the spatial features of gestures that touch the face or head, as well as their dynamic features, which increases gesture recognition accuracy. Gestures were additionally classified by the form of their demonstration (the visibility of all connection points) so that their characteristics could be extracted optimally, raising recognition accuracy for certain classes to 0.96. The method was validated on the well-known Ankara University Turkish Sign Language Dataset and the Dataset for Argentinian Sign Language, achieving a recognition accuracy of 0.98.
dc.language.iso en
dc.publisher Applied Sciences
dc.relation.ispartofseries Volume 13;Issue 19
dc.subject sign language recognition
dc.subject CNN
dc.subject multimodality
dc.subject preprocessing
dc.title A Real-Time Dynamic Gesture Variability Recognition Method Based on Convolutional Neural Networks
dc.type Article
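
A minimal, illustrative sketch of the pipeline the abstract describes, not the authors' actual implementation: per-frame pose/hand landmark coordinates (assumed to be already produced by an external landmark tracker) are normalized, resampled to a fixed number of frames, flattened into one spatiotemporal feature vector, and passed to a multilayer perceptron for multiclass classification. The function names (normalize_landmarks, sequence_to_features, train_gesture_mlp), the frame count, and the hidden-layer sizes are hypothetical choices for this sketch.

# Sketch only: assumes each gesture sample is an array of shape
# (frames, landmarks, 2) with 2D landmark coordinates per frame.
import numpy as np
from sklearn.neural_network import MLPClassifier

def normalize_landmarks(frame_xy):
    # Translate so the first landmark is the origin and scale by the largest
    # inter-landmark distance: one common way to make features invariant to
    # the signer's position and distance from the camera.
    centered = frame_xy - frame_xy[0]
    scale = float(np.max(np.linalg.norm(centered, axis=1)))
    return centered / scale if scale > 0 else centered

def sequence_to_features(sequence, n_frames=30):
    # Resample a variable-length gesture to a fixed number of frames and
    # flatten the normalized coordinates into one spatiotemporal vector.
    idx = np.linspace(0, len(sequence) - 1, n_frames).astype(int)
    return np.concatenate([normalize_landmarks(sequence[i]).ravel() for i in idx])

def train_gesture_mlp(samples, labels):
    # samples: list of (frames, landmarks, 2) arrays; labels: gesture classes.
    features = np.stack([sequence_to_features(np.asarray(s)) for s in samples])
    clf = MLPClassifier(hidden_layer_sizes=(256, 128), max_iter=500)
    clf.fit(features, labels)
    return clf

At prediction time the same sequence_to_features transform would be applied to a new gesture before calling clf.predict; the normalization and resampling choices here are only one plausible reading of the abstract.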

