ENG Vol.8 No.10, October 2016
Automatic Mexican Sign Language Recognition Using Normalized Moments and Artificial Neural Networks
Abstract: This paper presents a computer vision system for the automatic recognition of Mexican Sign Language (MSL), based on normalized moments as descriptors invariant to translation and scale, with an artificial neural network as the pattern recognition model. An experimental feature selection was performed to reduce computational cost, since this work focuses on automatic recognition. The computer vision system includes four LED reflectors of 700 lumens each to improve image acquisition quality; this illumination setup reduces shadows in the captured signs. MSL comprises 27 signs in total, 6 of which involve movement; this paper presents a framework for the automatic recognition of the 21 static signs. The proposed system achieved a recognition rate of 93%.
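The translation- and scale-invariant descriptors named in the abstract are the standard normalized central moments. As a minimal sketch (function and variable names are illustrative, not from the paper), the descriptor for a segmented hand image can be computed like this: central moments remove translation, and dividing by a power of the zeroth moment removes scale:

```python
import numpy as np

def normalized_moments(img, order=3):
    """Normalized central moments eta_pq, invariant to translation and scale."""
    img = np.asarray(img, dtype=float)
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    xc = (xs * img).sum() / m00   # centroid: subtracting it gives
    yc = (ys * img).sum() / m00   # translation invariance
    feats = {}
    for p in range(order + 1):
        for q in range(order + 1):
            if p + q < 2:
                continue  # eta_00, eta_10, eta_01 carry no shape information
            mu = ((xs - xc) ** p * (ys - yc) ** q * img).sum()
            # dividing by m00^((p+q)/2 + 1) gives scale invariance
            feats[(p, q)] = mu / m00 ** ((p + q) / 2 + 1)
    return feats

# Two copies of the same blob at different positions yield identical features:
a = np.zeros((20, 20)); a[5:9, 4:10] = 1
b = np.zeros((20, 20)); b[10:14, 8:14] = 1
fa, fb = normalized_moments(a), normalized_moments(b)
print(all(abs(fa[k] - fb[k]) < 1e-9 for k in fa))
```

The resulting feature vector (one value per retained (p, q) pair) would then feed the neural-network classifier; which moment orders to keep is exactly the feature-selection question the abstract mentions.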
Cite this paper: Solís, F., Martínez, D. and Espinoza, O. (2016) Automatic Mexican Sign Language Recognition Using Normalized Moments and Artificial Neural Networks. Engineering, 8, 733-740. doi: 10.4236/eng.2016.810066.