
Comparison of American Sign Language Use Identification using Multi-Class SVM Classification, Backpropagation Neural Network, K-Nearest Neighbor and Naive Bayes


*Vincentius Abdi Gunawan  -  Department of Informatics Engineering, Faculty of Engineering, Universitas Palangka Raya, Indonesia
Leonardus Sandy Ade Putra  -  Department of Electrical Engineering, Faculty of Engineering, Universitas Tanjungpura, Indonesia
Open Access. Copyright (c) 2021 TEKNIK

Abstract
Communication is essential for conveying information from one individual to another, but not everyone can communicate verbally. According to the WHO, disabling hearing loss affects 466 million people worldwide, 34 million of whom are children, so methods for learning and recognizing non-verbal language are needed for people with hearing impairments. The purpose of this study is to build a system that identifies non-verbal language in real time so that it can be easily understood. Achieving a high recognition rate requires an appropriate method, such as machine learning supported by wavelet feature extraction and image-processing-based classification. Machine learning was applied because it allows the recognition results of four different classification methods to be compared directly. The four classifiers used to recognize hand gestures from American Sign Language are the Multi-Class Support Vector Machine (SVM), the Backpropagation Neural Network, K-Nearest Neighbor (K-NN), and Naïve Bayes. Simulation tests of the four classification methods yielded success rates of 99.3%, 98.28%, 97.7%, and 95.98%, respectively. It can therefore be concluded that the Multi-Class SVM achieves the highest success rate in American Sign Language recognition, reaching 99.3%. The whole system was designed and tested in MATLAB, which was used for both implementation and data processing.
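As a concrete illustration of the pipeline described in the abstract, the sketch below shows how wavelet feature extraction and the four classifiers could be set up in MATLAB (the software named above). It is a minimal, hypothetical example rather than the authors' implementation: the sample file name, the single-level Haar decomposition, the hidden-layer size of 20, and k = 3 for K-NN are all assumptions, and it relies on the Wavelet, Statistics and Machine Learning, and Deep Learning Toolboxes.

% Minimal sketch (assumed details, not the authors' code).
% --- Wavelet feature extraction: single-level 2-D Haar DWT per image ---
img  = imread('asl_A_sample.png');            % hypothetical sample image file
gray = double(rgb2gray(img));                 % work on a grayscale copy
[cA, cH, cV, cD] = dwt2(gray, 'haar');        % approximation + detail subbands
feat = [cA(:); cH(:); cV(:); cD(:)]';         % one feature row; stacking rows builds X

% --- Train the four classifiers (X: N-by-D features, Y: N-by-1 categorical labels) ---
svmModel = fitcecoc(X, Y);                    % multi-class SVM via error-correcting output codes
knnModel = fitcknn(X, Y, 'NumNeighbors', 3);  % K-Nearest Neighbor (k = 3 is an assumption)
nbModel  = fitcnb(X, Y);                      % Naive Bayes with Gaussian likelihoods
net      = patternnet(20);                    % feed-forward net trained by backpropagation
net      = train(net, X', dummyvar(Y)');      % column samples, one-hot target columns

% --- Compare accuracy on the same held-out set (Xtest, Ytest) ---
accSVM = mean(predict(svmModel, Xtest) == Ytest);
accKNN = mean(predict(knnModel, Xtest) == Ytest);
accNB  = mean(predict(nbModel,  Xtest) == Ytest);
classes  = categories(Y);                     % class order matches dummyvar's columns
[~, idx] = max(net(Xtest'), [], 1);           % winning class per test column
accBPNN  = mean(categorical(classes(idx))' == Ytest);

Using one shared test split for all four models keeps the resulting accuracies directly comparable, which is the comparison the study reports.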
Keywords: american sign language; digital image processing; wavelet; multi-class svm; backpropagation neural network; k-nearest neighbor; naive bayes


  1. Bantupalli, K., & Xie, Y. (2019a). American Sign Language Recognition using Deep Learning and Computer Vision. Proceedings - 2018 IEEE International Conference on Big Data, Big Data 2018, 4896–4899. https://doi.org/10.1109/BigData.2018.8622141
  2. Bantupalli, K., & Xie, Y. (2019b). American Sign Language Recognition using Deep Learning and Computer Vision. Proceedings - 2018 IEEE International Conference on Big Data, Big Data 2018, 4896–4899. https://doi.org/10.1109/BigData.2018.8622141
  2. Boulay, B., Bremond, F., & Thonnat, M. (2004). Human Posture Recognition in Video Sequence. 8th Control, Automation, Robotics and Vision Conference (ICARCV 2004)
  3. Chong, T. W., & Lee, B. G. (2018). American sign language recognition using leap motion controller with machine learning approach. Sensors (Switzerland), 18(10). https://doi.org/10.3390/s18103554
  4. Cohen, I., Sebe, N., Garg, A., Chen, L. S., & Huang, T. S. (2003). Facial expression recognition from video sequences: temporal and static modeling. Computer Vision and Image Understanding, 91(1-2), 160–187. https://doi.org/10.1016/S1077-3142(03)00081-X
  5. Deriche, M., Aliyu, S., & Mohandes, M. (2019). An Intelligent Arabic Sign Language Recognition System using a Pair of LMCs with GMM Based Classification. IEEE Sensors Journal, 19(18), 1–12. https://doi.org/10.1109/JSEN.2019.2917525
  6. Eshitha, K. V., & Jose, S. (2018). Hand Gesture Recognition Using Artificial Neural Network. 2018 International Conference on Circuits and Systems in Digital Enterprise Technology, ICCSDET 2018, 2(1), 1–8. https://doi.org/10.1109/ICCSDET.2018.8821076
  7. Gatc, J., Gunawan, V. A., & Maspiyanti, F. (2016). Chlorophyll-A concentration estimation for seaweed identification in Kupang bay using MODIS aqua data. 2016 IEEE 6th International Conference on Communications and Electronics, IEEE ICCE 2016, 289–293. https://doi.org/10.1109/CCE.2016.7562651
  8. Kaggle. (n.d.). ASL Alphabet. Retrieved June 6, 2020, from https://www.kaggle.com/grassknoted/asl-alphabet/data
  9. Khelil, B., & Hamid, A. (2016). Hand gesture recognition using Leap Motion Controller. IJSR, 5(10), 436–441
  10. Kurniawan, W., & Harjoko, A. (2013). Pengenalan Bahasa Isyarat dengan Metode Segmentasi Warna Kulit dan Center of Gravity. IJEIS (Indonesian Journal of Electronics and Instrumentation Systems), 1(2), 67–78. https://doi.org/10.22146/ijeis.1964
  11. Pajar, T. Y., Purwanto, D., & Kusuma, H. (2018). Pengenalan Bahasa Isyarat Tangan Menggunakan Depth Image. Jurnal Teknik ITS, 7(1). https://doi.org/10.12962/j23373539.v7i1.28567
  12. Prateek, S. G., Jagadeesh, J., Siddarth, R., Smitha, Y., Hiremath, P. G. S., & Pendari, N. T. (2018). Dynamic Tool for American Sign Language Finger Spelling Interpreter. Proceedings - IEEE 2018 International Conference on Advances in Computing, Communication Control and Networking, ICACCCN 2018, 596–600. https://doi.org/10.1109/ICACCCN.2018.8748859
  13. Putra, L. S. A., Isnanto, R. R., Triwiyatno, A., & Gunawan, V. A. (2020). Identification of Heart Disease With Iridology Using Backpropagation Neural Network. 2018 2nd Borneo International Conference on Applied Mathematics and Engineering (BICAME), 138–142. https://doi.org/10.1109/bicame45512.2018.1570509882
  14. Putra, L. S. A., Sumarno, L., & Gunawan, V. A. (2018, October). The Recognition Of Semaphore Letter Code Using Haar Wavelet And Euclidean Function. In 2018 5th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), 759–763. IEEE
  15. Sudiatmika, I. B. K., Rahman, F., Trisno, & Suyoto. (2019). Image forgery detection using error level analysis and deep learning. Telkomnika (Telecommunication Computing Electronics and Control), 17(2), 653–659. https://doi.org/10.12928/TELKOMNIKA.V17I2.8976
  16. Tripathi, K., Baranwal, N., & Nandi, G. C. (2015). Continuous dynamic Indian Sign Language gesture recognition with invariant backgrounds. 2015 International Conference on Advances in Computing, Communications and Informatics, ICACCI 2015, 2211–2216. https://doi.org/10.1109/ICACCI.2015.7275945
  17. World Health Organization. (n.d.). Deafness. Retrieved June 6, 2020, from https://www.who.int/news-room/facts-in-pictures/detail/deafness
