Feature Extraction of Facial Electromyograph (EMG) Signal for Aceh Languages Speech using Discrete Wavelet Transform (DWT)

  • Darma Setiawan Putra Politeknik Aceh Selatan
  • Yuril Umbu WW Politeknik Aceh Selatan

Abstract

The facial electromyograph (FEMG) signal is a signal generated by the contracting muscles of the human face. The FEMG signal is one of the modalities used to study human speech recognition. It can be acquired by placing surface electrodes on the skin over the facial articulation muscles. The three muscles examined in this study are the masseter, risorius, and depressor muscles. This study aims to extract and analyze the features of the FEMG signal. The extraction method is the discrete wavelet transform (DWT), using the Daubechies 2 (db2) wavelet with 5 decomposition levels. After extraction and analysis of the FEMG signals, the FEMG signal pattern for each spoken word is indicated by differences in the approximation and detail coefficients of the FEMG signal. In addition, the degree of difference between FEMG signal patterns is also indicated by the histogram of the approximation coefficients. Thus, the discrete wavelet transform can serve as one of the methods for extracting features from a human facial electromyograph (FEMG) signal.
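To illustrate the kind of decomposition the abstract describes, the sketch below implements a multi-level db2 DWT in pure Python. This is a minimal illustration, not the authors' actual pipeline: the function names (`dwt_step`, `wavedec`) and the periodic boundary handling are assumptions for this example, and in practice a library such as PyWavelets (`pywt.wavedec(signal, 'db2', level=5)`) would typically be used on the recorded FEMG samples.

```python
import math

# Daubechies-2 (db2) low-pass (scaling) filter coefficients
_S3 = math.sqrt(3)
_H = [(1 + _S3) / (4 * math.sqrt(2)),
      (3 + _S3) / (4 * math.sqrt(2)),
      (3 - _S3) / (4 * math.sqrt(2)),
      (1 - _S3) / (4 * math.sqrt(2))]
# High-pass (wavelet) filter from the quadrature-mirror relation
_G = [_H[3], -_H[2], _H[1], -_H[0]]

def dwt_step(x):
    """One DWT level: filter with periodic extension, then downsample by 2."""
    n = len(x)
    approx, detail = [], []
    for i in range(0, n, 2):
        approx.append(sum(_H[k] * x[(i + k) % n] for k in range(4)))
        detail.append(sum(_G[k] * x[(i + k) % n] for k in range(4)))
    return approx, detail

def wavedec(x, levels=5):
    """Multi-level DWT: returns the final approximation coefficients
    and the detail coefficients of each level (level 1 first)."""
    details, a = [], list(x)
    for _ in range(levels):
        a, d = dwt_step(a)
        details.append(d)
    return a, details
```

The level-5 approximation coefficients returned by `wavedec` (and, e.g., their histogram) are the sort of per-word feature pattern the study compares across spoken Aceh words.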

Published
2019-07-10
How to Cite
PUTRA, Darma Setiawan; WW, Yuril Umbu. Feature Extraction of Facial Electromyograph (EMG) Signal for Aceh Languages Speech using Discrete Wavelet Transform (DWT). Jurnal Inotera, [S.l.], v. 4, n. 1, p. 31-40, July 2019. ISSN 2581-1274. Available at: <http://inotera.poltas.ac.id/index.php/inotera/article/view/73>. Date accessed: 18 July 2019. doi: https://doi.org/10.31572/inotera.Vol4.Iss1.2019.ID73.