A MODIFIED FUZZY SUPPORT VECTOR MACHINE CLASSIFICATION-BASED APPROACH FOR EMOTIONAL RECOGNITION USING PHYSIOLOGICAL SIGNALS

Volume 5 (2), December 2022, Pages 286-317

Sara Mahdi1 and Mohammad Bagher Menhaj2


1Amirkabir University of Technology, Tehran, Iran

2Islamic Azad University, Qazvin Branch, Qazvin, Iran


Abstract

Emotional state recognition has become an essential topic in human–robot interaction research, covering a wide range of subjects. By identifying emotional expressions, robots can recognize the significant variables of human behavior and use them to communicate in a more human-like fashion and to develop richer interaction possibilities. The multimodal and spontaneous nature of human emotions makes them hard for robots to recognize. Each modality has its advantages and limitations, which, together with the unstructured behavior of spontaneous facial expressions, pose several challenges for the approaches proposed in the literature. The most prominent of these approaches are based on a combination of explicit feature extraction methods and manual modality selection. This paper proposes a modified fuzzy support vector machine (FSVM) classification-based approach for emotional recognition using physiological signals. The main contributions of this study include applying various data extraction indices and proper kernels in the FSVM classification method and evaluating the signals' richness in experimental tests. The developed emotional recognition method is also compared with the conventional SVM and other existing state-of-the-art emotional recognition algorithms. The comparison results show an improved accuracy of the developed method over the other approaches.
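The paper's modified FSVM is not specified in this abstract, but the general fuzzy-SVM idea it builds on can be sketched as follows: each training sample receives a fuzzy membership weight so that outliers (common in noisy physiological signals) influence the decision boundary less. The sketch below is an illustrative assumption, using scikit-learn's `SVC` with per-sample weights as a stand-in for a true FSVM, toy random features in place of real physiological indices, and a distance-to-class-mean membership function:

```python
# Illustrative sketch only: not the paper's algorithm. A common fuzzy-SVM
# approximation assigns each training sample a fuzzy membership weight,
# which scikit-learn's SVC accepts via sample_weight.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-in for physiological-signal feature vectors (e.g., heart-rate
# or skin-conductance indices) for two emotional classes.
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)), rng.normal(2.0, 1.0, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

# Fuzzy membership: samples closer to their class mean count more,
# so noisy outliers affect the margin less.
memberships = np.empty(len(X))
for label in (0, 1):
    idx = y == label
    d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
    memberships[idx] = 1.0 - d / (d.max() + 1e-9)

clf = SVC(kernel="rbf", C=1.0)  # an RBF kernel is one "proper kernel" choice
clf.fit(X, y, sample_weight=memberships)
print(clf.score(X, y))
```

The membership function here is one simple choice; the paper's contribution concerns different indices and kernels, which would slot into the feature construction and the `kernel` argument above.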

Keywords:

Emotional Recognition, Physiological Signals, Support Vector Machine, Fuzzy Classification.

DOI: https://doi.org/10.32010/26166127.2022.5.2.286.317


References

Agrafioti, F., Hatzinakos, D., & Anderson, A. K. (2011). ECG pattern analysis for emotion detection. IEEE Transactions on Affective Computing, 3(1), 102-115.

Barros, P., Jirak, D., Weber, C., & Wermter, S. (2015). Multimodal emotional state recognition using sequence-dependent deep hierarchical features. Neural Networks, 72, 140-151.

Chanel, G., Kronegg, J., Grandjean, D., & Pun, T. (2006, September). Emotion assessment: Arousal evaluation using EEG’s and peripheral physiological signals. In International workshop on multimedia content representation, classification and security (pp. 530-537). Springer, Berlin, Heidelberg.

Christy, T., Kuncheva, L. I., & Williams, K. W. (2012). Selection of physiological input modalities for emotion recognition. UK: Bangor University.

Dai, K., Fell, H. J., & MacAuslan, J. (2008). Recognizing emotion in speech using neural networks. Telehealth and Assistive Technologies, 31, 38-43.

Den Uyl, M. J., & Van Kuilenburg, H. (2005, August). The FaceReader: Online facial expression recognition. In Proceedings of measuring behavior (Vol. 30, No. 2, pp. 589-590). Netherlands: Wageningen Publishers.

Egger, M., Ley, M., & Hanke, S. (2019). Emotion recognition from physiological signal analysis: A review. Electronic Notes in Theoretical Computer Science, 343, 35-55.

Gouizi, K., Bereksi Reguig, F., & Maaoui, C. (2011). Emotion recognition from physiological signals. Journal of Medical Engineering & Technology, 35(6-7), 300-307.

Guo, H. W., Huang, Y. S., et al. (2016, October). Heart rate variability signal features for emotion recognition by using principal component analysis and support vectors machine. In 2016 IEEE 16th international conference on bioinformatics and bioengineering (BIBE) (pp. 274-277). IEEE.

Haag, A., Goronzy, S., Schaich, P., & Williams, J. (2004, June). Emotion recognition using bio-sensors: First steps towards an automatic system. In Tutorial and research workshop on affective dialogue systems (pp. 36-48). Springer, Berlin, Heidelberg.

Hakamata, A., Ren, F., & Tsuchiya, S. (2008, October). Human emotion model based on discourse sentence for expression generation of conversation agent. In 2008 International Conference on Natural Language Processing and Knowledge Engineering (pp. 1-8). IEEE.

Hong, K., Liu, G., Chen, W., & Hong, S. (2018). Classification of the emotional stress and physical stress using signal magnification and canonical correlation analysis. Pattern Recognition, 77, 140-149.

Hossain, M. S., & Muhammad, G. (2019). Emotion recognition using deep learning approach from audio–visual emotional big data. Information Fusion, 49, 69-78.

Katsis, C. D., Katertsidis, N., Ganiatsas, G., & Fotiadis, D. I. (2008). Toward emotion recognition in car-racing drivers: A biosignal processing approach. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 38(3), 502-512.

Kessous, L., Castellano, G., & Caridakis, G. (2010). Multimodal emotion recognition in speech-based interaction using facial expression, body gesture and acoustic analysis. Journal on Multimodal User Interfaces, 3(1), 33-48.

Khalili, Z., & Moradi, M. H. (2008, December). Emotion detection using brain and peripheral signals. In 2008 Cairo international biomedical engineering conference (pp. 1-4). IEEE.

Khosrowabadi, R., & bin Abdul Rahman, A. W. (2010, December). Classification of EEG correlates on emotion using features from Gaussian mixtures of EEG spectrogram. In Proceeding of the 3rd International Conference on Information and Communication Technology for the Moslem World (ICT4M) 2010 (pp. E102-E107). IEEE.

Kim, J., & André, E. (2008). Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12), 2067-2083.

Kim, K. H., Bang, S. W., & Kim, S. R. (2004). Emotion recognition system using short-term monitoring of physiological signals. Medical and Biological Engineering and Computing, 42(3), 419-427.

Koelstra, S., Muhl, C., et al. (2011). DEAP: A database for emotion analysis; using physiological signals. IEEE Transactions on Affective Computing, 3(1), 18-31.

Lahane, P., & Sangaiah, A. K. (2015). An approach to EEG based emotion recognition and classification using kernel density estimation. Procedia Computer Science, 48, 574-581.

Liu, C., Conn, K., Sarkar, N., & Stone, W. (2008). Physiology-based affect recognition for computer-assisted intervention of children with Autism Spectrum Disorder. International Journal of Human-Computer Studies, 66(9), 662-677.

Maaoui, C., & Pruski, A. (2010). Emotion recognition through physiological signals for human-machine communication. Cutting Edge Robotics 2010, 317-332.

Mehmood, R. M., & Lee, H. J. (2016). A novel feature extraction method based on late positive potential for emotion recognition in human brain signal patterns. Computers & Electrical Engineering, 53, 444-457.

Mi, L., Liu, X., Ren, F., & Araki, H. (2009). Characteristics of event-related potentials in recognition processes of Japanese kanji and sentences for Chinese bilinguals. Journal of Physiological Anthropology, 28(4), 191-197.

Naji, M., Firoozabadi, M., & Azadfallah, P. (2014). Classification of music-induced emotions based on information fusion of forehead biosignals and electrocardiogram. Cognitive Computation, 6(2), 241-252.

Picard, R. W. (2010). Affective computing: from laughter to IEEE. IEEE Transactions on Affective Computing, 1(1), 11-17.

Rattanyu, K., Ohkura, M., & Mizukawa, M. (2010, October). Emotion monitoring from physiological signals for service robots in the living space. In ICCAS 2010 (pp. 580-583). IEEE.

Rigas, G., Katsis, C. D., Ganiatsas, G., & Fotiadis, D. I. (2007, July). A user independent, biosignal based, emotion recognition method. In International Conference on User Modeling (pp. 314-318). Springer, Berlin, Heidelberg.

Yin, Z., Zhao, M., Wang, Y., Yang, J., & Zhang, J. (2017). Recognition of emotions using multimodal physiological signals and an ensemble deep learning model. Computer Methods and Programs in Biomedicine, 140, 93-110.

Zhang, B., Morère, Y., et al. (2017). Reaction time and physiological signals for stress recognition. Biomedical Signal Processing and Control, 38, 100-107.