A New Model for Predicting Emotional Behavior Based on Bidirectional Associative Memory

Document Type : Original Article

Author

Institute for Cognitive Sciences Studies (ICSS), Tehran, Iran

Abstract

A person's emotional state changes in response to the emotional stimuli they experience. During a conversation, these changes are noticeably reflected in the voice and tone of speech as well as in its linguistic components. On the receiving side, the listener can recognize the emotional content of the received message and predict the emotional behavior associated with it. The human nervous and cognitive system performs this process efficiently and effortlessly, despite all the complexities involved in defining a computational model for it. In other words, it recognizes the emotional content of speech and predicts a set of related behaviors for the speaker according to the recognized emotional state. This capability is crucial in human-machine interaction studies. Here, a model based on Bidirectional Associative Memory (BAM) is presented that predicts the corresponding emotional behavior from an emotional state. Simulation results show that the model can map bidirectionally between emotional states and the sets of behaviors related to them, and that it is robust to some degree of information incompleteness and some level of noise.
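The bidirectional mapping described above can be illustrated with a minimal Kosko-style BAM sketch. The pattern pairs below are illustrative stand-ins for (emotional state, behavior set) encodings, not the encodings used in the article; the weight rule and recall loop follow the standard BAM formulation (sum of outer products, iterated bipolar thresholding), which also exhibits the noise tolerance mentioned in the abstract.

```python
import numpy as np

def train_bam(pairs):
    """Build the BAM weight matrix W = sum over pairs of the outer product x y^T."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = np.zeros((n, m))
    for x, y in pairs:
        W += np.outer(x, y)
    return W

def bipolar_sign(v):
    # Threshold to {-1, +1}; zeros are mapped to +1 for determinism.
    return np.where(v >= 0, 1, -1)

def recall(W, x, iters=10):
    """Bounce activations forward (x -> y) and backward (y -> x) until stable."""
    for _ in range(iters):
        y = bipolar_sign(x @ W)   # state -> behavior direction
        x = bipolar_sign(W @ y)   # behavior -> state direction
    return x, y

# Two hypothetical bipolar pattern pairs: x encodes an emotional state,
# y encodes its associated behavior set.
pairs = [
    (np.array([1, 1, 1, 1]),   np.array([1, -1, 1])),
    (np.array([1, -1, 1, -1]), np.array([-1, 1, 1])),
]
W = train_bam(pairs)

# Clean cue recovers the stored pair; a cue with one flipped bit
# (simulating noisy or incomplete input) converges to the same pair.
x_clean, y_clean = recall(W, np.array([1, 1, 1, 1]))
x_noisy, y_noisy = recall(W, np.array([-1, 1, 1, 1]))
```

Because recall runs in both directions, the same weight matrix can retrieve a behavior set from an emotional state or an emotional state from a behavior set, which is the bidirectional property the model relies on.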
