Deep Learning for Classification of Peak Emotions within Virtual Reality Systems

  • Denise Quesnel
  • Steve DiPaola
  • Bernhard E. Riecke

Abstract

Research has demonstrated well-being benefits from positive, ‘peak’ emotions such as awe and wonder, prompting the HCI community to apply affective computing and AI modelling to the elicitation and measurement of these target emotional states. The immersive nature of virtual reality (VR) content and systems can evoke feelings of awe and wonder, especially when the environment responds to, and is personalized by, the user’s biosignals. However, an accurate model is required to differentiate between emotional states that produce similar biosignals, such as awe and fear. Deep learning may provide a solution: it can recognize the subtleties that distinguish these emotional and affective states, and by treating biosignal data as a time series it allows researchers and designers to trace which features of the system may have influenced the target emotions. The deep learning fusion system proposed in this paper will be trained on a corpus of physiological biosignals paired with ranked qualitative data, and will classify these multimodal signals into target affect outputs. The model will run in real time within a bio-responsive environment, enabling evaluation of the VR system features that influence awe and wonder. Because the biosignals are collected with wireless, wearable sensors and modelled on the same computer that powers the VR system, the approach is suitable for field research and studio settings.
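As a concrete illustration of the kind of model the abstract describes, the sketch below shows one plausible shape for a multimodal fusion classifier: a small 1D convolutional network that fuses several synchronized biosignal channels over a sliding time window and outputs probabilities for discrete affect classes. This is a minimal sketch under stated assumptions, not the authors’ architecture; the channel set (heart rate, electrodermal activity, piloerection), window length, network depth, and class labels (awe, fear, neutral) are all illustrative choices.

```python
# Minimal sketch, NOT the paper's model: a 1D-CNN that fuses multi-channel
# biosignal windows and classifies them into discrete affect targets.
# Channel set, window size, and class labels are illustrative assumptions.
import torch
import torch.nn as nn

class BiosignalFusionNet(nn.Module):
    def __init__(self, n_channels: int = 3, n_classes: int = 3):
        super().__init__()
        # Convolutions learn local temporal features across the fused
        # channels (e.g., heart rate, electrodermal activity, piloerection).
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -- one sliding window per example
        return self.classifier(self.features(x).squeeze(-1))

# Usage: classify a 10-second window sampled at 32 Hz from 3 sensors.
model = BiosignalFusionNet(n_channels=3, n_classes=3)
window = torch.randn(1, 3, 320)           # stand-in for real sensor data
probs = torch.softmax(model(window), -1)  # e.g., P(awe), P(fear), P(neutral)
```

Because each window is classified independently, the same forward pass could run in real time on the machine driving the VR system, consistent with the field-deployment goal stated in the abstract.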


Published
2018-01-11
How to Cite
Quesnel, D., DiPaola, S. and Riecke, B.E. 2018. Deep Learning for Classification of Peak Emotions within Virtual Reality Systems. International SERIES on Information Systems and Management in Creative eMedia (CreMedia). 2017/2 (Jan. 2018), 6–11. ISSN 2341-5576. Available at: https://www.ambientmediaassociation.org/Journal/index.php/series/article/view/274