Conference paper

Visualizing the Emotional Journey of a Museum

Shen Du
Ecole Centrale de Lyon, France

Edouard Shu
Ecole Centrale de Lyon, France

Feifei Tong
Ecole Centrale de Lyon, France

Yinghao Ge
Ecole Centrale de Lyon, France

Lu Li
Ecole Centrale de Lyon, France

Jingbo Qiu
Ecole Centrale de Lyon, France

Philippe Guillotel
Technicolor, Cesson-Sevigne, France

Julien Fleureau
Technicolor, Cesson-Sevigne, France

Fabien Danieau
Technicolor, Cesson-Sevigne, France

Daniel Muller
Ecole Centrale de Lyon, France

Download article: http://dx.doi.org/10.3384/ecp10302

In: Proceedings of EmoVis 2016, ACM IUI 2016 Workshop on Emotion and Visualization, Sonoma, CA, USA, March 10, 2016

Linköping Electronic Conference Proceedings 103:2, pp. 7–14


Published: 2016-03-01

ISBN: 978-91-7685-817-2

ISSN: 1650-3686 (print), 1650-3740 (online)

Abstract

Wearable devices and new types of sensors make it possible to capture people's behavior, activity and, potentially, cognitive state in their daily lives. Today these devices are mainly used for well-being applications, recording and displaying people's activity. Some work has been published that goes a step further, inferring the emotional state of individuals or groups of people from the recorded signals. However, the information provided and the way it is presented are still in their infancy, with timeline graphs showing calories, heart rate, steps, temperature, and sometimes affective intensity.

In this paper we present an experiment conducted during the visits of different people to a museum of art, capturing the emotional impact of the exhibited paintings. We also propose an associated visualization of their emotional journey. Emotion is here measured as the affective response to observing the paintings, and the processing algorithm is based on an existing technique adapted to the particular case of varying observation durations. The visualization is based on a 3D map of the museum, with different colors associated with the different paintings, yielding an emotional heat-map of the museum (more precisely, of the arousal dimension). The validation was carried out in the Museum of Fine Arts in Lyon, France, with 46 visitors and a total of 27 paintings exhibited in three different rooms.
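To make the measurement and visualization steps concrete, the sketch below shows one plausible way to turn per-painting physiological recordings into heat-map colors. It is a minimal illustration under assumptions of our own (electrodermal activity as the input signal, the sampling rate, a moving-average tonic/phasic split, min-max normalization, and a blue-to-red colormap), not the technique actually used in the paper.

# A minimal sketch, NOT the authors' implementation: the sampling rate,
# the tonic/phasic split, and all names here are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt

FS = 4  # assumed electrodermal activity (EDA) sampling rate, in Hz


def arousal_score(eda: np.ndarray) -> float:
    """Crude arousal proxy for one viewing of one painting.

    A slow moving average approximates the tonic EDA level; the residual
    (phasic) activity is averaged per sample, so that long and short
    observation durations remain comparable.
    """
    win = FS * 5  # 5-second smoothing window
    tonic = np.convolve(eda, np.ones(win) / win, mode="same")
    phasic = eda - tonic
    return float(np.abs(phasic).mean())


def museum_heatmap(visits: dict[str, list[np.ndarray]]) -> dict[str, tuple]:
    """Map each painting id to an RGBA color for the museum heat-map.

    `visits` maps a painting id to one EDA segment per visitor who observed
    it. Scores are averaged over visitors, min-max normalized, and passed
    through a blue-to-red colormap (low to high arousal).
    """
    scores = {pid: float(np.mean([arousal_score(s) for s in segs]))
              for pid, segs in visits.items()}
    lo, hi = min(scores.values()), max(scores.values())
    cmap = plt.get_cmap("jet")
    return {pid: cmap((v - lo) / (hi - lo + 1e-9)) for pid, v in scores.items()}

Averaging the phasic activity per sample, rather than summing it over a whole segment, is one simple way to keep scores comparable across the varying observation durations mentioned above.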

Keywords

Emotion; Visualization; Physiological responses; Data processing; Museum; Art

