|Title:||Designing a visualization model to represent emotional changes through low-cost physiological sensors with genre perspective|
|Citation:||Cano, S., García, L., & Moreira, F. (2022). Designing a visualization model to represent emotional changes through low-cost physiological sensors with genre perspective. In A. Rocha, H. Adeli, G. Dzemyda, & F. Moreira (Eds.), Information Systems and Technologies. WorldCIST 2022. Lecture Notes in Networks and Systems, (vol. 470, pp. 294-304). Springer, Cham. https://doi.org/10.1007/978-3-031-04829-6_26. Repositório Institucional UPT. http://hdl.handle.net/11328/4389|
|Abstract:||Affective Computing is concerned with recognizing, interpreting, processing, and stimulating human emotions. Emotions play an important role in communication and the expression of feelings, and they can manifest in a person through physiological responses, which change depending on the associated emotion. Our interest is in visualizing information that can communicate the changes associated with an emotion. To represent this information, we used a dimensionality-reduction visualization technique called the Self-Organizing Map, together with Chernoff Faces, which represent multivariate data in the shape of a human face. To stimulate human emotions, a technique for inducing emotions through visual stimuli was applied while physiological responses were captured from eighteen high-school students. Four views are proposed to represent changes in physiological responses associated with emotions such as disgust, happiness, sadness, excitement, and neutrality. In addition, a coherence analysis of the physiological data captured from the low-cost physiological sensors against the data from the applied questionnaires showed great consistency.|
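The abstract mentions using a Self-Organizing Map to reduce the dimensionality of multivariate physiological data onto a 2-D grid. The paper's own model is not reproduced here; the following is only a minimal, generic SOM sketch in NumPy (all function names, grid sizes, and decay schedules are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a tiny Self-Organizing Map: each grid cell holds a weight
    vector; the best-matching unit (BMU) and its grid neighbours are
    pulled toward each sample, so high-dimensional samples end up
    organized on a 2-D grid."""
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    weights = rng.random((grid_h, grid_w, n_features))
    # Grid coordinates, used to measure neighbourhood distance on the map.
    yy, xx = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([yy, xx], axis=-1).astype(float)
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)               # learning rate decays linearly
        sigma = sigma0 * (1.0 - frac) + 0.5   # neighbourhood radius shrinks
        for x in data:
            # BMU: the cell whose weight vector is closest to the sample.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood around the BMU on the grid.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2.0 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def project(data, weights):
    """Map each sample to the grid position of its best-matching unit."""
    return [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)),
                             weights.shape[:2])
            for x in data]
```

In a setting like the paper's, each row of `data` would be a vector of physiological features per time window, and the resulting grid positions could then feed a glyph-based view such as Chernoff Faces.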
|Appears in Collections:||REMIT - Publicações em Livros de Atas Internacionais / Papers in International Proceedings|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.