We are conducting a series of studies, ranging from news production to computer gaming, that look into the intersection of transmodal interaction and user experience. The purpose of this abstract is to outline the theoretical framework for that intersection.

The first area we are studying is Transmodal Interaction, a concept that refers to a specific aspect of multimodal interaction. Human action is multimodal (Streeck, Goodwin, & LeBaron, 2011), and different sensory modes play an important role in action. However, little attention has been given to the intricate ways in which sensory modalities (seeing – drawing, hearing – saying, moving – touching, etc.) integrate, affect, and transform each other during the course of an activity. There are transformations of meaning in every new materialisation of an idea or a thought, partly depending on the communicative potential of the sensory modality. This renders what we refer to as a transmodal process, in which ideas and thoughts materialise action by action in an emergent sequence across relatively long and discontinuous timespans (Murphy, 2012). Over a sequence of actions, the meanings expressed in one modality dynamically blend with and shape what is expressed in other modalities. This produces, according to Murphy (2012), “a series of semiotic modulations in which certain core qualities persist, but others are noticeably transformed in the transition from one mode to another” (p. 1969). In intersemiotic translation (Jakobson, 1959) between modalities, we can address what is lost, how distortions are introduced, or even how perceptions of things that do not exist are introduced. A question is then how continuity of meaning and experience is preserved across modality changes.

The second area we are studying is User Experience. The term refers to a person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service (ISO, 2010).
We employ a three-level model of user experience based on Leontiev’s account of consciousness (Kaptelinin & Nardi, 2012; Leontiev, 1978), which also relates closely to Norman’s model of emotional design (Norman, 2005). The first level is the sensory fabric of consciousness; Norman refers to this as the visceral level. It is the largely subconscious level of how things feel. The second level is the personal meaning of things, related to what we do and how we do it, action by action. Norman (ibid.) refers to this as the behavioural level. The third level has to do with cultural meaning, what Norman refers to as the reflective level: what things mean for us in our socially and historically rooted activities.

The intersection of these two areas constitutes our current focus of research. We are, in domains as different as news production and computer gaming, investigating persons’ perceptions and actions resulting from interaction with each other and with materialisations across different sensory modalities that give rise to intersemiotic translation effects.
References

ISO. (2010). ISO 9241-210:2010 Ergonomics of human-system interaction – Part 210: Human-centred design. Geneva: International Organization for Standardization.

Jakobson, R. (1959). On linguistic aspects of translation. In R. A. Brower (Ed.), On translation (pp. 232–239). Cambridge, MA: Harvard University Press.

Kaptelinin, V., & Nardi, B. (2012). Activity theory in HCI: Fundamentals and reflections. Synthesis Lectures on Human-Centered Informatics, 5(1), 1–105.

Leontiev, A. N. (1978). Activity, consciousness, and personality. Englewood Cliffs, NJ: Prentice-Hall.

Murphy, K. M. (2012). Transmodality and temporality in design interactions. Journal of Pragmatics, 44(14), 1966–1981. doi:10.1016/j.pragma.2012.08.013

Norman, D. A. (2005). Emotional design: Why we love (or hate) everyday things. New York, NY: Basic Books.

Streeck, J., Goodwin, C., & LeBaron, C. (2011). Embodied interaction: Language and body in the material world. Cambridge: Cambridge University Press.