Musical Interactions

A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).

Deadline for manuscript submissions: closed (20 February 2021) | Viewed by 38539

Special Issue Editor


Guest Editor
School of Arts and Media, University of Salford, Salford M5 4WT, UK
Interests: human computer interaction; multimodal interfaces; interdisciplinary research; cognition; AI; simulation and modelling; multimedia performance

Special Issue Information

Dear Colleagues,

Music is a structured sonic event for listening that evolves alongside literature, performance repertoire, and instrument design. Musical interaction occurs in a framework where a tight loop of sensory integration, cognitive appraisal, and multimodal coordination is performed as time-critical processing, necessitated by social interaction that is often only implied. Music making and musical affect are conducive to multimodal associations, as reflected in song, dance, and ceremony, and in film and video game soundtracks.

The multimodality of musical performance and of the listening experience is well recognised in research across multimedia modelling, music information retrieval, music therapy, enactive interfaces, and new interfaces for musical expression. The proliferation of embedded systems for multimedia and action sensing spans mobiles and wearables, personal data appliances, equipment for medicine, sports, and wellness, as well as game play and socialisation. Across this spectrum, where is it desirable to apply musical interactions and the lessons we have learned?

Listeners’ auditory percepts are transformed by seeing musicians play. As in speech production and recognition, there is always an element of re-enacting motor production while listening. Playing while listening creates a context for multimodal experience enhanced by AI and sensory signal processing. Musical interactions can be defined over a wide range of models and simulations, from physical and biological to linguistic and emotional. Players of all ages and skill levels can engage in musical play, which has known benefits ranging from increased social engagement to resilience against brain-related injury and aging.

This Special Issue aims to present perspectives from interdisciplinary research domains such as AI, HCI, multimodal interaction, playful interfaces, music-supported therapy, and neuroscience to generate insights and a better understanding of multisensory experience and multimodality through musical interaction, with broader implications beyond the domain of music.

Dr. Insook Choi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Human computer interaction
  • AI
  • Multimodal interaction
  • Playful interfaces
  • Music supported therapy
  • Multimodal cognition
  • User-related studies
  • Music computation
  • Multimodal signal processing

Published Papers (9 papers)


Research


19 pages, 1448 KiB  
Article
What Early User Involvement Could Look Like—Developing Technology Applications for Piano Teaching and Learning
by Tina Bobbe, Luca Oppici, Lisa-Marie Lüneburg, Oliver Münzberg, Shu-Chen Li, Susanne Narciss, Karl-Heinz Simon, Jens Krzywinski and Evelyn Muschter
Multimodal Technol. Interact. 2021, 5(7), 38; https://doi.org/10.3390/mti5070038 - 15 Jul 2021
Cited by 14 | Viewed by 4069
Abstract
Numerous technological solutions have been proposed to promote piano learning and teaching, but very few with market success. We are convinced that users’ needs should be the starting point for an effective and transdisciplinary development process of piano-related Tactile Internet with Human-in-the-Loop (TaHIL) applications. Thus, we propose to include end users in the initial stage of technology development. We gathered insights from adult piano teachers and students through an online survey and digital interviews. Three potential literature-based solutions have been visualized as scenarios to inspire participants throughout the interviews. Our main findings indicate that potential end users consider posture and body movements, teacher–student communication, and self-practice as crucial aspects of piano education. Further insights resulted in so-called acceptance requirements for each scenario, such as enabling meaningful communication in distance teaching, providing advanced data on a performer’s body movement for increased well-being, and improving students’ motivation for self-practice, all while allowing or even promoting artistic freedom of expression and having an assisting instead of judging character. By putting the users in the center of the fuzzy front end of technology development, we have gone a step further toward concretizing TaHIL applications that may contribute to the routines of piano teaching and learning. Full article
(This article belongs to the Special Issue Musical Interactions)

35 pages, 535 KiB  
Article
Musical Control Gestures in Mobile Handheld Devices: Design Guidelines Informed by Daily User Experience
by Alexandre Clément, Luciano Moreira, Miriam Rosa and Gilberto Bernardes
Multimodal Technol. Interact. 2021, 5(7), 32; https://doi.org/10.3390/mti5070032 - 27 Jun 2021
Cited by 3 | Viewed by 3802
Abstract
Mobile handheld devices, such as smartphones and tablets, have become some of the most prominent ubiquitous terminals within the information and communication technology landscape. Their transformative power within the digital music domain changed the music ecosystem from production to distribution and consumption. Of interest here is the ever-expanding number of mobile music applications. Despite their growing popularity, their design in terms of interaction perception and control is highly arbitrary. It remains poorly addressed in related literature and lacks a clear, systematized approach. In this context, our paper aims to provide the first steps towards defining guidelines for optimal sonic interaction design practices in mobile music applications. Our design approach is informed by user data in appropriating mobile handheld devices. We conducted an experiment to learn links between control gestures and musical parameters, such as pitch, duration, and amplitude. A twofold action—reflection protocol and tool-set for evaluating the aforementioned links—are also proposed. The results collected from the experiment show statistically significant trends in pitch and duration control gesture mappings. On the other hand, amplitude appears to elicit a more diverse mapping approach, showing no definitive trend in this experiment. Full article
(This article belongs to the Special Issue Musical Interactions)

21 pages, 1738 KiB  
Article
FeelMusic: Enriching Our Emotive Experience of Music through Audio-Tactile Mappings
by Alice Haynes, Jonathan Lawry, Christopher Kent and Jonathan Rossiter
Multimodal Technol. Interact. 2021, 5(6), 29; https://doi.org/10.3390/mti5060029 - 31 May 2021
Cited by 13 | Viewed by 5369
Abstract
We present the concept of FeelMusic and evaluate an implementation of it. It is an augmentation of music through the haptic translation of core musical elements. Music and touch are intrinsic modes of affective communication that are physically sensed. By projecting musical features such as rhythm and melody into the haptic domain, we can explore and enrich this embodied sensation; hence, we investigated audio-tactile mappings that successfully render emotive qualities. We began by investigating the affective qualities of vibrotactile stimuli through a psychophysical study with 20 participants using the circumplex model of affect. We found positive correlations between vibration frequency and arousal across participants, but correlations with valence were specific to the individual. We then developed novel FeelMusic mappings by translating key features of music samples and implementing them with “Pump-and-Vibe”, a wearable interface utilising fluidic actuation and vibration to generate dynamic haptic sensations. We conducted a preliminary investigation to evaluate the FeelMusic mappings by gathering 20 participants’ responses to the musical, tactile and combined stimuli, using valence ratings and descriptive words from Hevner’s adjective circle to measure affect. These mappings, and new tactile compositions, validated that FeelMusic interfaces have the potential to enrich musical experiences and be a means of affective communication in their own right. FeelMusic is a tangible realisation of the expression “feel the music”, enriching our musical experiences. Full article
(This article belongs to the Special Issue Musical Interactions)

16 pages, 2005 KiB  
Article
Using High-Performance Computers to Enable Collaborative and Interactive Composition with DISSCO
by Sever Tipei, Alan B. Craig and Paul F. Rodriguez
Multimodal Technol. Interact. 2021, 5(5), 24; https://doi.org/10.3390/mti5050024 - 05 May 2021
Cited by 2 | Viewed by 3320
Abstract
Composers do not usually collaborate with other composers but, for the last half century, open works were created that invite performers to implement details left undetermined or even decide the order in which various sections of the composition are to be played. Chance operations were also used in the writing of musical pieces and, in music generated with the assistance of computers, controlled randomness found its place. This article proposes a platform designed to encourage collaborative and interactive composition on high-performance computers with DISSCO (Digital Instrument for Sound Synthesis and Composition). DISSCO incorporates random procedures as well as deterministic means of defining the components of a piece. It runs efficiently on the Comet supercomputer of the San Diego Supercomputer Center and uses the Jupyter notebook environment to integrate the end-to-end processes with a user. These tools, the implementation platform, and the collaboration management are discussed in detail. Comments regarding aesthetic implications of the partnership between one or more humans and computer—considered a bona fide collaborator—are also provided. Possible future developments are supplied at the end. Full article
(This article belongs to the Special Issue Musical Interactions)

25 pages, 1375 KiB  
Article
Comprehensive Framework for Describing Interactive Sound Installations: Highlighting Trends through a Systematic Review
by Valérian Fraisse, Marcelo M. Wanderley and Catherine Guastavino
Multimodal Technol. Interact. 2021, 5(4), 19; https://doi.org/10.3390/mti5040019 - 11 Apr 2021
Cited by 3 | Viewed by 4892
Abstract
We report on a conceptual framework for describing interactive sound installations from three complementary perspectives: artistic intention, interaction and system design. Its elaboration was informed by a systematic review of 181 peer-reviewed publications retrieved from the Scopus database, which describe 195 interactive sound installations. The resulting taxonomy is based on the comparison of the different facets of the installations reported in the literature and on existing frameworks, and it was used to characterize all publications. A visualization tool was developed to explore the different facets and identify trends and gaps in the literature. The main findings are presented in terms of bibliometric analysis, and from the three perspectives considered. Various trends were derived from the database, among which we found that interactive sound installations are of prominent interest in the field of computer science. Furthermore, most installations described in the corpus consist of prototypes or belong to exhibitions, output two sensory modalities and include three or more sound sources. Beyond the trends, this review highlights a wide range of practices and a great variety of approaches to the design of interactive sound installations. Full article
(This article belongs to the Special Issue Musical Interactions)

22 pages, 906 KiB  
Article
The Power of Gaze in Music. Leonard Bernstein’s Conducting Eyes
by Isabella Poggi, Loredana Ranieri, Ylenia Leone and Alessandro Ansani
Multimodal Technol. Interact. 2020, 4(2), 20; https://doi.org/10.3390/mti4020020 - 20 May 2020
Cited by 5 | Viewed by 4735
Abstract
The paper argues for the importance and richness of gaze communication during orchestra and choir conduction, and presents three studies on this issue. First, an interview with five choir and orchestra conductors reveals that they are not so deeply aware of the potentialities of gaze to convey indications in music performance. A conductor who was utterly conscious of the importance of gaze communication, however, is Leonard Bernstein, who conducted a performance of Haydn’s Symphony No. 88 using his face and gaze only. Therefore, a fragment of this performance is analyzed in an observational study, where a qualitative analysis singles out the items of gaze exploited by Bernstein and their corresponding meanings. Finally, a perception study is presented in which three of these items are submitted to expert, non-expert, and amateur participants. The results show that while the signal for “start” is fairly well recognized, the other two, “pay attention” and “crescendo and accelerando”, are more difficult to interpret. Furthermore, significant differences in gaze item recognition emerge among participants: experts not only recognize them more, but they also take advantage of viewing the items with audio-visual vs. video-only presentation, while non-experts do not take advantage of audio in their recognition. Full article
(This article belongs to the Special Issue Musical Interactions)

15 pages, 3184 KiB  
Article
Promoting Contemplative Culture through Media Arts
by Jiayue Wu
Multimodal Technol. Interact. 2019, 3(2), 35; https://doi.org/10.3390/mti3020035 - 21 May 2019
Cited by 2 | Viewed by 3175
Abstract
This paper presents the practice of designing mediation technologies as artistic tools to expand the creative repertoire to promote contemplative cultural practice. Three art–science collaborations—Mandala, Imagining the Universe, and Resonance of the Heart—are elaborated on as proof-of-concept case studies. Scientifically, the empirical research examines the mappings from (bodily) action to (sound/visual) perception in technology-mediated performing art. Theoretically, the author synthesizes media arts practices on a level of defining general design principles and post-human artistic identities. Technically, the author implements machine learning techniques, digital audio/visual signal processing, and sensing technology to explore post-human artistic identities and give voice to underrepresented groups. Realized by a group of multinational media artists, computer engineers, audio engineers, and cognitive neuroscientists, this work preserves, promotes, and further explores contemplative culture with emerging technologies. Full article
(This article belongs to the Special Issue Musical Interactions)

Other


35 pages, 862 KiB  
Perspective
An Introduction to Musical Interactions
by Insook Choi
Multimodal Technol. Interact. 2022, 6(1), 4; https://doi.org/10.3390/mti6010004 - 02 Jan 2022
Cited by 1 | Viewed by 3391
Abstract
The article presents a contextual survey of eight contributions in the special issue Musical Interactions (Volume I) in Multimodal Technologies and Interaction. The presentation includes (1) a critical examination of what it means to be musical, to devise the concept of music proper to MTI as well as multicultural proximity, and (2) a conceptual framework for instrumentation, design, and assessment of musical interaction research through five enabling dimensions: Affordance; Design Alignment; Adaptive Learning; Second-Order Feedback; Temporal Integration. Each dimension is discussed and applied in the survey. The results demonstrate how the framework provides an interdisciplinary scope required for musical interaction, and how this approach may offer a coherent way to describe and assess approaches to research and design as well as implementations of interactive musical systems. Musical interaction stipulates musical liveness for experiencing both music and technologies. While music may be considered ontologically incomplete without a listener, musical interaction is defined as ontological completion of a state of music and listening through a listener’s active engagement with musical resources in multimodal information flow. Full article
(This article belongs to the Special Issue Musical Interactions)

8 pages, 189 KiB  
Perspective
Representations, Affordances, and Interactive Systems
by Robert Rowe
Multimodal Technol. Interact. 2021, 5(5), 23; https://doi.org/10.3390/mti5050023 - 01 May 2021
Cited by 4 | Viewed by 3245
Abstract
The history of algorithmic composition using a digital computer has undergone many representations—data structures that encode some aspects of the outside world, or processes and entities within the program itself. Parallel histories in cognitive science and artificial intelligence have (of necessity) confronted their own notions of representations, including the ecological perception view of J.J. Gibson, who claims that mental representations are redundant to the affordances apparent in the world, its objects, and their relations. This review tracks these parallel histories and how the orientations and designs of multimodal interactive systems give rise to their own affordances: the representations and models used expose parameters and controls to a creator that determine how a system can be used and, thus, what it can mean. Full article
(This article belongs to the Special Issue Musical Interactions)