Special Issue "Musical Interactions"
A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).
Deadline for manuscript submissions: 31 March 2019
Music is a structured sonic event for listening that evolves around literature, performance repertoire, and instrument design. Musical interaction takes place in a framework where a tight loop of sensory integration, cognitive appraisal, and multimodal coordination is carried out as time-critical processing, driven by social interaction that is often only implied. Music making and musical affect are conducive to multimodal associations, as reflected in song, dance, and ceremony, and in film and video game soundtracks.
The multimodality of musical performance and of the listening experience is well recognised in research across multimedia modelling, music information retrieval, music therapy, enactive interfaces, and new interfaces for musical expression. Embedded systems for multimedia and action sensing have proliferated in mobiles and wearables, personal data appliances, equipment for medicine, sports, and wellness, as well as game play and socialisation. Across this spectrum, where is it desirable to apply musical interactions and the lessons we have learned from them?
A listener’s auditory percept is transformed by seeing musicians play. As in speech production and recognition, there is always an element of re-enacting motor production while listening. Playing while listening creates a context for multimodal experience that can be enhanced by AI and sensory signal processing. Musical interactions can be defined over a wide range of models and simulations, from physical and biological to linguistic and emotional. Players of all ages and skill levels may engage in musical play, which has known benefits ranging from increased social engagement to resilience against brain injury and aging.
This Special Issue aims to bring together perspectives from interdisciplinary research domains such as AI, HCI, multimodal interaction, playful interfaces, music-supported therapy, and neuroscience, in order to generate insights and a better understanding of multisensory experience and multimodality through musical interaction, with implications reaching beyond the domain of music.
Guest Editor: Dr. Insook Choi
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and a short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international, peer-reviewed, open access quarterly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) is waived for well-prepared manuscripts submitted to this issue. Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
- Human-computer interaction
- Multimodal interaction
- Playful interfaces
- Music-supported therapy
- Multimodal cognition
- User-related studies
- Music computation
- Multimodal signal processing