Correction published on 29 June 2020, see Multimodal Technol. Interact. 2020, 4(3), 34.
Article

Accessible Digital Musical Instruments—A Review of Musical Interfaces in Inclusive Music Practice

Emma Frid
KTH Royal Institute of Technology, Lindstedtsvägen 3, 100 44 Stockholm, Sweden
Multimodal Technol. Interact. 2019, 3(3), 57; https://doi.org/10.3390/mti3030057
Submission received: 1 April 2019 / Revised: 16 May 2019 / Accepted: 25 June 2019 / Published: 26 July 2019

Abstract

Current advancements in music technology enable the creation of customized Digital Musical Instruments (DMIs). This paper presents a systematic review of Accessible Digital Musical Instruments (ADMIs) in inclusive music practice. The history of research concerned with facilitating inclusion in music-making is outlined, and the current state of developments and trends in the field are discussed. Although the use of music technology in music therapy contexts has attracted more attention in recent years, the topic has been relatively unexplored in Computer Music literature. This review investigates a total of 113 publications focusing on ADMIs. Based on the 83 instruments in this dataset, ten control interface types were identified: tangible controllers, touchless controllers, Brain–Computer Music Interfaces (BCMIs), adapted instruments, wearable controllers or prosthetic devices, mouth-operated controllers, audio controllers, gaze controllers, touchscreen controllers and mouse-controlled interfaces. The majority of the ADMIs were tangible or physical controllers. Although the haptic modality could potentially play an important role in musical interaction for many user groups, relatively few of the ADMIs (14.5%) incorporated vibrotactile feedback. Aspects judged to be important for successful ADMI design were instrument adaptability and customization, user participation, iterative prototyping, and interdisciplinary development teams.

Graphical Abstract

1. Introduction

According to Article 27 of the Universal Declaration of Human Rights [1], “Everyone has the right to freely participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits”. Moreover, Article 19 states that “Everyone has the right to freedom of opinion and expression (…)”. The World Health Organization’s International Classification of Functioning, Disability and Health (ICF) model, the most recent and comprehensive model of functioning and disability to date, recognizes the desirability of full participation in society [2]. In other words, everyone should, if they want to do so, have the right to express themselves through music. Despite this, music-making is still not easily accessible to everyone.
The history of electronic musical instruments (i.e., musical instruments that produce sound using electronic circuitry) dates back to the 19th century. Early examples of electronic instruments include the Telharmonium, Audion Piano, Thereminvox, Ondes Martenot, Trautonium and Novachord [3]. The idea of using music technology to promote inclusion can be traced back to the development of the Harold Rhodes electric piano [4], an instrument designed during World War II to be played by patients on hospital bed trays. Since the development of this instrument, there have been numerous attempts to create music technology for people with health conditions or impairments. Up to the 1980s, most of these efforts focused on mechanical adaptations of acoustic musical instruments, but with the advancement of MIDI and microcomputers, hardware and software developments coincided with developments in rehabilitation engineering, resulting in an increased interest in alternative controllers and music technology applications [5].
Recently, there has been a growing interest in the use of electronic technologies in music therapy [6]. According to Samuels [7], the availability of assistive music technology (AMT), accessible Digital Musical Instruments (DMIs), and mainstream adapted devices for persons with disabilities continues to grow and diversify. Today, there are several initiatives and organizations focusing on these topics, including the Drake Music Project [8], OpenUp Music [9] and the Adaptive Use Musical Instruments (AUMI) Project [10]. Several companies are specifically dedicated to producing DMIs for those with physical disabilities (e.g., Human Instruments [11]), and conferences such as the OHMI Conference and Awards—Music and Physical Disability: From Instrument to Performance [12] are organized on the topic. Numerous musical instruments intended for use in music therapy settings are also commercially available (e.g., Soundbeam [13], Beamz [14], Skoog [15], Magic Flute [16], Quintet [17], Jamboxx Pro [18], BioVolt [19] and Groovtube [20], see Figure 1). Advances in open source technologies and low-cost DIY components have made customized musical tools easily accessible [21]. Together with software implementing machine learning methods for simple gesture recognition (see e.g., [22,23]), modern music technologies enable customization of DMIs so that they can be adapted to many user groups, regardless of their abilities, in a way that embraces diversity. In the Cambridge English Dictionary, the term diversity is defined as “the fact of many different types of things or people being included in something; a range of different things or people” [24]. Thanks to the availability of cheap technologies and hackable tools, custom-built instruments that can be adapted relatively easily to individual gestural capabilities can now be produced. However, despite these technical advancements and increased possibilities, no full review has yet been published on the topic of DMIs for persons with diverse abilities.
In the current work, the term “persons with diverse abilities” is used to refer to user groups that are excluded from music-making activities. The premise of this review is that people are disabled by only two things: other people’s perceptions and their environment. Such wording transforms the view of the human body from a limitation into an empowering unit with abilities. Thus, we can regard all people as musicians with a spectrum of different abilities. For example, in work presented by Machover [25], the author argues that many children have been excluded from active participation in music, especially when it comes to composing music, because of its theoretical and technical difficulty. Machover poses the question: “What if we could unlock the expressive mysteries of music first, before learning the technical foundations, if we could help young people—and others—fall in love with the joys of music first, subsequently demanding deeper knowledge once they were ‘hooked’?” While several studies have tried to address these issues (see for example work presented in [26,27]), this quote is still highly relevant, and it resonates well with the material presented in the current paper.
This review focuses on accessible musical control interfaces used in electronic music, inclusive music practice and music therapy settings, interfaces that are here referred to as Accessible Digital Musical Instruments (ADMIs) [28]. The work expands on the exploratory review presented in [28], in which proceedings from the International Conference on New Interfaces for Musical Expression (NIME), the Sound and Music Computing Conference (SMC) and the International Computer Music Conference (ICMC) were systematically analyzed to explore advancements, the current state of developments, and trends in research on accessible DMIs. In this previous work, seven control interface types were identified: tangible, nontangible, audio, touchscreen, gaze, Brain–Computer Music Interfaces (BCMIs), and adapted instruments. The majority of these instruments were designed for persons with physical disabilities or children with health conditions or impairments. Although this previous study may give hints about the current state of developments in the field, it focused only on conference proceedings. The current study aims to expand on this previous work and include other relevant literature published in the field. The purpose is to identify the current state of developments, user groups that may be under-represented, and areas of improvement. The current study goes beyond previously published review papers (see [29,30,31,32]) in the sense that it includes not only persons with physical disabilities or those with complex needs in special education (SEN) settings, but also other groups affected by any type of health condition. The main conclusion that could be drawn from the review presented in this paper is that the majority of the ADMIs were tangible or physical controllers. Although the haptic modality could potentially play an important role in musical interaction for many user groups, relatively few ADMIs incorporated vibrotactile feedback. If properly designed, Accessible Digital Musical Instruments could potentially be used as a tool for empowerment. Through this work, I hope to raise awareness of topics such as inclusion and diversity in Computer Music practice, thereby enabling and promoting music-making for all.

2. Background

2.1. Digital Musical Instruments (DMIs)

Robert Moog defined the concept of a Digital Musical Instrument using a modular description consisting of three parts: “the sound generator, the interface between the musician and the sound generator and the tactile and visual reality of the instrument that makes a musician feel good when using it” [33]. Another definition was presented by Pressing [34], who viewed the DMI from the perspective of a control interface, a processor and output. The assumption that an electronic instrument consists only of an interface and a sound generator was challenged by Hunt et al. [35], who emphasized the importance of mapping between input and system parameters, suggesting that mappings can define the essence of an instrument. Similarly, Miranda and Wanderley [36] defined DMIs as instruments consisting of a control surface (a gestural or performance controller, an input device, or a hardware interface) and a sound generation unit. These different parts can be viewed as independent modules relating to each other through mapping strategies. Based on existing work, guidelines for mapping strategies (i.e., the designed link between an instrument’s playing interface and its sound source) were provided in yet another study conducted by Hunt and Wanderley [37]. Thinking of DMIs as modular systems whose elements can be modified or adapted, one can tailor instruments to a certain individual’s capabilities and needs, both in terms of interaction with the system (sensor inputs and gestural capabilities) and other ways in which the player may provide energy to the system [32]. In the current paper we discuss ADMIs [28], here defined as “accessible musical control interfaces used in electronic music, inclusive music practice and music therapy settings”.
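To make this modular view concrete, the following minimal sketch (in Python; the class names and the linear pitch mapping are illustrative assumptions, not taken from any reviewed instrument) decomposes a DMI into a control surface, a mapping layer and a sound generation unit:

```python
# A minimal sketch of the modular DMI view: controller -> mapping -> synthesis.
# All names and the mapping itself are illustrative assumptions.

class ControlSurface:
    """Gestural controller: yields a normalized sensor value in [0, 1]."""
    def read(self) -> float:
        return 0.5  # placeholder for e.g., a pressure or distance sensor

def mapping(sensor_value: float) -> float:
    """Mapping layer: the designed link between playing interface and sound
    source. Here, a simple linear map from sensor value to pitch in Hz."""
    return 220.0 + sensor_value * 440.0  # 220-660 Hz

class SoundGenerator:
    """Sound generation unit: receives synthesis parameters."""
    def play(self, frequency: float) -> None:
        print(f"synthesizing a tone at {frequency:.1f} Hz")

# Because the modules are independent, any one of them (sensor, mapping or
# synth) can be swapped to adapt the instrument to a player's capabilities.
surface, synth = ControlSurface(), SoundGenerator()
synth.play(mapping(surface.read()))
```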

2.2. Inclusive Music Practice

The terms used to describe research focusing on making DMIs accessible are many, and they also overlap; there appears to be no strong consensus or definition that is commonly agreed upon. For example, the term “adapted (adaptive) music” refers to the field of research concerned with the development and implementation of adaptations that facilitate full participation in music-making by people with health conditions or impairments [5]. It presumes that music-making in itself is a mode of human activity that requires no justification beyond its own praxis, considering music as one of the basic human rights [5]. Adapted music and music therapy are related areas that may be understood as distinct but overlapping [5]; music therapists use musical adaptations and have contributed significantly to the adapted music literature (see e.g., [38,39,40,41]). The term “adaptive music” is also used by Graham-Knight and Tzanetakis [30], who define it as the use of digital technologies to allow a person who cannot otherwise play a traditional musical instrument to play music unaided. Moreover, the word “adaptive” is used in a study by Vamvakousis and Ramirez [42], who specifically refer to the notion of “Adaptive Digital Musical Instruments”. Among these terms, one also encounters “inclusive music” [7], defined by Samuels as the use of music interfaces aimed at overcoming disabling barriers to music-making faced by people with disabilities [21]. These barriers can be viewed through two predominant theoretical models: the medical and the social model [43]. The medical model locates the disabling factor within the musician, whereas the social model perceives the exclusionary designs of musical interfaces and non-inclusive attitudes as disabling factors, thereby shifting focus to the implementation of techniques and assistive technologies in order to overcome barriers in music-making [21]. Moreover, the term “assistive music technology” (AMT) also appears in the literature (see e.g., [7,44,45,46]). However, as stated in a previous study [30], the word “assistance” implies an external source that provides aid to a person in need, whereas “adaptive” implies a constant state of refinement and adjustment to the musician. Finally, in recent work [47], Harrison and McPherson use the term “accessible instruments”, making a distinction between two categories of instruments designed for people with disabilities: “therapeutic devices” and “performance-focused instruments”. The term “accessible DMI” was also used in [7], among other terms.
A central concept in inclusive music practice is the notion of empowerment. Empowerment is the process of gaining control over events, outcomes and resources of importance to an individual or group [48]. As empowerment is always happening and unfolding in culture, there are several definitions emphasizing different aspects of the concept [49]. In music therapy, we might view empowerment as a concept relating to clinical processes enabling a client to participate [49]. The term “musical empowerment” should not be considered so much a process of acquiring a certain amount of culturally valued musical skills or resources, as a process of regaining rights to music [49]. From an embodied cognition paradigm point of view (in which the human motor system as well as gestures and body movements play a crucial role in the perception of music, see [50,51]), technology could be considered an extension of the body; a malleable tool that can be used by persons with restricted mobility or cognitive problems in order to stimulate self-expression, creative composition, and motor rehabilitation. In a recent study [52], Partesotti et al. refer to this sensation of control as “creative empowerment”: something that occurs when a continuous and cyclical interaction between user and technology is enabled. A person immersed in a DMI-based system can thus express her/himself in a way that strengthens experiences of resilience, while at the same time producing therapeutic benefits. To conclude, measurements of empowerment could focus on the ability to participate and control a musical instrument, either in a therapeutic process or a musical performance.
Research on inclusive music practice has often emphasized facilitated processes [53]. Historically, a key focus has been on MIDI controllers with switches that trigger acoustic events, such as Quintet [17] (see Figure 2). MIDI (Musical Instrument Digital Interface) is a specification of a communication scheme for music control and timing, developed for digital music devices (see [54]). Examples of pioneering work in this context include the MidiGrid and MidiCreator. The MidiGrid system is a computer program developed to control electronic synthesizers over the MIDI standard [39]. The MidiCreator converts electrical signals from electrical transducer inputs into MIDI signals, which can then drive MidiGrid [39]. Today, new technologies and sensors have enabled a wide range of different and complex controllers, ranging from Brain–Computer Music Interfaces (BCMIs) to eye-controlled music interfaces and breath controllers.
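As an illustration of this switch-to-MIDI pattern, the sketch below (a rough approximation only; the MidiCreator itself is a hardware device, and the port and note choices here are assumptions) converts a binary switch closure into a MIDI note-on/note-off pair using the third-party Python library mido:

```python
# Hedged sketch: turn a binary switch signal into MIDI note messages,
# conceptually similar to what MidiCreator-style devices do in hardware.
# Requires the 'mido' library and a MIDI backend (e.g., python-rtmidi).
import time
import mido

def read_switch() -> bool:
    """Placeholder for a hardware switch input (e.g., via a microcontroller)."""
    return True

with mido.open_output() as port:      # opens the default MIDI output port
    if read_switch():                 # switch closed -> trigger a note
        port.send(mido.Message('note_on', note=60, velocity=100))
        time.sleep(0.5)               # hold the note briefly
        port.send(mido.Message('note_off', note=60))
```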

2.3. Related Work

Adaptive music technology has been relatively unexplored in the Computer Music literature [30]. In a previous study [30], Graham-Knight and Tzanetakis presented an overview of existing work in the field. Informed by this overview, they proposed a set of principles for how to work with a participant with disabilities in the development of a new musical instrument. These principles include introducing the participant to the technology, determining the range of motion of the participant, enabling the participant to produce sound quickly, developing a system for activating sounds that are reproducible, evolving a relationship with the participant that extends beyond music, making improvements incrementally, and evolving a set of exercises that can be performed to increase mastery of the instrument. The authors suggest dividing DMIs into the following categories: touchless sensor musical instruments, breath pressure sensor musical instruments, biosensor musical instruments, video-based systems and “other” adaptive musical instruments. A related study focusing on musical instruments for people with physical disabilities was presented by Larsen et al. [29]. In this work, topics such as custom-designed instruments, augmentations/modifications of existing instruments and music-supported therapy were covered. The authors also discussed the potential of 100% adaptive instruments, customizable to specific user needs. More recent work includes the exploratory review by Partesotti et al. [52], focusing on DMIs based on motion tracking in the context of music therapy. The authors discuss how specific features of DMIs could benefit music therapy sessions and propose future lines of research concerned with designing multimodal and empowerment-based technologies. Finally, work presented in [55] focused on accessible instruments for disability, considering both commercial products and instruments presented at NIME (the International Conference on New Interfaces for Musical Expression) and in related research. Results suggested that the commercial instruments were mainly MIDI controllers that, whatever their physical configuration, managed musical events on a note-by-note or sequence-level basis. In contrast, the academic projects provided the performer with a higher level of control of precomposed or algorithmically generated material.
It has been claimed that the use of technology in clinical practice remains an inadequately researched topic [6]. In previous work [56], a total of 600 music therapists completed a survey about the use of music technology in clinical settings. Music therapists reported using music technology clinically, but many lacked formal training. Interestingly, more men than women or transgender music therapists reported using music technology in their practice. Some clinical recommendations regarding the opportunities and limitations of using technology in music therapy were given in the work by Magee and Burland [6]. Findings suggested that music therapists turn to technology to enable a client to participate actively, or to widen the client’s musical expression. However, technology was experienced as offering a lesser aesthetic experience than acoustic instruments. In another study conducted by Magee and Burland, interviews with music therapists experienced in using electronic equipment with specialized input devices for MIDI-generated sound were analyzed [57]. The authors presented results in the form of a suggested five-step treatment model for the use of technology with persons with complex needs. Guidelines for the use of music technologies in music therapy practice in pediatric medical settings have also been presented in [58].
Several literature reviews on the topic of music technology and music therapy have been conducted. For example, Crowe and Rio [59] reviewed the use of technology in music therapy practice, separating technologies into four categories: adapted musical instruments, recording technology, assistive technology for “the disabled” and technology-based music/sound healing practices. Naturally, most reviews on the use of technology in music therapy focus on particular populations of interest. In [31], Farrimond et al. presented a review of the use of music technology in SEN and disabled music settings. The authors argue that there has been a lack of robust, quantitative, longitudinal studies in this field, and that the type of data collected in previous studies creates issues when assessing the appropriateness or effectiveness of the technologies used to address barriers to participation. To be more precise, a disproportionate amount of research was based on case studies with very small groups of musicians with disabilities, or in SEN settings. The authors also identified five categories of musical control interfaces: distance- and motion-tracking technology, touchscreen technology, tangible interfaces, wind controllers and biometrics (using e.g., electromyography (EMG) sensors). Other relevant work in this field includes the study presented in [32], in which issues raised in the literature and practice of music technology in SEN settings were reviewed.

3. Method

3.1. Data Collection

This review aimed to identify and summarize studies focusing on ADMIs. Since only one person carried out the literature search, the method was developed prior to data collection in order to minimize the risk of bias. Initially, a table of key concepts covered by the research question was developed. After that, initial searches were carried out to find controlled vocabulary. Results are presented in Table 1. A comprehensive search of the following databases was subsequently carried out: Web of Science Core Collection, Scopus, PubMed, Google Scholar, dblp computer science bibliography, and Ergonomics Abstracts. However, some of the databases were dropped, as the retrieved search outcomes proved not to be relevant to the topic of interest. The final search phrase was of the form: Concept 1 (OR synonyms) AND Concept 2 (OR synonyms) AND Concept 3 (OR synonyms); a sketch of how such a query can be assembled is given at the end of this subsection. For the databases that provided such a feature, the search was carried out using the “all fields” setting (this includes title, keywords, abstracts, etc.). The following databases were included in the final search: Scopus, Google Scholar, and Web of Science Core Collection. The searches resulted in 91 papers from Scopus, 122 papers from Google Scholar and 11 papers from Web of Science. The final search phrases were: (“Digital Music* Instrument*” OR “New Interface* for Musical Expression” OR “music* interface*” OR “music controller”) AND (accessib* OR adapt* OR assistive OR inclus* OR empower*) AND (disabilit* OR health OR need OR impairment OR therap* OR disorder*). From this original dataset, studies were selected for inclusion by comparing the information in titles and abstracts against a set of inclusion criteria. In order to be included in the systematic review, a study had to fulfill the following:
  • The study should present at least one ADMI.
  • The paper should focus primarily on ADMIs, mentioning the potential user group(s) in either the title or the abstract.
  • The paper should describe a practical implementation of an ADMI that enabled real-time manipulation of input or control data; theoretical papers and reviews were not included.
Publications considered were conference proceedings, journal papers, PhD theses and book chapters written in English. Studies on (audio) games, sonification, or auditory augmentation were not included, as such papers did not primarily focus on the use of DMIs. No date restriction was employed; all studies were included regardless of year of publication. Apart from the papers that met the inclusion criteria specified above, the papers listed in the previous study [28] were also added to the dataset (as this earlier review used the same inclusion criteria). Reference lists (i.e., titles) of the retrieved journal and conference papers, as well as reference lists of papers (not entire books) presented in the current paper’s Background section, were subsequently searched manually to find additional papers relevant to the review. Abstracts of pertinent papers were then skimmed to verify that they met the defined inclusion criteria. In total, the final version of the dataset contained 113 publications, approximately half of which came from the cross-reference search and half from the initial database searches. The relatively large number of publications obtained from the cross-reference search is likely explained by the fact that many different terms are used to describe and define ADMIs. Among other terms, authors have described their instruments as “musicking tangibles”, “interactive installations”, “musical tangibles”, “musical environments”, “electronic musical instruments”, “adaptive instruments”, “accessible instruments” and “MMEs” (“multisensory multimedia environments”, see e.g., Snoezelen [60]), depending on research tradition.
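To make the concept-based query form concrete, the following sketch (an illustration only, not the actual search script used in this study) assembles the final search phrase from the three concept groups listed above:

```python
# Illustrative sketch: build "Concept 1 (OR synonyms) AND Concept 2 (OR
# synonyms) AND Concept 3 (OR synonyms)" from the concept groups in the text.
concepts = [
    ['"Digital Music* Instrument*"', '"New Interface* for Musical Expression"',
     '"music* interface*"', '"music controller"'],
    ["accessib*", "adapt*", "assistive", "inclus*", "empower*"],
    ["disabilit*", "health", "need", "impairment", "therap*", "disorder*"],
]

# Join synonyms with OR inside parentheses, then join the concepts with AND.
query = " AND ".join("(" + " OR ".join(group) + ")" for group in concepts)
print(query)
```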

3.2. Analysis

Naturally, the cross-reference search resulted in a set of publications that dealt with similar topics or related projects, i.e., material in which the same DMI was presented, for example, in both a conference and a journal paper. Quantitative analysis was therefore mainly carried out at the instrument level, so that the number of publications focusing on the same instrument would not bias descriptive statistics. For the papers included in the dataset, the DMIs were systematically analyzed using a set of predefined categories: control interface type, target user group, output modalities, music generation and sound synthesis, sensor technologies, design process, and system evaluation. Sections and phrases from the respective publications judged to be relevant for these categories were copied from the publications into an Excel sheet. The instruments were labeled and categorized using both quantitative (e.g., descriptive statistics) and qualitative measures (content analysis based on emergent themes).
In cases where an instrument was represented in several papers, the most recent publication was used when classifying and analyzing the ADMI (minor changes or iterations of the same instrument, such as providing different GUIs for a touch-pad instrument, were considered merely as modifications of the same instrument, not as two distinct instruments). Journal articles were preferred over conference proceedings papers. For cases in which several DMIs were presented in the same setting, e.g., in a multisensory environment or installation context, the instruments were considered as separate DMIs. Some instruments were excluded since the corresponding papers did not provide sufficiently detailed descriptions of the instrument’s design or function.
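As an illustration of the instrument-level tabulation described above, the sketch below (with hypothetical rows and column names; the actual analysis was carried out in an Excel sheet rather than in code) computes descriptive statistics with pandas:

```python
# Hypothetical sketch of instrument-level descriptive statistics.
import pandas as pd

# One row per instrument (not per publication), so that several papers on the
# same instrument do not bias the counts; the rows here are invented examples.
instruments = pd.DataFrame({
    "name": ["Instrument A", "Instrument B", "Instrument C"],
    "interface_type": ["tangible", "touchless", "tangible"],
    "vibrotactile_feedback": [False, True, False],
})

print(instruments["interface_type"].value_counts(normalize=True) * 100)
print(instruments["vibrotactile_feedback"].mean() * 100, "% vibrotactile")
```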

4. Results

4.1. Publications

A plot depicting the number of publications per year is displayed in Figure 3. The most recent publication was published in 2018, the oldest one in 1990. A Pearson correlation coefficient was computed to assess the relationship between the number of publications and year. There was a positive correlation between the two variables, r = 0.77, n = 23, p < 6.637 × 10⁻⁶. As seen in Table 2, the majority of the publications were published as conference proceedings, followed by journal papers. The heterogeneity of the field was illustrated by the range of different venues and journals in which the authors decided to publish their work. The most common conference was the International Conference on New Interfaces for Musical Expression (18 papers), followed by the International Computer Music Conference (8 papers) and the Sound and Music Computing Conference (7 papers). Interestingly, the most common journal was the International Journal on Disability and Human Development. A full list of publication details is available as Supplementary Material.
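For reference, a correlation test of this kind can be computed as follows (the publication counts below are hypothetical stand-ins; the actual counts are those underlying Figure 3):

```python
# Sketch of the reported test: Pearson correlation between year and number of
# publications. The data here are hypothetical, not the reviewed dataset.
from scipy.stats import pearsonr

years  = [1990, 1994, 1999, 2004, 2009, 2014, 2018]  # hypothetical years
counts = [1, 1, 2, 3, 5, 9, 12]                      # hypothetical counts

r, p = pearsonr(years, counts)
print(f"r = {r:.2f}, p = {p:.3g}")
```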

4.2. Control Interface Type

In total, 83 instruments were identified. Interfaces were divided into categories based on their main mode of interaction. A plot illustrating the separation into control interface categories can be seen in Figure 4. Percentages represent the ratio of each category in relation to the total number of ADMIs in the dataset (i.e., not in relation to the total number of publications). In cases where two or more means of interaction were used, such as touch and vocals, the one judged to be most important for the musical interaction and sound synthesis was selected as the main mode. The defined control interface categories were: tangible controllers (physical objects that can be touched or set in motion in order to produce sounds), touchless controllers (using sensors or cameras that enable spatial position detection and identification of body movements; these movements did not have to involve actively touching objects), adapted instruments (existing instruments that have been augmented or modified, e.g., hyperinstruments [61] and actuated instruments [62] with mechanical actuation), Brain–Computer Music Interfaces (BCMIs [36], using biometric sensors such as electrodes to monitor Electroencephalography, EEG, signals), gaze controllers (using eye-tracking), touchscreen controllers, audio controllers (using microphones as input), mouth-operated interfaces (e.g., wind controllers using breath sensors), wearable controllers and prosthetic devices (instruments that could be worn or act as an extension of the player’s body [63]) and mouse-controlled interfaces (using a conventional computer mouse, see also Music Mouse by Laurie Spiegel [64]). The number of ADMIs in each category, together with a brief example of an instrument from that category, is presented in Table 3. As in previous work [28], the majority of the ADMIs were classified as tangible interfaces/physical controllers. Examples of interfaces belonging to each category are also presented in Figure 5.

4.3. Sensor and Actuator Use

The most common sensor technologies are summarized in Table 4. The most frequently used sensors were touch sensors, followed by accelerometers and cameras of various sorts. Considering the distribution of control interface categories described in the previous section, the results presented in Table 4 are not very surprising. It is worth noting that the historically important switch-operated technologies were used rather frequently, even though these often provide only binary control signals. Interestingly, not all authors built their own circuits based on microcontrollers and similar technologies; instead, some used commercially available technologies in their work. Examples of technologies used were the Nintendo Wiimote, Microsoft Kinect, Soundbeam, Leap Motion [81] and iPod touch. Regarding actuators, the most frequently used ones were solenoid motors, motorized faders, vibration actuators, and vibration loudspeakers. LED lights were often used to provide simultaneous visual feedback.

4.4. Output Modalities

Out of the 83 instruments, a total of 39 (47.0%) presented unimodal feedback. Only one instrument was said to present only vibratory feedback (this instrument was, however, placed in a multisensory room and was thus surrounded by multiple speakers producing sounds based on other interactions); the rest presented only sound. A total of 40 ADMIs (48.2%) presented bimodal feedback. Out of these, 33 were auditory-visual, providing for example an interactive GUI, animations or visualizations on a screen, projections or LED lights, and 7 were auditory-vibrotactile. Only 4 instruments presented trimodal feedback (auditory + visual + vibrotactile), corresponding to 4.8% of all ADMIs. In total, 37 ADMIs (44.6%) provided some kind of visual feedback. Correspondingly, 12 ADMIs (14.5%) incorporated vibrotactile feedback.
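These modality percentages follow directly from the stated counts out of 83 instruments, as the following check illustrates:

```python
# Verify the modality percentages from the counts given in the text.
counts = {"unimodal": 39, "bimodal": 40, "trimodal": 4}
TOTAL = 83  # number of instruments in the dataset

for modality, n in counts.items():
    print(f"{modality}: {n / TOTAL:.1%}")  # 47.0%, 48.2%, 4.8%
```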

4.5. Target User Group

Several authors mentioned that their DMIs could be used by everyone, but that they could be of special interest to users with diverse abilities. Some user groups were more frequently represented than others. For example, 39.8% of the ADMIs were explicitly said to focus on user groups with physical impairments, and 28.9% of the ADMIs were developed for young users and children. Only 4.8% of the ADMIs had older adults as their user group. No ADMIs for infants were mentioned. Moreover, only 3.6% of the ADMIs focused on persons with visual impairment, while 6.0% focused on persons with hearing impairment. Other user groups that were mentioned were persons with learning difficulties (9.6%), autism spectrum disorder (8.4%), special needs (7.2%), complex needs (6.0%), cerebral palsy (6.0%) as well as persons who cannot communicate verbally (6.0%). In addition, some ADMIs were developed for persons with cognitive impairments (4.8%) or persons with quadriplegia (4.8%). The majority of the ADMIs were designed to be used by a single person. However, there were many examples of ADMIs designed to be played together in a music therapy setting. Rather few were large-scale instruments intended to be used by larger groups of people simultaneously. Interestingly, 20.5% of the instruments had been used in performances or concerts at least once.

4.6. Music and Sound Synthesis Control

Interestingly, in many of the papers, aspects concerned with sound synthesis and control were left out or discussed only briefly. More than a third of the ADMIs were not accompanied by a detailed description of the produced sounds. Moreover, discussions of mapping strategies and motivations for the implementation of certain sound models were rarely provided. A substantial part of the ADMIs appear to have been designed using a one-to-one mapping strategy [36], in which one synthesis parameter is driven by one gestural parameter. Depending on the sensor technologies used, the ADMIs varied in level of control. In general, note sequences produced by instruments provided by MIDI libraries (see e.g., [82]) were common. Several ADMIs provided generation of different scales [69] or chords [83]. Prerecorded sounds and samples [84], as well as real-time modulation of recorded vocals [85], were also used. Only a few ADMIs could produce other types of more complex sonic material. Examples included computationally controlled soundscapes [86,87], textures [88], granular synthesis [89], percussive sounds produced by physical modeling techniques [65], abstract sounds produced through FM synthesis, modulation tones and ring modulator effects [90] and modified vocals, piano and live-electronics from a pre-composed piece [91].
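As an example of the one-to-one mapping strategy mentioned above, the following minimal sketch (the scale choice and value ranges are illustrative assumptions, not taken from a specific ADMI) quantizes a single normalized sensor value to the notes of a pentatonic scale:

```python
# Minimal one-to-one mapping sketch: one sensor value drives one parameter,
# here quantized to a C major pentatonic scale (an illustrative assumption).
C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72]  # MIDI note numbers

def sensor_to_note(sensor_value: float) -> int:
    """Map a normalized sensor reading in [0, 1] to one scale degree."""
    index = min(int(sensor_value * len(C_MAJOR_PENTATONIC)),
                len(C_MAJOR_PENTATONIC) - 1)
    return C_MAJOR_PENTATONIC[index]

print(sensor_to_note(0.0), sensor_to_note(0.5), sensor_to_note(0.99))  # 60 67 72
```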

4.7. Design Process and System Evaluation

Most of the ADMIs in the reviewed dataset were developed using iterative design processes or participatory design methods, emphasizing the importance of including users in ongoing tests throughout the instrument’s development process. Such activities involved, for example, co-development with therapists and clients through design workshops, music sessions, and pilot tests in which users tried out prototypes. Preparatory work also involved interviews and surveys with music therapists and teachers, as well as observations of user groups. Pilot studies often focused on evaluating whether a specific interface could successfully be used by users, or on quantifying gestural capabilities of the intended user group in relation to a particular interface. These experiments were usually performed to inform the design of the final instrument. The involvement of users and music therapists appears to have been of great importance in most cases; the development process often involved several cycles in which multiple prototypes were repeatedly tested and modified.
In terms of evaluation, several approaches have been proposed for DMIs (see e.g., [92,93]). Perhaps due to the heterogeneity of the papers discussed in this review, a range of different evaluation methods were adopted. The studies varied in terms of number of participants as well as duration, with projects ranging from weeks to several years. Examples of evaluation methods included iterative prototype evaluations and interviews with users, observations of users interacting with the respective instruments, questionnaires, latency tests, gestural data logging, usage logging, as well as concerts and performances. Overall, the field appears to lack a formal framework for evaluation; many different evaluation protocols were proposed. Nevertheless, the vast majority of the ADMIs involved some type of evaluation, even if different methods were employed. Interestingly, evaluations were conducted both with potential user groups and with neurotypical persons.
Many of the systems reviewed in the current study were at the prototype stage. Reports of continuous use of the ADMIs after the initial study were rare, and there were relatively few accounts of long-term use of the instruments. However, some instruments appear to have been used for longer periods of time. Examples include the MidiGrid and MidiCreator [94], the Musicking Tangibles in the RHYME project [45], and the Sound=Space music installation by Rolf Gehlhaar [77]. The MidiGrid and MidiCreator were used in several studies throughout the years (see e.g., [39,95,96]). The RHYME project [45] involved several studies and iterative generations of tangible instruments. This five-year project was based on a user-oriented research approach, in which participants influenced the development process. Different evaluation methods were used, for example interviews, video-recorded observations, and multidisciplinary discussions. Sound=Space [77] was an electronic musical instrument that could be played by one or several persons at the same time by moving in an empty space, set up as a permanent installation. Numerous workshops, flexible in their nature, were conducted with this instrument.

5. Discussion

This systematic review has presented a comprehensive introduction to the field of ADMIs. Interestingly, the results suggested an increase in the number of publications per year. In addition, the findings indicated that most of the digital instruments developed to promote inclusion in music-making activities were physical, tangible controllers or touchless controllers. Consequently, the most common sensor technologies used were touch sensors, accelerometers, and cameras. These findings go partly in line with results presented in the previous study [28], although the dataset reviewed in that paper focused only on conference proceedings and thus does not represent the full picture. Moreover, the work presented in the current paper resonates with previous findings by Medeiros and Wanderley [97], suggesting that DMIs are often based on a few basic sensor types and that more advanced sensing methods are rare for these instruments. For example, sensor fusion was rarely mentioned, and machine learning methods were seldom used when mapping sensor input to sound synthesis parameters. Moreover, motion capture techniques were relatively rare. It is likely that the choice of technologies was guided by the fact that most authors advocate simple and affordable technical solutions. For such purposes, many modern DIY technologies and cheap sensors are readily available.
In terms of target user groups, one can conclude that little attention has been paid to older adults or very young children in this context. Also, relatively little has been done for persons with visual or hearing impairment. Overall, the majority of the ADMIs were designed for persons with physical impairments and children. Interestingly, despite this, the majority of the ADMIs were tangible controllers.
Almost half of the instruments discussed in this review presented only auditory output. In other words, no actuators or visual systems were used to enhance the music produced by these instruments. This is somewhat surprising, considering that multisensory interfaces could be beneficial for several potential user groups with diverse abilities. Traditional musical instruments allow for multisensory feedback and control. For example, playing a guitar allows you to feel the vibrations of the instrument, while at the same time seeing the movements of the strings and hearing the sonic output. Potentially, the availability of multisensory feedback could thus enhance the sensation of being in control of the creation of sounds, as well as reinforce the relationship between cause and effect in musical interaction. Interestingly, children with learning disabilities have been found to do their best when learning and creating music in a multisensory environment, as the better-functioning modes of learning helped the children to compensate for the dysfunctioning modes [98]. Moreover, so-called multisensory environments (MSSEs) have successfully been used for therapeutic purposes, for example for persons with learning disabilities [99], persons with dementia and anxiety [100], and persons with chronic pain [101]. Several theories have been put forward to account for how and why multisensory therapy can have positive effects on people with multiple disabilities. For example, it has been suggested that individuals with multiple disabilities often have little or limited opportunity to control the environment around them, resulting in feelings of helplessness [102]. A multisensory therapy environment gives these individuals the opportunity to experience control over their environment. In other words, exploiting the haptic modality could potentially be beneficial for certain user groups. Despite the above-mentioned research, only four instruments in total were trimodal, presenting auditory, visual, and vibrotactile feedback. However, it should be noted that some groups might not benefit from certain types of multimodal feedback, as this might be perceived as overwhelming or distracting. For example, children with autism can have hypo- and hypersensitivity to sensory stimuli, with tolerance for e.g., hues and frequency bands that are either below or above average. Such hypo- and hypersensitivities must therefore be addressed by the musical device.
In terms of control, it may be beneficial for some user groups to limit what can be achieved when interacting with an ADMI. For others, it might be essential that the instrument promotes long-term engagement and the possibility to improve over time, offering the musician complexity and a potential for virtuosic control. Being able to develop skills and virtuosity through practice has been advocated as a design goal in the New Interfaces for Musical Expression field (see [103]). However, in the context of inclusive music practice, the focus could also be on developing instruments that simply respond to any movement with aesthetically pleasing sounds, as mentioned in the study presented in [104]. Perhaps this is the reason why several of the ADMIs in this survey focused on triggering relatively simple sound models.
One of the main conclusions that can be drawn from this review is that it is not possible to define precise design guidelines for ADMIs, since different user groups have very different needs. In other words, ADMIs should be constructed in a manner that enables customization and adaptation to every user’s specific needs. Individual differences and preferences may vary considerably even among persons within the same user group. Interestingly, systems that have proved to be successful (in terms of being used over longer periods of time) appear to have allowed for a rather high level of customization. For these interfaces, the musical material could be changed depending on user and context. Moreover, participatory design approaches and adaptability appear to have been important for these instruments. For example, in one of the studies [77], the authors stressed that the facilitators needed a great deal of malleability and adjustment to specific situations when working with Sound=Space. Moreover, according to the authors, the RHYME project [45] built on a multidisciplinary approach, with perspectives from tangible interaction design and inspiration from resource-oriented music therapy and empowerment thinking. To conclude, some key concepts for these instruments were adaptability and customization, user participation, and interdisciplinary development teams.
An unexpected finding from the current study was that descriptions of the implemented sound synthesis models were often left out of the papers entirely, or discussed only briefly. Perhaps customization connected to, for example, the physical requirements of the user was the main concern in these contexts, and the sound design came second. Nevertheless, the participatory design process involved in the development of an ADMI should not only focus on the user’s gestural capabilities, but also on the different sonic outputs and musical possibilities provided by the instrument. Many authors stress the importance of providing adaptive instruments customized to any user’s specific needs. However, little attention in this context seems to have been given to empowerment through sound, i.e., to creating instruments rich in audible output and sonic range. Determining how sounds are perceived by some user groups might be a challenge, for example if the users cannot communicate verbally. In such cases, the interpretation of facial and bodily expressions is especially important, and help from personal assistants and/or teachers might be required to determine how different sounds are perceived in the musical interaction taking place.
The heterogeneity of the ADMI field means that many different terms are used to describe similar technologies and methods, which makes the review process rather complex. This heterogeneity is also reflected in the wide range of venues in which these papers are published. Consequently, design strategies as well as research evaluation methods differ considerably between papers. There appears to be a need for a more unified evaluation framework for ADMIs. In addition, it would be interesting to see more work focusing on how ADMIs evolve over time, i.e., the effects of continuous use and mastering of ADMIs.
Of course, there are numerous examples of existing musical instruments that have not specifically been developed with the purpose of promoting inclusion in music-making, but that have great potential to do so, for example through adaptation to the gestural constraints of a particular user group. Many different instruments, ranging from the Theremin to modern touchless controllers, virtual instruments, and malleable interfaces for music-making, could potentially be adapted to promote inclusion. There is a large body of research within the computer music community that could be applied and adapted to create more inclusive musical interfaces that enable music-making for all, regardless of user needs and abilities. With the current work I wish to inspire future work within the field of inclusive music practice by providing suggestions for how the field of ADMIs could be expanded through the use of new technologies and methods, thereby enabling music-making for under-represented groups.

Limitations

The fact that this review was conducted by only one author might of course influence the reliability of the results presented in this paper. The author therefore found it vital to define the methodology, concepts and classifications before beginning the work of collecting papers for the review, so that the methodology could be reproduced by someone else. Even if formal definitions were given, the data selection was of course influenced by the author’s conception of what can be considered a “Digital Musical Instrument”. In addition, it should be noted that the cross-reference search methodology employed in this study resulted in the inclusion of many papers written by the same authors. Moreover, the inclusion criteria were restricted to content in titles and abstracts. Another possible method for cross-reference search could have been to go through more recent publications that cited the papers included in the original dataset. Nevertheless, based on the results and repeated searches of databases using different phrases, the author believes that the data presented in this paper give an accurate summary of current trends within the field.

6. Conclusions

In this systematic review of Accessible Digital Musical Instruments (ADMIs), a total of 113 papers, presenting 83 different DMIs, were analyzed. Ten categories of control interface types were identified: tangible controllers, touchless controllers, Brain–Computer Music Interfaces (BCMIs), adapted instruments, wearable controllers or prosthetic devices, mouth-operated controllers, audio controllers, gaze controllers, touchscreen controllers and mouse-controlled interfaces. The most common category of instruments was tangible controllers, followed by touchless controllers. Consequently, the most commonly used sensors in this context were touch sensors, accelerometers, and cameras. A total of 47.0% of the instruments presented unimodal feedback, 48.2% presented bimodal feedback, and 4.8% presented trimodal feedback. Interestingly, only 14.5% of the ADMIs incorporated vibrotactile feedback, while 44.6% provided visual feedback. Many of the ADMIs focused on children and persons with physical disabilities. Relatively few instruments were designed specifically for persons with visual or hearing impairment, older adults, or young children. A substantial part of the instruments discussed in this review did not include detailed descriptions of sound synthesis methods or mapping strategies. In terms of sounds, note sequences produced using MIDI libraries were rather common. Overall, the ADMIs were often developed using participatory design methods. However, there appears to be a need for more longitudinal studies based on systematic evaluation frameworks. Some key concepts for the success of ADMIs are adaptability and customization, user participation, iterative prototyping, and interdisciplinary development teams. Based on the findings presented in this review, one can conclude that the field concerned with making music accessible to everyone through the design of ADMIs is rather heterogeneous. This affects both methods and technology use. With the advancement of new, affordable, and DIY-focused music technology, there is a potential for the field to become more diverse and for ADMIs to benefit larger groups of users. This could be achieved, for example, by incorporating vibrotactile feedback, exploring more diverse sound synthesis methods, implementing more advanced sensing technologies and gesture acquisition, and making use of more diverse mapping strategies. To conclude, there is a large body of research in the Computer Music community that could be successfully adopted in ADMI research to enable inclusive music-making for all.

Supplementary Materials

The following are available online at https://www.mdpi.com/2414-4088/3/3/57/s1: List of reviewed papers.

Funding

This research was carried out as part of the author’s PhD studies at the KTH Royal Institute of Technology.

Acknowledgments

The author would like to thank the KTH Royal Institute of Technology Library for their support.

Conflicts of Interest

The author declares no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADMI    Accessible Digital Musical Instrument
AMT     Assistive Music Technology
BCMI    Brain–Computer Music Interface
DMI     Digital Musical Instrument
EEG     Electroencephalography
EMG     Electromyography
MSSE    Multisensory Environment
NIME    New Interfaces for Musical Expression
SEN     Special Education

References

  1. Assembly, U.G. Universal Declaration of Human Rights; UN General Assembly: New York, NY, USA, 1948. Available online: https://www.un.org/en/udhrbook/pdf/udhr_booklet_en_web.pdf (accessed on 29 June 2019).
  2. McDougall, J.; Wright, V.; Rosenbaum, P. The ICF Model of Functioning and Disability: Incorporating Quality of Life and Human Development. Dev. Neurorehabil. 2010, 13, 204–211. [Google Scholar] [CrossRef] [PubMed]
  3. Holmes, T.B. Electronic Music Before 1945. In Electronic and Experimental Music; Scribner: New York, NY, USA, 1985; Chapter 1; pp. 3–42. [Google Scholar]
  4. Adlers, F. Fender Rhodes: The Piano That Changed the History of Music. Available online: http://www.fenderrhodes.com/history/narrative.html (accessed on 29 June 2019).
  5. Knox, R. Adapted Music as Community Music. Int. J. Community Music. 2004, 1, 247–252. [Google Scholar]
  6. Magee, W.L.; Burland, K. Using Electronic Music Technologies in Music Therapy: Opportunities, Limitations and Clinical Indicators. Br. J. Music Ther. 2008, 22, 3–15. [Google Scholar] [CrossRef]
  7. Samuels, K. Enabling Creativity: Inclusive Music Interfaces and Practices. In Proceedings of the International Conference on Live Interfaces (ICLI), Lisbon, Portugal, 19–23 November 2014. [Google Scholar]
  8. Drake Music Project. Available online: https://www.drakemusic.org/ (accessed on 29 June 2019).
  9. Open Up Music. Available online: http://openupmusic.org/ (accessed on 29 June 2019).
  10. Adaptive Use Musical Instruments (AUMI) Project. Available online: http://aumiapp.com/ (accessed on 29 June 2018).
  11. Human Instruments. Available online: https://www.humaninstruments.co.uk/ (accessed on 29 June 2019).
  12. OHMI Conference and Awards—Music and Physical Disability: From Instrument to Performance. Available online: https://www.ohmi.org.uk/ (accessed on 29 June 2019).
  13. Soundbeam. Available online: https://www.soundbeam.co.uk/ (accessed on 29 June 2019).
  14. Beamz. Available online: https://thebeamz.com/ (accessed on 29 June 2019).
  15. Skoog Music. Available online: http://skoogmusic.com/ (accessed on 29 June 2019).
  16. Magic Flute. Available online: http://housemate.ie/magic-flute/ (accessed on 29 June 2018).
  17. Quintet. Available online: http://housemate.ie/quintet/ (accessed on 29 June 2019).
  18. Jamboxx Pro. Available online: https://www.jamboxx.com/ (accessed on 29 June 2019).
  19. BioControl Systems. Available online: http://www.biocontrol.com/products.html (accessed on 29 June 2019).
  20. GroovTube. Available online: http://housemate.ie/groovtube/ (accessed on 29 June 2019).
  21. Samuels, K. The Meanings in Making: Openness, Technology and Inclusive Music Practices for People with Disabilities. Leonardo Music J. 2015, 25, 25–29. [Google Scholar] [CrossRef]
  22. Fiebrink, R.; Cook, P.R. The Wekinator: A system for Real-Time, Interactive Machine Learning in Music. In Proceedings of the Eleventh International Society for Music Information Retrieval Conference (ISMIR 2010), Utrecht, The Netherlands, 9–13 August 2010. [Google Scholar]
  23. Katan, S.; Grierson, M.; Fiebrink, R. Using Interactive Machine Learning to Support Interface Development through Workshops with Disabled People. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, Seoul, South Korea, 18–23 April 2015; pp. 251–254. [Google Scholar] [CrossRef]
  24. Cambridge Dictionary. Available online: https://dictionary.cambridge.org/dictionary/english/diversity (accessed on 29 June 2019).
  25. Machover, T. Shaping Minds Musically. BT Technol. J. 2004, 22, 171–179. [Google Scholar] [CrossRef]
  26. Hayes, L. Sound, Electronics, and Music: A Radical and Hopeful Experiment in Early Music Education. Comput. Music J. 2017, 41, 36–49. [Google Scholar] [CrossRef]
  27. Webster, P.R. Young Children and Music Technology. Res. Stud. Music Educ. 1998, 11, 61–76. [Google Scholar] [CrossRef]
  28. Frid, E. Accessible Digital Musical Instruments: A Survey of Inclusive Instruments Presented at the NIME, SMC and ICMC Conferences. In Proceedings of the International Computer Music Conference 2018, Daegu, South Korea, 5–10 August 2018; pp. 53–59. [Google Scholar]
  29. Larsen, J.V.; Overholt, D.; Moeslund, T.B. The Prospects of Musical Instruments for People with Physical Disabilities. In Proceedings of the International Conference on New Instruments for Musical Expression, Brisbane, Australia, 11–15 June 2016; pp. 327–331. [Google Scholar]
  30. Graham-Knight, K.; Tzanetakis, G. Adaptive Music Technology: History and Future Perspectives. In Proceedings of the International Conference on Computer Music, Denton, TX, USA, 25 September–1 October 2015; pp. 416–419. [Google Scholar]
  31. Farrimond, B.; Gillard, D.; Bott, D.; Lonie, D. Engagement with Technology in Special Educational & Disabled Music Settings. Youth Music Report. December 2011, pp. 1–40. Available online: https://network.youthmusic.org.uk/file/5694/download?token=I-1K0qhQ (accessed on 29 June 2019).
32. Ward, A.; Woodbury, L.; Davis, T. Design Considerations for Instruments for Users with Complex Needs in SEN Settings. In Proceedings of the International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, 15–18 May 2017; Bournemouth University: Fern Barrow, Poole, Dorset, UK, 2017. [Google Scholar]
  33. Moog, R. The Musician: Alive and Well in the World of Electronics. In The Biology of Music Making: Proceedings of the 1984 Denver Conference; MMB Music, Inc.: St Louis, MO, USA, 1988; pp. 214–220. [Google Scholar]
  34. Pressing, J. Cybernetic Issues in Interactive Performance Systems. Comput. Music J. 1990, 14, 12–25. [Google Scholar] [CrossRef]
  35. Hunt, A.; Wanderley, M.M.; Paradis, M. The Importance of Parameter Mapping in Electronic Instrument Design. J. New Music Res. 2003, 32, 429–440. [Google Scholar] [CrossRef]
  36. Miranda, E.R.; Wanderley, M.M. Musical Gestures: Acquisition and Mapping. In New Digital Instruments: Control and Interaction beyond the Keyboard; A-R Editions, Inc.: Middleton, WI, USA, 2006; p. 3. [Google Scholar]
  37. Hunt, A.; Wanderley, M.M. Mapping Performer Parameters to Synthesis Engines. Organ. Sound 2002, 7, 97–108. [Google Scholar] [CrossRef]
38. Correa, A.G.D.; Ficheman, I.K.; do Nascimento, M.; de Deus Lopes, R. Computer Assisted Music Therapy: A Case Study of an Augmented Reality Musical System for Children with Cerebral Palsy Rehabilitation. In Proceedings of the Ninth IEEE International Conference on Advanced Learning Technologies (ICALT 2009), Riga, Latvia, 15–17 July 2009; pp. 218–220. [Google Scholar]
39. Kirk, R.; Abbotson, M.; Abbotson, R.; Hunt, A.; Cleaton, A. Computer Music in the Service of Music Therapy: The MIDIGRID and MIDICREATOR Systems. Med. Eng. Phys. 1994, 16, 253–258. [Google Scholar] [CrossRef]
  40. Krout, R.; Burnham, A.; Moorman, S. Computer and Electronic Music Applications with Students in Special Education: From Program Proposal to Progress Evaluation. Music Ther. Perspect. 1993, 11, 28–31. [Google Scholar] [CrossRef]
  41. Spitzer, S. Computers and Music Therapy: An Integrated Approach: Four Case Studies. Music Ther. Perspect. 1989, 7, 51–54. [Google Scholar] [CrossRef]
  42. Vamvakousis, Z.; Ramirez, R. The EyeHarp: A gaze-controlled digital musical instrument. Front. Psychol. 2016, 7, 906. [Google Scholar] [CrossRef] [PubMed]
  43. Lubet, A. Music, Disability, and Society; Temple University Press: Philadelphia, PA, USA, 2011. [Google Scholar]
  44. Magee, W.L. Music Technology in Therapeutic and Health Settings; Jessica Kingsley Publishers: London, UK; Philadelphia, PA, USA, 2014. [Google Scholar]
  45. Cappelen, B.; Andersson, A.P. Designing four generations of ‘Musicking Tangibles’. In Music, Health, Technology and Design; NMH-Publications: Oslo, Norway, 2014. [Google Scholar]
  46. Challis, B. Octonic: An Accessible Electronic Musical Instrument. Digit. Creat. 2011, 22, 1–12. [Google Scholar] [CrossRef]
  47. Harrison, J.; McPherson, A.P. Adapting the Bass Guitar for One-Handed Playing. J. New Music Res. 2017, 46, 270–285. [Google Scholar] [CrossRef]
  48. Fawcett, S.B.; White, G.W.; Balcazar, F.E.; Suarez-Balcazar, Y.; Mathews, R.M.; Paine-Andrews, A.; Seekins, T.; Smith, J.F. A Contextual-Behavioral Model of Empowerment: Case Studies Involving People with Physical Disabilities. Am. J. Community Psychol. 1994, 22, 471–496. [Google Scholar] [CrossRef] [PubMed]
49. Rolvsjord, R. Therapy as Empowerment: Clinical and Political Implications of Empowerment Philosophy in Mental Health Practises of Music Therapy. Nord. J. Music Ther. 2004, 13, 99–111. [Google Scholar] [CrossRef]
  50. Leman, M.; Maes, P.J. The Role of Embodiment in the Perception of Music. Empir. Musicol. Rev. 2015, 9, 236–246. [Google Scholar] [CrossRef] [Green Version]
  51. Godøy, R.I.; Leman, M. Musical Gestures: Sound, Movement, and Meaning; Routledge: Abingdon, UK, 2010. [Google Scholar]
  52. Partesotti, E.; Peñalba, A.; Manzolli, J. Digital Instruments and their Uses in Music Therapy. Nord. J. Music Ther. 2018, 27, 399–418. [Google Scholar] [CrossRef]
  53. Anderson, T.; Smith, C. Composability: Widening Participation in Music Making for People with Disabilities via Music Software and Controller Solutions. In Proceedings of the 2nd Annual ACM Conference on Assistive Technologies, Vancouver, BC, Canada, 11–12 April 1996; ACM: New York, NY, USA, 1996; pp. 110–116. [Google Scholar]
54. Moog, R.A. MIDI: Musical Instrument Digital Interface. J. Audio Eng. Soc. 1986, 34, 394–404. [Google Scholar]
  55. McPherson, A.; Morreale, F.; Harrison, J. Musical Instruments for Novices: Comparing NIME, HCI and Crowdfunding Approaches. In New Directions in Music and Human-Computer Interaction; Holland, S., Mudd, T., Wilkie-McKenna, K., McPherson, A., Wanderley, M.M., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 179–212. [Google Scholar]
  56. Hahna, N.D.; Hadley, S.; Miller, V.H.; Bonaventura, M. Music Technology Usage in Music Therapy: A Survey of Practice. Arts Psychother. 2012, 39, 456–464. [Google Scholar] [CrossRef]
  57. Magee, W.L.; Burland, K. An Exploratory Study of the Use of Electronic Music Technologies in Clinical Music Therapy. Nord. J. Music Ther. 2008, 17, 124–141. [Google Scholar] [CrossRef]
  58. Whitehead-Pleaux, A.M.; Clark, S.L.; Spall, L.E. Indications and Counterindications for Electronic Music Technologies in a Pediatric Medical Setting. Music Med. 2011, 3, 154–162. [Google Scholar] [CrossRef]
  59. Crowe, B.J.; Rio, R. Implications of Technology in Music Therapy Practice and Research for Music Therapy Education: A Review of Literature. J. Music Ther. 2004, 41, 282–320. [Google Scholar] [CrossRef] [PubMed]
  60. Snoezelen Multi-Sensory Environments. Available online: http://www.snoezelen.info/ (accessed on 29 June 2019).
61. Machover, T. Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems. In Proceedings of the 1989 International Computer Music Conference, Ann Arbor, MI, USA, 1–6 October 1989; pp. 186–190. [Google Scholar]
  62. Overholt, D.; Berdahl, E.; Hamilton, R. Advancements in Actuated Musical Instruments. Organ. Sound 2011, 16, 154–165. [Google Scholar] [CrossRef]
  63. Hattwick, I.; Malloch, J.; Wanderley, M.M. Forming Shapes to Bodies: Design for Manufacturing in the Prosthetic Instruments. In Proceedings of the International New Interfaces for Musical Expression Conference, London, UK, 30 June–4 July 2014; pp. 443–448. [Google Scholar]
  64. Spiegel, L. Music Mouse. Available online: http://retiary.org/ls/programs.html (accessed on 29 June 2019).
  65. Jense, A.; Leeuw, H. WamBam: A Case Study in Design for an Electronic Musical Instrument for Severely Intellectually Disabled Users. In Proceedings of the International Conference on New Interfaces for Musical Expression, Baton Rouge, LA, USA, 31 May–3 June 2015; pp. 74–77. [Google Scholar]
  66. Gumtau, S.; Newland, P.; Creed, C. MEDIATE—A Responsive Environment Designed for Children with Autism. In Proceedings of the Accessible Design in a Digital World, Dundee, Scotland, 23–25 August 2005. [Google Scholar]
  67. Knapp, R.B.; Lusted, H.S. A Bioelectric Controller for Computer Music Applications. Comput. Music J. 1990, 14, 42–47. [Google Scholar] [CrossRef]
68. Harrison, J.; McPherson, A. An Adapted Bass Guitar for One-Handed Playing. In Proceedings of the International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, 15–18 May 2017; Aalborg University Copenhagen: Copenhagen, Denmark, 2017; pp. 507–508. [Google Scholar]
  69. Wilde, D. Extending Body and Imagination: Moving to Move. Int. J. Disabil. Hum. Dev. 2011, 10, 31–36. [Google Scholar] [CrossRef]
  70. Gehlhaar, R.; Rodrigues, P.M.; Girão, L.M.; Penha, R. Instruments for Everyone: Designing New Means of Musical Expression for Disabled Creators. In Technologies of Inclusive Well-Being; Springer: Heidelberg, Germany, 2014; pp. 167–196. [Google Scholar]
  71. Ellis, P. Vibroacoustic Sound Therapy: Case Studies with Children with Profound and Multiple Learning Difficulties and the Elderly in Long-Term Residential Care. Stud. Health Technol. Inform. 2004, 103, 36–42. [Google Scholar]
  72. Favilla, S.; Pedell, S. Touch Screen Collaborative Music: Designing NIME for Older People with Dementia. Environment 2014, 20, 27. [Google Scholar]
  73. Meckin, D.; Bryan-Kinns, N. MoosikMasheens: Music, Motion and Narrative with Young People Who Have Complex Needs. In Proceedings of the 12th International Conference on Interaction Design and Children, New York, NY, USA, 24–27 June 2013; ACM: New York, NY, USA, 2013; pp. 66–73. [Google Scholar]
  74. Søderberg, E.A.; Odgaard, R.E.; Bitsch, S.; Høeg-Jensen, O.; Schildt, N.; Christensen, S.D.P.; Gelineck, S. Music Aid-Towards a Collaborative Experience for Deaf and Hearing People in Creating Music. In Proceedings of the International Conference on New Interfaces for Musical Expression, Brisbane, Australia, 11–15 July 2016. [Google Scholar]
75. Miranda, E.R.; Boskamp, B. Steering Generative Rules with the EEG: An Approach to Brain-Computer Music Interfacing. In Proceedings of the Sound and Music Computing Conference, Salerno, Italy, 24–25 November 2005; Volume 5. [Google Scholar]
  76. Aziz, A.D.; Warren, C.; Bursk, H.; Follmer, S. The Flote: An Instrument for People with Limited Mobility. In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, Halifax, NS, Canada, 13–15 October 2008; ACM: New York, NY, USA, 2008; pp. 295–296. [Google Scholar]
  77. Almeida, A.P.; Girão, L.M.; Gehlhaar, R.; Rodrigues, P.M.; Rodrigues, H.; Neto, P.; Mónica, M. Sound = Space Opera: Choreographing Life within an Interactive Musical Environment. Int. J. Disabil. Hum. Dev. 2011, 10, 49–53. [Google Scholar] [CrossRef]
  78. MidiGrid. Available online: http://midigrid.com/ (accessed on 29 June 2019).
  79. Bresin, R.; Elblaus, L.; Frid, E.; Favero, F.; Annersten, L.; Berner, D.; Morreale, F. Sound Forest/Ljudskogen: A Large-Scale String-Based Interactive Musical Instrument. In Proceedings of the Sound and Music Computing Conference, Hamburg, Germany, 31 August–3 September 2016; pp. 79–84. [Google Scholar]
  80. Wilde, D. The hipdiskettes: Learning (through) Wearables. In Proceedings of the 20th Australasian Conference on Computer-Human Interaction: Designing for Habitus and Habitat, Cairns, Australia, 8–12 January 2008; pp. 259–262. [Google Scholar]
  81. Leap Motion. Available online: https://www.leapmotion.com/ (accessed on 29 June 2019).
82. Benveniste, S.; Jouvelot, P.; Lecourt, E.; Michel, R. Designing Wiimprovisation for Mediation in Group Music Therapy with Children Suffering from Behavioral Disorders. In Proceedings of the International Conference on Interaction Design and Children, Milan, Italy, 3–5 June 2009; pp. 18–26. [Google Scholar]
  83. Kirwan, N.J.; Overholt, D.; Erkut, C. Bean: A Digital Musical Instrument for Use in Music Therapy. In Proceedings of the Sound and Music Computing Conference, Salerno, Italy, 24–26 November 2015; pp. 49–54. [Google Scholar]
  84. Rigler, J.; Seldess, Z. The Music Cre8tor: An Interactive System for Musical Exploration and Education. In Proceedings of the International Conference on New Interfaces for Musical Expression, New York, NY, USA, 6–10 June 2007; ACM: New York, NY, USA, 2007; pp. 415–416. [Google Scholar]
85. Cappelen, B.; Andersson, A.P. Embodied and Distributed Parallel DJing. Stud. Health Technol. Inform. 2016, 229, 528–539. [Google Scholar] [CrossRef] [PubMed]
  86. Matossian, V.; Gehlhaar, R. Human Instruments: Accessible Musical Instruments for People with Varied Physical Ability. Annu. Rev. Cyberther. Telemed. 2015, 13, 200–205. [Google Scholar]
  87. Mann, S.; Fung, J.; Garten, A. Deconcert: Bathing in the Light, Sound, and Waters of the Musical Brainbaths. In Proceedings of the International Computer Music Conference, Copenhagen, Denmark, 27–31 August 2007. [Google Scholar]
  88. Zubrycka, J.; Cyrta, P. Morimo. Tactile Sound Aesthetic Experience. In Proceedings of the International Computer Music Conference, Ljubljana, Slovenia, 9–15 September 2012; pp. 423–425. [Google Scholar]
  89. Miranda, E.R.; Magee, W.L.; Wilson, J.J.; Eaton, J.; Palaniappan, R. Brain-Computer Music Interfacing (BCMI): From Basic Research to the Real World of Special Needs. Music Med. 2011, 3, 134–140. [Google Scholar] [CrossRef] [Green Version]
  90. Riley, P.; Alm, N.; Newell, A. An Interactive Tool to Promote Musical Creativity in People with Dementia. Comput. Hum. Behav. 2009, 25, 599–608. [Google Scholar] [CrossRef]
  91. Eaton, J.; Williams, D.; Miranda, E. The Space Between Us: Evaluating a Multi-User Affective Brain-Computer Music Interface. Brain-Comput. Interfaces 2015, 2, 103–116. [Google Scholar] [CrossRef]
92. O’Modhrain, S. A Framework for the Evaluation of Digital Musical Instruments. Comput. Music J. 2011, 35, 28–42. [Google Scholar] [CrossRef]
  93. Wanderley, M.M.; Orio, N. Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI. Comput. Music J. 2002, 26, 62–76. [Google Scholar] [CrossRef]
  94. Hunt, A.; Kirk, R. MidiGrid: Past, Present and Future. In Proceedings of the 2003 Conference on New Interfaces for Musical Expression, Montreal, QC, Canada, 22–24 May 2003; pp. 135–139. [Google Scholar]
  95. Hunt, A.; Kirk, R.; Neighbour, M. Multiple Media Interfaces for Music Therapy. IEEE Multimedia 2004, 11, 50–58. [Google Scholar] [CrossRef]
  96. Hunt, A.; Kirk, R.; Abbotson, M.; Abbotson, R. Music Therapy and Electronic Technology. In Proceedings of the 26th Euromicro Conference. Informatics: Inventing the Future, Maastricht, The Netherlands, 5–7 September 2000; Volume 2, pp. 362–367. [Google Scholar] [CrossRef]
  97. Medeiros, C.B.; Wanderley, M.M. A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression. Sensors 2014, 14, 13556–13591. [Google Scholar] [CrossRef]
  98. McCord, K. Children with Special Needs Compose Using Music Technology. J. Technol. Music Learn. 2002, 1, 3–14. [Google Scholar]
99. Slevin, E.; McClelland, A. Multisensory Environments: Are they Therapeutic? A Single-Subject Evaluation of the Clinical Effectiveness of a Multisensory Environment. J. Clin. Nurs. 1999, 8, 48–56. [Google Scholar] [CrossRef] [PubMed]
  100. Sánchez, A.; Maseda, A.; Marante-Moar, M.P.; de Labra, C.; Lorenzo-López, L.; Millán-Calenti, J.C. Comparing the Effects of Multisensory Stimulation and Individualized Music Sessions on Elderly People with Severe Dementia: A Randomized Controlled Trial. J. Alzheimers Dis. 2016, 52, 303–315. [Google Scholar] [CrossRef] [PubMed]
  101. Schofield, P. The Effects of Snoezelen on Chronic Pain. Nurs. Stand. 2000, 15, 33–34. [Google Scholar] [CrossRef] [PubMed]
  102. Baillon, S.; van Diepen, E.; Prettyman, R. Multisensory Therapy in Psychiatric Care. Adv. Psychiatr. Treat. 2002, 8, 444–450. [Google Scholar] [CrossRef]
103. Dobrian, C.; Koppelman, D. The ‘E’ in NIME: Musical Expression with New Computer Interfaces. In Proceedings of the International Conference on New Interfaces for Musical Expression, Paris, France, 4–8 June 2006; pp. 277–282. [Google Scholar]
104. Bergsland, A.; Wechsler, R. Composing Interactive Dance Pieces for the MotionComposer, a Device for Persons with Disabilities. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2015), Baton Rouge, LA, USA, 31 May–3 June 2015; pp. 20–23. [Google Scholar]
Figure 1. Examples of ADMIs. Top row: GroovTube [20]. Middle row: Left: SoundBeam [13], courtesy of Charlton Park Academy. Right: Skoog [15]. Bottom row: Touch Chord by Human Instruments [11].
Figure 2. Quintet, a musical instrument that allows connection of up to five standard accessibility switches [17].
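To make the switch-based interaction model concrete, the sketch below (Python, using the mido MIDI library) illustrates how five accessibility switches could be mapped to notes of a pentatonic scale, in the spirit of instruments such as Quintet [17]. The polling function, note assignments, and timing are illustrative assumptions, not details of the actual product.

```python
import time
import mido  # pip install mido python-rtmidi

# Hypothetical switch-to-note mapping in the spirit of Quintet [17].
PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic, one note per switch

def read_switches():
    """Return the state of five switches (simulated here; a real build
    would read GPIO pins or a USB switch interface instead)."""
    return [True, False, False, True, False]

out = mido.open_output()  # default MIDI output port
previous = [False] * 5
for _ in range(1000):  # bounded polling loop for illustration
    states = read_switches()
    for i, (now, before) in enumerate(zip(states, previous)):
        if now and not before:    # switch just closed: start the note
            out.send(mido.Message('note_on', note=PENTATONIC[i], velocity=90))
        elif before and not now:  # switch just opened: stop the note
            out.send(mido.Message('note_off', note=PENTATONIC[i]))
    previous = states
    time.sleep(0.01)  # ~100 Hz polling
```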
Figure 3. Number of publications per year.
Figure 4. ADMIs per control interface category. Categories defined by main mode of interaction.
Figure 5. Examples of different control interfaces. Top row: Left: GuitarMasheen, an adapted instrument in the form of an electro-mechanically actuated guitar, by Dave Meckin [73]. Right: GUI of the EyeHarp, a gaze controller [42]. Second row: Left: MusicAid, a touchscreen controller [74]. Middle: Jamboxx Pro, a mouth-operated controller [18]. Right: BCMI Piano, a Brain–Computer Music Interface [75]. Third row: Left: The Flote, an audio controller [76]. Middle: Sound=Space installation, a touchless interface by Rolf Gehlhaar [77]. Right: MidiGrid, a mouse-controlled musical interface [78]. Last row: Left: Sound Forest, a music installation with tangible controllers in the form of interactive strings [79]. Right: HipDiskettes, a performance ensemble interacting with wearable HipDisk controllers [80].
Table 1. Keywords used when searching for publications. Asterisks (*) denote wildcards, which broaden the search by matching words that begin with the same letters; e.g., “sound*” finds both “sound” and “sounds”.
 | Concept 1 | Concept 2 | Concept 3
Key Concepts | musical instrument | accessibility | disability
Free text terms | digital musical instrument | adaptive | health
 | new interface for musical expression | adapted | need
 | musical interface | assistive | impairment
 | control interface | inclusion | therapy
 | | empowerment | disorder
Search phrases | “music* instrument*” | “accessib*” | “disab*”
 | “digital music* instrument*” | “adaptive” | “health”
 | “new interface* for musical expression” | “adapted” | “need*”
 | “music* interface*” | “assistive” | “impair*”
 | “control interface*” | “inclus*” | “therap*”
 | | “empower*” | “disorder*”
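As an illustration of how the three concept groups in Table 1 combine, the snippet below assembles a boolean query in the common block-search pattern (terms OR-ed within a concept, concepts AND-ed together) and shows the regular-expression equivalent of a trailing wildcard. This is a sketch of the strategy; the exact query syntax accepted by each database differs, and these are not the literal queries issued for the review.

```python
import re

# Search phrases from Table 1 (OR-ed within a concept,
# AND-ed across the three concepts).
concept_1 = ['"music* instrument*"', '"digital music* instrument*"',
             '"new interface* for musical expression"',
             '"music* interface*"', '"control interface*"']
concept_2 = ['"accessib*"', '"adaptive"', '"adapted"',
             '"assistive"', '"inclus*"', '"empower*"']
concept_3 = ['"disab*"', '"health"', '"need*"',
             '"impair*"', '"therap*"', '"disorder*"']

def or_group(terms):
    """Join one concept's terms into a parenthesized OR-block."""
    return "(" + " OR ".join(terms) + ")"

query = " AND ".join(or_group(c) for c in (concept_1, concept_2, concept_3))
print(query)

# A trailing asterisk matches any word beginning with the stem,
# e.g., "sound*" matches "sound", "sounds", "soundscape":
stem = re.compile(r"\bsound\w*", re.IGNORECASE)
print(bool(stem.search("interactive soundscapes")))  # True
```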
Table 2. Publications per Category.
Category | Publications | Percentage
Conference Proceedings | 60 | 53.1%
Journal Articles | 37 | 32.7%
Book Chapters | 13 | 11.5%
PhD Theses | 3 | 2.7%
Table 3. Control interface categories defined by main mode of interaction. Numbers correspond to the total number of ADMIs belonging to the respective interface category.
Interface Category | Total | ADMI Example
Tangible | 30 | WamBam, an electronic hand-drum for music therapy sessions with persons with severe intellectual disabilities [65].
Touchless | 20 | MEDIATE, a multisensory environment designed as an interface between autistic and typical expression, using infrared cameras fed to the EyesWeb system to produce interactive output [66].
BCMIs | 9 | BioMuse, a “biocontroller” that can be used to augment normal musical instrument performance or as a computer interface for musical composition and performance; it could be used by paralyzed and movement-impaired individuals as a means to regain the pleasures of music-making [67].
Adapted Instruments | 8 | A modification of the electric bass guitar, designed for users with upper-limb disabilities, which enables MIDI-controlled actuated fretting via a foot pedal control [68].
Wearable/prosthetic | 5 | HipDisk, a disk-shaped body-worn device designed to inspire people to swing their hips and explore their full range of movement through a simultaneous, interdependent exploration of sound [69].
Mouth-operated | 3 | Doosafon, a mouth- and head-operated device similar to a xylophone, using a lightweight baton held in the mouth, linked to a breath sensor [70].
Audio | 2 | A standard digital sound processor and microphone intended for vocal interaction, used together with a vibroacoustic chair producing vibrations based on the interaction [71].
Gaze | 2 | The EyeHarp, a gaze-controlled DMI that aims to enable people with severe motor disabilities to learn, perform, and compose music using gaze as the control mechanism [42].
Touchscreen | 2 | A collaborative music system based on a touchscreen controller, designed for persons with dementia [72].
Mouse-controlled | 2 | The MidiGrid system, in which a screen cursor (computer mouse) is passed over cells so that the musical content of the selected cells is played on a synthesizer attached to the computer via MIDI [39].
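The mouse-controlled category is perhaps the easiest to sketch in code. The fragment below (Python, with the mido MIDI library) mimics the MidiGrid idea of playing each cell's musical content as the cursor passes over it [39]; the 4 × 4 layout, note assignments, cell size, and cursor path are invented for illustration and are not part of the original system.

```python
import time
import mido  # pip install mido python-rtmidi

# A MidiGrid-style grid [39] in miniature: each cell stores a note,
# and moving the cursor onto a cell plays that note over MIDI.
GRID = [[60 + 4 * row + col for col in range(4)] for row in range(4)]
CELL = 100  # cell size in pixels (assumed)

def cell_note(x, y):
    """Map a cursor position in pixels to the note stored in that cell."""
    row = min(y // CELL, len(GRID) - 1)
    col = min(x // CELL, len(GRID[0]) - 1)
    return GRID[row][col]

out = mido.open_output()
for x, y in [(50, 50), (250, 50), (350, 350)]:  # simulated cursor path
    note = cell_note(x, y)
    out.send(mido.Message('note_on', note=note, velocity=80))
    time.sleep(0.2)
    out.send(mido.Message('note_off', note=note))
```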
Table 4. Frequently Used Sensors.
Type | Occurrence
Touch | 14
Accelerometer | 13
Camera | 13
Microphone | 10
Button | 9
Pressure | 8
Ultrasonic | 8
Switch | 7
Piezo | 7
Electrodes (for EEG) | 6
Bend | 5
Breath | 4
RFID reader | 4
Computer mouse/trackball | 3
Infrared | 3
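Whatever the sensor type, a recurring implementation step is scaling a raw reading onto a musical control range, as emphasized in the mapping literature [35,37]. The sketch below clamps and linearly rescales a sensor value to the 7-bit MIDI range and sends it as a control change message; read_accelerometer() is a hypothetical stand-in for a device-specific driver.

```python
import mido  # pip install mido python-rtmidi

def scale(value, in_min, in_max, out_min=0, out_max=127):
    """Clamp a raw reading and map it linearly onto the 7-bit MIDI range."""
    value = max(min(value, in_max), in_min)
    span = (value - in_min) / (in_max - in_min)
    return int(round(out_min + span * (out_max - out_min)))

def read_accelerometer():
    """Hypothetical sensor driver; returns tilt in g (-1.0 .. 1.0)."""
    return 0.3  # placeholder reading

out = mido.open_output()
cc_value = scale(read_accelerometer(), -1.0, 1.0)
out.send(mido.Message('control_change', control=1, value=cc_value))
```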
