Article

Exploring the Opportunities of Haptic Technology in the Practice of Visually Impaired and Blind Sound Creatives

1 Dyson School of Design Engineering, Imperial College, South Kensington, London SW7 2DB, UK
2 Sonic Arts Research Centre, Queen’s University Belfast, 4 Cloreen Park, Belfast BT9 5HN, UK
* Authors to whom correspondence should be addressed.
Arts 2023, 12(4), 154; https://doi.org/10.3390/arts12040154
Submission received: 30 January 2023 / Revised: 22 March 2023 / Accepted: 29 March 2023 / Published: 13 July 2023
(This article belongs to the Special Issue Feeling the Future—Haptic Audio)

Abstract

Visually impaired and blind (VIB) people as a community face several access barriers when using technology. For users of specialist technology, such as digital audio workstations (DAWs), these access barriers become increasingly complex—often stemming from a vision-centric approach to user interface design. Haptic technologies may present opportunities to leverage the sense of touch to address these access barriers. In this article, we describe a participant study involving interviews with twenty VIB sound creatives who work with DAWs. Through a combination of semi-structured interviews and a thematic analysis of the interview data, we identify key issues relating to haptic audio and accessibility from the perspective of VIB sound creatives. We introduce the technical and practical barriers that VIB sound creatives encounter, which haptic technology may be capable of addressing. We also discuss the social and cultural aspects contributing to VIB people’s uptake of new technology and access to the music technology industry.

1. Introduction

This article explores the access barriers and wider issues salient to the workflows of visually impaired and blind (VIB) sound creatives, which are typically centred around the digital audio workstation (DAW).
We use the term ‘VIB sound creatives’ to denote anyone with a visual impairment who works creatively with sound, whether on a professional or hobbyist level. VIB people exist on a broad spectrum of sight loss, which includes not only complete blindness but also varying degrees of light perception, peripheral vision and the refractive issues of myopia and hyperopia. The term sound creative attempts to capture the plethora of diverse and distinct sound-oriented practices that people engage with creatively, such as music production, audio engineering, composing and DJing.
VIB sound creatives comprise a community of practitioners for whom existing music-technology software, particularly DAWs, presents several accessibility barriers. Accessibility tools commonly used by VIB people include screen-reading software, keyboard macros and braille displays. While these tools effectively provide access to digital materials, such as text-based content, significant gaps remain between a sighted person’s experience of specialist software, such as a DAW, and that of a VIB person.
Our research highlights how these gaps arise, primarily affecting how VIB users perceive the content of their DAW and the efficiency with which they can manipulate software features. Through the analysis of interviews with twenty VIB sound creatives, we consider whether haptic technology represents an untapped potential for improving VIB sound creatives’ access to DAWs. We consider how haptics might integrate with a VIB sound creative’s existing workflow and indeed whether this reflects the priorities of the broader community.
The United Nations deems accessibility in all aspects of life to be a human right1. The International Organization for Standardization (ISO) describes accessibility as the “extent to which products, systems, services, environments and facilities can be used by people from a population with the widest range of characteristics and capabilities to achieve a specified goal in a specified context of use.” (ISO 2011).
The ISO considers this a widely held definition but also highlights that some standards consider accessibility to be interwoven with the notion of usability, i.e., “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” (ibid.). We, too, acknowledge that the entangled nature of these two definitions warrants further investigation; however, this is not the focus of this article and is instead something we aim to address in a future publication.

2. Background

2.1. Accessible Music Technology

In this article, we define accessible music technology (AMT) as any software or hardware elements that VIB sound creatives incorporate into their workflow. The specificity of this definition is crucial as the practice of VIB sound creatives is often underpinned by a complex artefact ecology (Bødker and Klokmose 2011), including mainstream software and hardware (sometimes adapted), open-source software extensions and scripts. It is essential to capture this reality, regardless of whether the technology was initially designed with accessibility in mind.

2.2. Screen Readers

Most VIB sound creatives use a text-to-speech (TTS) engine called a screen reader to access their computers (Payne et al. 2020). The screen reader allows someone to navigate and control a computer system using a combination of keyboard shortcuts and navigation commands and hear the visible elements of the graphical user interface (GUI) spoken aloud. In this way, someone can use a computer without needing to interact with a mouse or look at the screen.
For macOS users, the only available screen reader is Apple’s VoiceOver2. VoiceOver is fully integrated with the macOS architecture, comes free with every computer and cannot be replaced with a third-party product. On Windows, there are two primary screen readers in common use: JAWS3 (Job Access With Speech), a commercial product designed by the company Freedom Scientific, and NVDA4 (Non-Visual Desktop Access), a free and open-source equivalent.

2.3. Digital Audio Workstations

DAWs are often the centrepiece of a sound creative’s artefact ecology. Various companies manufacture DAWs, and most share several standard capabilities, such as the examples below.
  • Recording, editing and processing of both audio and MIDI data.
  • Supporting time-based transitions through parameter automation.
  • Hosting third-party software instrument and audio effect plugins.
  • Rendering to standard audio file formats.

2.4. Additional Software

Some VIB sound creatives make use of additional software in their workflow. Optical character recognition (OCR) is a method used to scan a GUI, identify any text elements and make them available to the screen reader for description and navigation. OCR can be beneficial in situations where the GUI has not already been made accessible to a screen reader by the software manufacturer. NVDA offers a free OCR addon5. VOCR6 is an open-source VoiceOver OCR extension popular with the Mac users within our participant group.
Some VIB sound creatives use additional scripts with their screen reader and DAW. For example, the script Flo Tools7 is popular among VIB sound creatives using the DAW Pro Tools8 on macOS. Pro Tools already boasts a significant level of screen-reader functionality9. However, Flo Tools expands the capabilities of Pro Tools, enabling VIB sound creatives to work more efficiently. Some VIB sound creatives consider Flo Tools to be a usability rather than an accessibility extension, as all of Pro Tools’ features are accessible without Flo Tools; however, the latter improves the workflow.

2.5. Additional Hardware

The Komplete Kontrol10 range of keyboards produced by Native Instruments supports the accessible manipulation of plugin parameters via TTS feedback, instigated by the operation of hardware controls but delivered via the computer. To this end, Native Instruments developed the Native Kontrol Standard (NKS)11 protocol. Third-party software companies following this protocol can benefit from the same degree of accessibility via the Komplete Kontrol hardware.
Some VIB sound creatives also use braille displays in their accessible workflows. Braille displays are physical devices that manipulate small cells on a tactile interface to simulate braille embossed on paper. These displays can interface with the computer and change the cell state to represent the text currently focused on the screen. Most displays also allow someone to enter text in place of a typical QWERTY keyboard.

2.6. Industry

Several prominent software manufacturers have recently announced commitments to make their technology VIB accessible. A number, such as Native Instruments mentioned above, have released technology featuring components designed specifically with visual accessibility in mind. The industry is taking steps in the right direction, in part thanks to the advocacy of VIB sound creatives; however, we would also like to note that these commercial products require a monetary outlay. A thriving community of open-source developers offers users alternative tools at zero cost. For example, the OSARA12 accessibility extension commonly used with the DAW Reaper13 is open source, free to use and developed in close communication with the developers of Reaper itself.

2.7. Online Communities

For completeness, it is vital to acknowledge online communities’ role in providing and supporting accessible DAW workflows. Examples of such communities include Reapers without Peepers14, the Pro Tools Accessibility Google group15, the Logic Accessibility Google group16 and the MIDI Mag mailing list17. Amongst other things, these communities are a place for VIB sound creatives to seek technical advice regarding access solutions and music production techniques. Members of these communities benefit from one another’s experience, for example, by checking the degree of accessibility of certain music-technology products before committing to a purchase.

2.8. Haptics for VIB Accessibility

HCI researchers in related fields have proposed haptic technologies to address access barriers that VIB people encounter. Commonly, haptic technologies are considered a means of presenting additional sensory information to a technology user, often to replace the visual modality.
O’Modhrain et al. (2015) provided an overview of existing technologies with the potential to represent visual media through refreshable tactile displays and the kinds of data and media that could be displayed with them. Three common tasks proposed for haptic technologies and accessibility are data representation, navigation and communication.
Typically, numeric data is represented through visual methods, such as graphs; concepts, such as software architecture or algorithms, are often presented as flowchart diagrams; and ‘concept maps’ or ‘mind maps’ are common ways of organising and representing related concepts. In discussing approaches to non-visual methods of co-design with visually-impaired users, Metatla et al. (2015) proposed diagram editing as a domain in which haptic devices could potentially address access barriers. Ramloll et al. (2000) proposed a method of blending sonification techniques with haptic technology in order to present line graph data.
Researchers have proposed haptic gloves to communicate information about the environment to VIB people in a wearable and portable format. Zelek et al. (2003) and Kilian et al. (2022) proposed using haptic gloves to support wayfinding and navigation, incorporating cameras and computer vision software to represent the distance and location of objects in the surrounding environment as haptic information. Gloves and wearables have also been explored as aids to social interaction by converting non-verbal communication cues, such as facial expressions, into haptic information (Krishna et al. 2010; McDaniel et al. 2008).
  • Haptics-Based Accessibility in Music Production and Performance
The Haptic Wave (Tanaka and Parkinson 2016) is a prominent example of haptic technology leveraged to support VIB sound creatives’ access to music production software. The Haptic Wave is a custom-built prototype device consisting of a motorised fader mounted on horizontal rails. The hardware couples with a software counterpart. This device allows VIB users to ‘feel’ the amplitude of a waveform as they scrub across the X axis, translating the waveform’s amplitude to the motorised fader’s position on the Y axis.
This design addresses a key issue that arises from the move from analogue studios and their inherent physicality and tangible nature (for example, large mixing consoles, outboard audio processors and reel-to-reel tape) to digital audio workstations, which rely heavily on visual representations of waveforms. This GUI-based representation allows sighted users to identify peaks in the waveform and silences, providing a reference for edit points, such as cutting and splicing between different takes.
For VIB sound creatives, the visual representation of the waveform is not generally available; therefore, edit points are typically found by ‘scrubbing’ through the waveform and listening intently in a linear process. The Haptic Wave was designed over a series of co-design workshops and evaluated in situ as a technology probe (Hutchinson et al. 2003). An interesting approach that the designers took was to map audio information directly to the haptic domain, rather than aim to represent visual information via haptics, to ‘avoid translating paradigms of the visual into the haptic modality’ (Tanaka and Parkinson 2016).
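As a rough illustration of the audio-to-haptic mapping described above, the sketch below reduces a mono waveform to an amplitude envelope and converts a scrub position along the X axis into a fader height on the Y axis. This is our own minimal sketch rather than the authors' implementation; the bin count and fader travel length are assumptions for illustration only.

```python
# Hypothetical sketch of the Haptic Wave's core mapping:
# waveform amplitude (Y) as a function of scrub position (X).

def amplitude_envelope(samples, num_bins):
    """Reduce a mono waveform to per-bin peak amplitudes."""
    bin_size = max(1, len(samples) // num_bins)
    return [
        max(abs(s) for s in samples[i:i + bin_size])
        for i in range(0, len(samples), bin_size)
    ][:num_bins]

def fader_position(envelope, scrub_x, fader_travel_mm=100):
    """Map a normalised scrub position (0.0-1.0) to a fader height in mm."""
    index = min(int(scrub_x * len(envelope)), len(envelope) - 1)
    return envelope[index] * fader_travel_mm

# Example: a waveform that swells and decays
wave = [0.0, 0.2, 0.5, 0.9, 0.7, 0.3, 0.1, 0.0]
env = amplitude_envelope(wave, 4)
print(fader_position(env, 0.4))  # scrubbing near the peak of the swell
```

In a physical device, the returned height would drive the motorised fader so the user feels the envelope directly under their finger as they scrub.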
The Moose (O’Modhrain and Gillespie 1997) is an early example of a prototype haptic device targeted towards VIB users of specialist software. O’Modhrain and Gillespie discuss its potential as a device for interacting with DAW software. It comprises a ‘puck’ that moves on a 2D axis, similar to a computer mouse. The puck’s freedom of movement can be affected in two axes using linear motors to simulate force feedback effects and artefacts, such as detents and grooves. In this way, the authors propose a method of representing GUI objects, such as buttons and sliders, via haptic feedback, thereby allowing the user to navigate DAW software.
The HaptEQ (Karp and Pardo 2017) is a low-cost, open-source, tactile device designed for modifying graphical equaliser (EQ) curves via a tangible interface. Using a metal chain on a magnetic sheet, the user can effectively draw and feel an approximation of the desired EQ curve on a 2D plane. A webcam positioned directly above the chain captures images for computer-vision-based software that translates the shape of the chain’s curve into the parameter settings of an EQ plugin.
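A minimal sketch of the HaptEQ's final translation step might look as follows. This is a hypothetical illustration rather than Karp and Pardo's code: the normalised chain heights and the decibel range are assumptions, and the computer-vision stage that detects the chain from the webcam image is omitted.

```python
# Hypothetical sketch: converting detected chain heights into
# graphic-EQ band gains, as the HaptEQ's software stage might.
# Heights are normalised (0.0 = bottom of the plane, 1.0 = top);
# the +/-12 dB range is an assumption for illustration.

def chain_to_eq_gains(chain_heights, min_db=-12.0, max_db=12.0):
    """Linearly map normalised chain heights to per-band gains in dB."""
    return [min_db + h * (max_db - min_db) for h in chain_heights]

# A chain draped as a low-shelf boost tapering to flat
heights = [0.75, 0.625, 0.5, 0.5, 0.5]
print(chain_to_eq_gains(heights))  # [6.0, 3.0, 0.0, 0.0, 0.0]
```

Each resulting gain value would then be written to the corresponding band of the EQ plugin.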
The touchEQ (Pesek and De Man 2021) is an application designed for the Surface Haptics TanvasTouch18. The device utilises a phenomenon called electroadhesion, which can simulate the effects of friction and, therefore, different textures, across a smooth touch screen surface. Pesek and De Man describe developing an EQ programme for this device and a subsequent user test with sighted users. While significantly more expensive than the HaptEQ, the touchEQ provides two-way communication between the user and the device, allowing the perception and manipulation of EQ parameters.
Outside the recording studio, haptic devices have addressed access barriers in performance contexts. Where a music performance requires a response to visual cues, for example, an orchestra or choir following a conductor, VIB musicians rely on several methods to work around the lack of visual feedback from the conductor. Pragmatic approaches to this issue can include listening for breathing cues from neighbouring players or relying on a neighbouring player to provide a physical timing cue, such as tapping on the shoulder or foot (Baker et al. 2019).
Many research projects have explored using haptic technologies to communicate information between a conductor and musician. Baker et al. (2019) proposed translating the movements of a conductor’s right hand, via a motion sensor, into a 2D array of vibration motors worn as a vest by the performer. The authors found that some of their participants preferred a simpler vibrating ‘pulse’ pattern to the complex 2D representation of a conductor’s arm movements. As with the Haptic Wave, this observation again highlights the notion that a one-to-one mapping of visual information to the haptic domain may ultimately be less valuable to a VIB person, who may not be working from a visual reference point to begin with.
The Haptic Baton19 also focuses on translating the movements of a conductor’s baton into haptic feedback. The baton features sensors that track movement, which is translated into signals conveyed by two vibration motors worn by the musician, one on each leg. Kawarazaki et al. (2014) described a similar project, which uses a baton with a built-in motion sensor. In their system, the movements of the conductor’s baton are translated to ‘beat’ signals conveyed by haptic devices worn by choir members. They also incorporated a Microsoft Kinect20 sensor to detect the direction the conductor is facing, allowing members of the choir to perceive when they are being given specific instructions from the conductor, such as to prepare to perform one’s part.
From related work, it is clear that haptic technology has great potential to address several access barriers that VIB people face when using music technology. However, we have yet to see a widespread uptake of haptic technology in the domain of VIB sound creatives, for whom access tools that leverage the auditory modality (i.e., screen readers) remain the norm. In the next section, we present an interview study with a group of VIB sound creatives to unpack why they chose the tools they use and how new tools, including those incorporating haptics, may fit in people’s existing workflows.

3. Interview Study

3.1. Design

The study consisted of twenty remote, semi-structured interviews with VIB sound creatives. One common thread connecting each participant is their use of DAW software as a central tool in their creative pursuits. Each interview was two hours in duration. The first three authors conducted the interviews and shared responsibilities for leading them.
We interviewed 20 people in total. Most of our participants were in the 35–44 year age range, with three in the 18–24 year range and one in the 65–74 year range. Two of our participants were female. Most of our participants answered ‘totally blind’, or words to that effect, when asked to describe their visual impairment. However, a few reported ‘low vision’, ‘legally blind’ or ‘registered blind’. Half the participants described themselves as ‘advanced’ when asked to rate their skill or knowledge in using music software, while nine answered ‘intermediate’, and two answered ‘beginner’. None of our participants used Ableton Live, although several mentioned this DAW. Reaper was the most commonly used DAW, followed by Logic Pro and then Pro Tools. Other less common DAWs included Cakewalk Sonar, FL Studio and Samplitude.
Although our research questions are specifically concerned with the experience of VIB people, we acknowledge that some resultant insights may discuss ideas relating more broadly to audio technology. These insights are still valuable, and as they arose naturally from our chosen analysis methodology, we chose to include them in this article. Additionally, our findings do not relate specifically to other disabled groups within the wider sound creative community, and we encourage other researchers to explore these avenues in their future work.
The first hour of each interview focused on getting to know the participant. The authors invited participants to complete a short questionnaire beforehand, which served the practical goal of ascertaining their demographics, location, timezone and choice of music and accessible technology. The interview then delved deeper, attempting to understand the participants’ creative priorities, their views on their current choice of technological tools, accessibility and inclusion in education and the music industry, aspects of music-technology accessibility to be celebrated and points of frustration. As the vast majority of participants were approached via online communities, such as mailing lists, the role of these communities also formed a prevalent topic of conversation.
While the interviews adhered to a typical semi-structured methodology, we stimulated the discussion further halfway through each interview by switching our approach to that of a contextual enquiry. A contextual enquiry is, in essence, an interview in context (Holtzblatt and Beyer 1997). Participants are asked to complete a task related to the domain of interest while interviewers observe and ask questions.
It is helpful to frame the relationship between participant and interviewer as similar to that of master and apprentice. The participant is the master, the interviewer the apprentice, with the latter holding the objective of learning the participant’s approach to the task at hand. We asked each participant to work towards a creative goal relevant to their typical creative pursuits. Across participants working as sound designers, audio engineers, music producers, musicians and composers, we observed tasks associated with various goals.
The benefit of a contextual enquiry as opposed to a conventional semi-structured interview is that it exposes aspects of a practical task that may be unarticulated, implicit or taken for granted in a verbal response to a posed question. For instance, when asked how they might typically start a DAW project, a participant may respond by stating that they import an audio file. However, in observing a participant, one can see details of their approach; for example, the use of any filename conventions, whether or not a particular file import dialogue is utilised, and if not, the observer can ask why not. The creative goals of our participants included the following examples:
  • Composing a short section of music.
  • Mixing a selection of pre-recorded audio stems.
  • Designing a sound to reflect a magic spell being cast in a fantasy game.
  • Audio editing to remove unwanted incidental sounds from recorded dialogue.
It is essential to highlight that a contextual enquiry of one hour in duration only provides enough time to focus on a mere snapshot of a sound creative’s workflow. However, this snapshot is rich and deep and arguably maintains greater ecological validity than a semi-structured interview.
Conducting the study online and remotely was a double-edged sword. This approach enabled the researchers to connect with international participants in their studio environment, increasing diversity with a low monetary outlay. The flip side was that using Zoom video conferencing software somewhat disrupted the (at times) complex audio configuration utilised by participants. Occasionally, it was not possible for participants to use their regular audio interface and new audio configurations introduced latency—the arch-nemesis of the sound creative.

3.2. Analysis

For this article, the authors extracted salient quotes from the interview transcripts deemed relevant to applying haptic technology to address access barriers. We identified discussions of haptic devices already incorporated into a participant’s artefact ecology and of how tangible and tactile interfaces can improve access and ease of task execution.
Furthermore, we captured participants’ perspectives on the opportunities presented by haptics in their practice as sound creatives. Of the twenty interviews conducted, we identified six as being particularly relevant to the topics covered in this article. We omitted the remaining interviews from the thematic analysis; however, the authors will present a full-scale analysis of all interview data in a forthcoming publication.
These quotations formed the source material upon which we performed a reflexive thematic analysis (TA). Braun and Clarke (2021) spearheaded this flavour of TA, arguing that all researchers are inherently situated and subjective. Furthermore, while academics embedded in a positivist research culture may push against subjectivity, it is an asset to be embraced. The reflexive researcher brings value (ibid.).
Braun and Clarke utilised a six-phase process when performing TA, with phase six amounting to writing a research report. It was this process to which we adhered.

3.2.1. Phase One—Familiarisation

The first phase involves immersion in the dataset, which, in our case, comprised quotations from interview transcripts. Here began each author’s immersion and subsequent familiarisation with the dataset. We were fortunate to each attend the six interviews and play an active role in each discussion. The AI-powered natural language processing tool Otter.AI21 provided the initial transcripts of each discussion. However, as each transcript was laced with minor errors, the authors divided the transcripts among themselves and proofread and corrected each, using the original recordings as a reference. While this was a somewhat arduous task, it contributed significantly to our goal of dataset immersion.
Each author selected quotations from the transcripts they personally edited. These extracts captured points salient to this article’s research domain: the potential of haptic technology within each participant’s ecology of creative practice. For us, salience took precedence over the frequency of ideas expressed.

3.2.2. Phase Two—Coding

This first coding round consisted of assigning pithy tags and highlighting points of interest in the collated quotations. The authors were interested in both semantic and latent meanings in the text. However, the majority of codes highlighted the former.
The open-source web-based TA tool Taguette22 supported the coding process. While this software is constrained in functionality compared to NVivo23, it is largely accessible to screen-reader interaction. This feature is essential as one team member is visually impaired, and collaboration is critical in our work. Another factor is that, by necessity, we worked remotely, a workflow supported by Taguette. We shared a code book that initially captured succinct and somewhat reductive codes, such as ‘access barrier’, ‘community support’ and ‘efficiency is important’. However, these codes saw frequent revision, expanding their detail and specificity. The goal was for these codes to stand alone and reduce the need to return to the original text for explanation.

3.2.3. Phase Three—Generating Initial Themes

We moved to Excel24 spreadsheet software to generate initial themes from the codes. This platform is also largely accessible to screen-reader interaction and supports remote collaboration. Braun and Clarke argued that themes do not lie within a collection of codes awaiting discovery. Instead, they are generated by the researcher(s) as they reflect on the codes, their knowledge and their experience. While we focused on the six interviews we deemed most pertinent to haptic technology, our experience and insights from participation in the other fourteen interviews inherently also coloured and informed our perspectives. We grouped codes by categories, such as ‘access strategy’, ‘role of industry’ and ‘benefits of haptics’. Some codes appeared in multiple categories. From these categories, candidate themes emerged.

3.2.4. Phase Four—Developing and Reviewing Themes

We spent several hours online discussing, developing and refining themes. At times, the authors returned to the quotations to clarify specific points, a process beneficial in corroborating emergent insights, which ensured that we stayed close to the source material.

3.2.5. Phase Five—Refining, Defining and Naming Themes

This phase was largely an extension of the previous one. However, we also considered the potential overlap between themes and their differences. Fundamentally, we wanted to ensure that the themes depict an accurate and compelling representation of the study participants’ thoughts, opinions and experiences.

3.2.6. Author Statements of Positionality

A key component of reflexive thematic analysis is the acknowledgement of the positionality of the researchers—i.e., the researchers’ experiences, cultural contexts and world views that may inform how we relate to the research and our participants (Holmes 2020). Here, we provide positionality statements for the authors who conducted the interview studies and subsequent analysis.
Jacob Harrison is a middle-class, thirty-one-year-old white British male who identifies as non-disabled. His background is in academic research focusing on accessibility in music making and how the design of new technologies can impact disabled people’s access to music, whether positively or negatively. He also uses DAWs in his freelance work as a recording studio Session Leader with learning-disabled young people.
James Cunningham is a Christian, visually impaired, white, twenty-five-year-old male from Northern Ireland. He is an experienced composer and improviser. At the time of this paper’s publication, he is studying for a PhD at Queen’s University Belfast, combining his artistic practice with an exploration of accessible music technology.
Alex Lucas is a middle-class, forty-year-old, white English male who identifies as non-disabled. Alex has a background in product design in the music-technology industry. His primary role was the design of user interfaces on hardware synthesisers. His academic research focuses on accessibility and inclusion in music-making practices. Several years ago, this author worked as a part-time support worker for Mencap’s children’s services. This experience of working with a diverse range of children shaped Alex’s view on disability, seeing it as highly situated and unique to the individual.
Alex also spent some time as a sound technician working with the now-disbanded Surrey Heath Talking Newspaper for the Blind, recording local news items to be distributed by audio cassette tape. The latter experience helped Alex to appreciate that some cultural forms are unavailable to VIB people and that solutions to such barriers sometimes involve antiquated technology.

4. Discussion of Themes

We identified six themes, each split into sub-themes. Alongside each theme, we present illustrative quotes from the interview transcripts. The quotes are presented verbatim but are edited for brevity and readability in some places. We use our participants’ names without pseudonyms (with permission) and the names of the first three authors who conducted the interviews.
  • Theme 1: In the absence of support from the mainstream music technology industry, online communities have found it necessary to step in to make improvements.
This theme illustrates a common observation among our participants on the role of online communities of VIB sound creatives and the impact of the music-technology industry. Many of our participants are active members of mailing lists, forums and messaging groups relating to access tools and techniques. These online communities are typically DAW-centric, for example, the Reapers without Peepers group, whose members share tips and strategies on using Reaper software and contribute to ongoing open-source Reaper accessibility initiatives, such as the OSARA extension. Participation can come in the form of asking/answering questions, providing tutorials and sharing custom scripts to bridge gaps in software access. In the case of Reaper, the developers of the software itself engage with online communities of VIB users in order to improve accessibility and address bugs:
Conor McPhilemy: “So those guys [Reaper developers] have been the most receptive, apparently to, you know, to feedback from the blind community. So they literally have open channels there so bugs are going on all the time from the guys on the Reapers without Peepers list. And they are, they are getting stuff fixed for us, do you know what I mean? All the time, they are, they are fixing stuff for us. They’re being proactive with all their new software, you know, when they are releasing new software and stuff like that, they are thinking about us, they are thinking about accessibility, they are thinking about the way things are laid out and stuff like that. So I think that is one of the reasons why Reaper has been super successful.”
While the presence of these online communities was generally seen by participants as a positive thing, some noted that these communities are required in the absence of full accessibility support from music technology companies:
Justin Macleod: “[Kontakt] is just not accessible and easy to use out of the box. You cannot make patches from scratch in their synths. You cannot make—you cannot sample your own instruments with Kontakt, you get what they spoon feed you and that is it. Furthermore, you cannot use Native Instruments effectively without a bunch of extra scripts and AutoHotKey things and whatnot. So for example, to use Kontakt to batch re-save content, to get it to load faster, I need a script, which I had to purchase from someone else. Access 4 Music, they do great stuff, you know, I do not begrudge them the money but it should not be the case.”
  • Theme 1.1: The priorities of the music-technology industry directly impact the range of options available to VIB sound creatives.
Many tools and services in the music-technology industry are not accessible, possibly because accessibility is overlooked or deprioritised in development backlogs. For example, with some DAWs now implementing support for Dolby Atmos, one participant implied that software vendors have not considered VIB users.
Joey Stuckey: “I will say, I am very concerned about Dolby Atmos and you know that kind of immersive-audio experience because right now it’s totally inaccessible to the blind and it’s here to stay.”
Furthermore, software that may be considered ancillary to a DAW but an essential part of a professional workflow may feature access barriers.
James Cunningham: “So you’re not able to use these other programs because they do not support screen reader access?”
Justin: “That is certainly true in the case of Basehead and Soundly when I last checked them—I haven’t checked out Soundminer personally, but I am pretty confident that it’s not accessible.”
  • Theme 1.2: Online communities support learning and development.
For sighted people, there are often numerous ways to learn how to use new music technology, such as through instruction manuals or video demonstrations. These resources may be inaccessible to VIB sound creatives due to the use of graphical diagrams or incompatibility with screen-reader software. Guides may focus only on a typical workflow for a sighted person (e.g., using the mouse instead of keyboard-based navigation). It is common for VIB sound creatives to look to online communities for tutorials geared towards VIB workflows. Such resources can be presented in accessible formats, for example, via podcasts or by formatting text-based materials to be screen-reader-friendly. Informal help and advice are often sought through these communities, for example, by seeking recommendations on software:
Conor: “[Referring to the Reapers without Peepers mailing list] So if you were looking at a new, you know, a new plugin that you’d seen was on special offer or something like that. You can throw an email up there and say, guys, anybody using this, anybody any thoughts on this? Is everything labelled? Can you get at the presets? Can you get at the parameters and all that sort of stuff? So there is real good feedback there.”
  • Theme 1.3: Screen readers are an imperfect access strategy due to the lack of support from mainstream software companies.
Screen readers rely on relevant information being made available via attributes specified in code by software developers. For example, in web browsing, the correct use of HTML tags allows screen-reader software to correctly parse and navigate the information contained in a website, such as heading structures, hyperlinks and image descriptions. Screen readers require similar information from DAWs, for example, correctly and appropriately labelled plugin parameters in the source code. If this support is lacking, the quality of information the screen reader relays to the user is negatively affected.
Justin: “Basically, you have some plugin developers, you look at their—so let’s say there’s a slider with a four stage switch for an LFO shape, so it’d be sines or square or triangle. On some plugins, you will see that: the screen reader will represent it as sine, square, triangle. In other plugins, it’ll be just zero to one, there’ll be a slider going from zero to one. So you have to know that you need to bring your slider to 0.25 to make this work, and you have to know what 0.25 is going to be. So there is a lot of things that can trip you up in a plugin.”
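Justin's example above can be illustrated with a minimal sketch. The function and parameter names here are hypothetical, not drawn from any real plugin API; the point is only to show why exposing a value-to-text mapping changes what a screen reader can announce.

```python
# Hypothetical sketch: why a value-to-text mapping matters for
# screen-reader output. Without it, only the raw normalised value
# (0-1) can be spoken; with it, the user hears a meaningful label.

def announce(param_name, normalised_value, value_to_text=None):
    """Return the string a screen reader might speak for a parameter."""
    if value_to_text is not None:
        return f"{param_name}: {value_to_text(normalised_value)}"
    # No mapping exposed: the user must memorise what 0.25 means.
    return f"{param_name}: {normalised_value:g}"

# A hypothetical four-position LFO-shape switch.
SHAPES = ["sine", "square", "triangle", "saw"]

def lfo_shape_text(v):
    # Map the continuous 0-1 range onto the four discrete positions.
    index = min(int(v * len(SHAPES)), len(SHAPES) - 1)
    return SHAPES[index]

print(announce("LFO shape", 0.25, lfo_shape_text))  # "LFO shape: square"
print(announce("LFO shape", 0.25))                  # "LFO shape: 0.25"
```

In the second case, exactly as Justin describes, the user must already know that 0.25 corresponds to a particular waveform before the control becomes usable.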
  • Theme 1.4: Developments in technology have helped improve access, led by community and industry efforts.
Generally speaking, accessibility continues to improve for VIB sound creatives. These improvements are partly due to technological developments, leading to computers being powerful enough to run DAWs reliably alongside screen readers and hardware capable of supporting TTS feedback. However, improvements in accessibility are not solely a result of technological developments. Efforts from the VIB community and industry have leveraged these developments for accessibility purposes.
Peter Bosher: “At the time, I had a DX7, and an Alesis sequencer, before I even knew Sequencer Plus, so I knew how to do sequencing. But then I had a PC. And it was obvious that it was much better to use a PC with speech, so that I could, rather than trying to figure out what was going on, on a sequencer where I had to remember the sequences of button presses, and yeah, you know, having speech feedback was such a huge game changer.”
Peter: “In Pro Tools with speech … you can do this in two different ways. You can have what is called the Inspector where it will speak the level every now and then. And so you can tell roughly where it’s peaking. And you can say ‘I want it to hold for three seconds so that I don’t miss any peaks’. You can go into a particular track. So you need to find the track on the screen and interact with it. It’s all doable, but it takes longer. And Flo Tools has made some of that much quicker. So you can use a keystroke just to see quickly which tracks are selected, which used to take a long time before, just if you weren’t sure which tracks are selected, they will immediately tell you. You can immediately find out your whereabouts on the timeline, how long the selection is, stuff like that, which is super helpful. So Flo Tools has made it even more accessible.”
  • Theme 1.5: The experience of VIB people can provide valuable insight in the design of accessible music-technology tools.
VIB people are experts in their tools, workflows and lived experience of disability. Software developers and hardware designers can follow accessibility standards (for example, the Web Content Accessibility Guidelines (WCAG) in the case of website design). However, insight into how well a piece of technology meets access requirements can only come through the inclusion of VIB people in the design and development process. An example of this is the way that Reaper developers regularly engage with their VIB users. It was also common for our participants to mention their willingness to engage with developers to report bugs and feedback and their mixed experiences with how well their requests were met.
  • Theme 2: Access comes through a multitude of factors not limited to technology.
Many of our participants work in broad creative ecosystems. These ecosystems do not merely contain technological artefacts (for example, DAWs, plugin instruments and effects, screen readers, scripts and MIDI controllers) but also include the less tangible online tutorials, documentation and online communities with their collective knowledge and experience. Existing within and managing such an ecosystem can often be challenging for VIB sound creatives, especially in times of change, such as major software updates or when introducing new technological artefacts. Some participants voiced frustrations around conflicting or malfunctioning tools, leaving parts of their workflow ineffective.
Peter: “So if you’ve got existing tracks, and you want to stick a particular spot effect at this point, you don’t want to be creating a new track, you want to drag the clip from wherever it is, and put it on your track. And there’s a keystroke to do it in Flo Tools, but it doesn’t work very well. So the first time you do it, it will probably say drag failed or drop failed. And it’s because they’ve had to go through these hoops to simulate the mouse click process, but it doesn’t, it doesn’t work well at all.”
  • Theme 2.1: VIB sound creatives invest time in developing their own access strategies.
VIB sound creatives often encounter numerous accessibility issues within their complex ecosystems. Given the highly specialised nature of many of these issues and a perceived lack of support from the technology industry, many of our participants demonstrated unique workarounds: through online research, community input and trial and error, they found solutions that often required a substantial investment of their time to create.
  • Theme 2.2: Hardware and screen readers are complementary methods of interaction for VIB creatives.
Many of our participants’ uniquely curated workflows were highly multimodal, combining auditory and tactile feedback. In particular, we noticed several people using their screen reader in conjunction with a keyboard controller or dedicated control surface, supporting efficiency in their workflow.
Peter: “So normally, the easy way to do that [edit EQ parameters precisely] is with the control surface, which then puts the main bands that I want access to—the high shelf, the low mid band, high mid band—are on pairs of faders. So I have the gain and the frequency on the faders. And that way, it’s much easier to adjust the EQ that way.”
Alex Lucas: “I guess when you’re using that [control surface] you’re not receiving any VoiceOver feedback?”
Peter: “Yes, you are, because Flo Tools has a very nice thing called the plugin inspector, which when you switch it on, it actually speaks to the parameters as they change. So it’s very nice.”
Alex: “Ah, great, and that is kind of optional, you can choose to either have it on or off, I guess?”
Peter: “Yeah.”
We also observed how some participants would deliberately not use their screen reader’s TTS feedback, relying solely on the familiar key commands to manipulate the audio without the speech disturbing concentration.
Conor: “So let’s say dialling in a bit of EQ on a snare or something like that. So you know, if you need to be you need to be super precise, and you’re moving up and down a fader, I have seen Scott, and what you can do in NVDA is turn the speech off. So you’re not turning off NVDA but you’re turning off the speech”.
  • Theme 2.3: A mental map can replace information in the visual domain, acting as a reference point, aiding usability but requiring effort.
Some of our participants demonstrated a reliance on memory when navigating menus or timelines. They remember what they encounter to construct a mental map, which they can reference later. In this way, memory can become an effective tool for increasing usability and efficiency in the workflow.
Jacob Harrison: “Is the information that the screen reader is giving you, is that useful? Are you happy for it to read that out and then wait for it to stop talking?”
Conor: “Yeah, I am because it’s given me the name of the presets if I want to remember. As I know that one—I was on this something smooth, wasn’t it? So yeah, no, it’s useful.”
  • Theme 2.4: Supporting materials need to be accessible to maintain independent learning.
Learning how to use new technology was a common experience described by our participants. They described how online tutorials are often inaccessible and designed visually for sighted, mouse-oriented users. Without accessible learning material, VIB sound creatives often need to rely on instruction manuals provided by the developer of their tool. If these materials are not screen-reader compatible, a person might be forced to sacrifice independence and find sighted assistance or begin using the tool completely unaware of its functionality.
Alex: “What’s the kind of process like learning that gear? Are you sort of able to find accessible resources?”
Trey Culver: “I can read the manuals accessibly, but I have sighted help to help me on the mouse.”
  • Theme 3: All access barriers are inequivalences, but not all inequivalences are access barriers.
It can be argued that one key product of an access barrier is inequality. An access barrier restricts possible options, presenting the VIB sound creative with a user experience that differs from that of their sighted peers; the experience of the two is not equivalent. This problematic seed can grow and cause secondary issues, for instance, inequivalences in employment, education, and social and creative opportunities.
  • Theme 3.1: VIB sound creatives share common creative and professional goals with sighted users but have different workflows and unequal opportunities.
Several of the study’s participants expressed creative and professional goals that one might consider common to sound creatives more broadly, with or without visual impairment or blindness. For instance, the goal of forging a career as an audio engineer or music producer in a professional studio is typical of sighted sound creatives.
It is apparent that in bringing accessibility to their DAW, VIB sound creatives need to employ a workflow that fundamentally differs from sighted conventions. The screen reader affords an alternative control modality in which information is articulated through synthesised utterances and features and functionality are accessed through QWERTY keyboard-based navigation. It would be too bold to claim that VIB sound creatives are essentially operating a different piece of software. However, the experience is undeniably different to that of their sighted peers. The challenge brought by this discrepancy primarily relates to a need for a shared experience with those outside of the VIB sound creative community.
  • Theme 3.2: There are fewer employment opportunities for VIB sound creatives.
Except for Logic Pro and arguably Pro Tools, DAWs do not typically provide full access to their features and functionality through screen-reader interaction. Access comes through bricolage, a combination of hardware and software, each element plugging a particular gap in accessibility. Such tools are not typically installed in professional recording studios. Therefore, VIB sound creatives often cannot simply enter such a facility, roll up their sleeves and engage with the task at hand. This lack of appropriate tools has a negative impact on employment opportunities. A degree of out-of-the-box DAW screen-reader compatibility would likely go far in addressing this workplace inequality.
Peter: “You still can’t just rock up at a studio that you’ve not been to before and use the equipment. That would be the dream, and it’s miles away from being anywhere near feasible. And so it’s still very difficult, I think, for a blind person to just get an ordinary job in the mainstream studio.”
  • Theme 3.3: VIB sound creatives may not necessarily be at a disadvantage in sound-based creative practice.
As with many observations, we cannot sit solely on one side of the fence. It is not simply the case that inequivalences are innately negative, leading to barriers in access. On the contrary, the VIB DAW workflow may be preferable in some aspects of sound-based creative practice.
Maja Sobiech: “I have heard about sound engineers who purposefully turn (off) their screens or do not look at them to not be distracted.”
As stated by Maja above, sometimes visual stimuli can be distracting. Furthermore, we witnessed incredible efficiency by some VIB sound creatives in completing specific DAW-based tasks, employing an arsenal of keyboard shortcuts and commands.
  • Theme 4: Hardware may bring enhancements to VIB sound creatives.
Commercially available devices incorporating haptic feedback were rare among our participants. However, one participant had experience with the Haptic Wave prototype (Tanaka and Parkinson 2016), while another had previously used a custom device for peak metering, which incorporated a modified braille cell. Despite haptic technologies featuring less prominently than other tools among our participants, many reported the advantages of hardware in the form of MIDI controllers, control surfaces, analogue mixing desks and modular synthesisers. While many sighted sound creatives also regularly use commercially-available hardware, the specific benefits of such hardware to VIB sound creatives are relevant to the haptic-audio community. Participants implied that analogue hardware aided usability due to the predictability and consistency that 1:1 mapping between control and effect affords, aiding efficient workflows.
Joey: “The other thing about analogue is, I know, the third knob on the left does the same thing every single time. I know every button on my console. So I do not even have to think about that. In the digital domain, that’s not true, you have soft keys that change their function depending on what is going on screen. And if you don’t have access to that information it is a real problem. So that is one of the—that’s the main reason, from an accessibility standpoint, that I like analogue.”
It is worth noting that this particular benefit of hardware can be lost in the digital domain because digital devices can map the same knob, slider or button to a wide range of parameters. Some manufacturers have addressed this issue, including Native Instruments with their Komplete Kontrol range of MIDI controller keyboards and Arturia’s equivalent offering, Keylab.
An additional benefit of hardware that is relevant to both analogue and digital devices is the ability to focus on audio that is the subject of an edit or processing operation, as opposed to audio feedback from the screen reader:
Maja: “For example, if I am able to switch to using a braille display with Pro Tools, I think it will be beneficial for my work because I don’t have to use VoiceOver that much. And I don’t distract my hearing, then it might be useful there.”
A number of our participants also modified their existing hardware, for example, using a Dremel to make a tactile mark on a knob cap or using embossed braille tape on a MIDI keyboard.
Joey: “If they don’t have a marking that shows you where the arrow is a knob, I get someone that can see to take a Dremel or a knife. And you know, it kind of defaces the equipment a little bit, but I make a mark. So I can feel where the arrow’s at so I know where I’m turning.”
Maja: “I made signs in braille on the tape, which I could stick to the different places on the keyboard.”
  • Theme 4.1: Awareness of the potential of haptics for DAW access is varied.
Among our participants, some were more aware than others of haptic technologies and their potential benefits. One participant, in particular, suggested how haptic technology could be utilised for mixing spatial audio, for example, with Dolby Atmos:
Joey: “Then, there’s haptics, which I really believe in… I was a big part of the Haptic Wave study and was a big fan of that… So there are haptics, which I think would benefit the sighted user as much as blind user. I think tactile information is just another way to process things. I think if we could make it something where a sighted user also was excited about it, I think we could get further with it [in regard to making it widely available].”
Joey: “[Discussing the inaccessibility of Dolby Atmos] I have got ideas about that, but none of them are fully formed. However, if we could get haptics involved, where you’re able to have a 3D space, what if you had something like, and you know, this is very primitive… However, what if you had something like a Rubik’s Cube, where you could tell the piece of equipment, okay, block number one is left front, block number four is the fourth speaker to the right overhead. Then, you could move those around, you know, in a physical way.”
Other participants did not make direct reference to haptic technology but discussed methods and tools that could be described as haptic devices, for example, a modified braille cell that could be used for peak metering without the need for auditory feedback:
Peter: “There is a guy who [makes] tactile PPM [Peak Program Meter] meters… [using] existing braille cell technology. However, that could be recreated now, much, much more effectively and cheaply.”
Alex: “Is it noisy, mechanically, when you use it?”
Peter: “No, not at all. That reminds me that there is like an audio peak meter, which is noisy. And you’d have to, you’d have to use headphones sort of split in order to hear it, but it would still interfere with listening to the actual audio. So I’m not really sold on those.”
  • Theme 4.2: Tactile methods can improve accessibility, learning and communication between sighted and non-sighted people.
Some of our participants described experiences of using tactile methods in their education as a means of conveying visual information. One participant described how their music teacher helped them visualise how a musical score looked to sighted people.
Joey: “My regular guitar teacher, honestly taught me more than I learned in college. He was brilliant. And honestly, the only reason I was able to keep up in college was because he had already taught me that stuff and taught it in a way that made sense to a blind person, with no help with braille or anything like that. It was all mental imagery. So he took a box of sand and drew in it and said ‘I know you can’t see this. I know you can’t read it. But I want you to at least know and understand what the musical staff looks like. And what these quarter notes look like and a whole note looks like so at least if somebody described it to you on a page, you can understand it.’”
There is potential for tactile and tangible objects to communicate concepts often presented visually. Such an approach presents learning and education opportunities for VIB people, as described by our participant, but also potentially widens the communication bandwidth between VIB and sighted people.
  • Theme 5: Screen readers are an imperfect access strategy.
Screen readers are only one tool in a VIB sound creative’s ecosystem and address only some access barriers; their functionality is limited. Access barriers can arise when software designers and developers do not consider the screen-reader interaction modality.
  • Theme 5.1: Screen readers are an imperfect access strategy due to the limitations of the TTS modality.
Screen readers possess one major limitation: they convey information linearly. Screen readers invoke a speech description of most elements they encounter, meaning that it can take a considerable amount of time for a VIB person to parse the information required to execute a task. Furthermore, it can be highly challenging to garner information from multiple sources concurrently via speech feedback alone. This issue often leads to a preference among VIB users for naming conventions that state the most contextually essential information first. Similarly, our participants discussed the problems they encounter when their screen reader is excessively verbose or distracting when working with audio.
Alex: “Yeah, I was wondering about the screen reader, the synthesised voice that is generated by NVDA. I’m thinking about that and listening to that whilst working with audio. Is that ever problematic for you? Do you ever find that there is a conflict between the two?”
Conor: “Yes, aye totally yeah, yeah, that is something you have to get used to as well. So you can obviously split it. So I have NVDA at the minute coming through my cans. When recording, I’m kind of jumping between both. Sometimes if I’m recording guitars, I will move NVDA over to the laptop speaker, so the synthetic voice will come out the laptop speaker, and it will keep my headphones clean. As sometimes if you’re halfway through recording a track, for some reason, NVDA might decide to talk or might decide to tell you that the battery’s low or something like that, you know what I mean. So it is something to be aware of.”
Conor: “You can hear how NVDA, might distract you a wee bit, because you’re hearing my voice, and you’re maybe not hearing it move as it pans over and stuff like that. So I would just, you know, have a listen to it afterwards. And make sure it’s where I want it to be. So it’s quite chatty when you come to that… So what I will do this time—I will stop NVDA and I will show you what I mean, by hitting Control just to stop it talking… I will just shut it up. You know, so I do not need to hear all that. But if I do want to have a listen to see what have I got on this track again, it’ll let you continue on.”
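The earlier point about naming conventions can be illustrated with a minimal, hypothetical sketch (the label formats below are invented for illustration, not drawn from any particular DAW or screen reader). Because speech output is strictly linear, the word order of a label determines how long a listener must wait before hearing the information that distinguishes one item from another.

```python
# Hypothetical sketch: two naming conventions for the same track label.
# A screen reader speaks each label left to right, so the position of
# the distinguishing word ("Kick", "Snare") changes parsing time.

def context_last(name, number):
    # Essential information arrives last; the listener must sit
    # through the repeated "Track N, audio" prefix every time.
    return f"Track {number}, audio, {name}"

def context_first(name, number):
    # Essential information arrives first and can be acted on
    # immediately; the boilerplate trails behind.
    return f"{name}, track {number}, audio"

tracks = [("Kick", 1), ("Snare", 2), ("Bass", 3)]

for name, number in tracks:
    print(context_last(name, number))
for name, number in tracks:
    print(context_first(name, number))
```

With the context-first form, a user scanning a long track list by ear can skip to the next item as soon as the first word is spoken, rather than waiting through an identical prefix on every entry.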
  • Theme 5.2: Screen readers are an imperfect access strategy due to the lack of support from mainstream software companies.
Screen readers are complex tools that require technical skills and an understanding of user requirements to support effectively in software. Additionally, they are specialised tools, meaning some manufacturers, designers and developers may be unaware of their existence. With comprehensive support for the screen-reader interaction modality in music software, VIB sound creatives can avoid significant access barriers in their creative pursuits. Some of our participants told us that a lack of screen-reader support excludes a software platform as a potential candidate for integration into their artefact ecology.
Trey: “In my case, take for example, things like Massive, FM8. Yeah, now, sure, we can tweak a few presets right now and that’s fine. That’s brilliant. It’s more accessible than 10-12 years ago. Can I make a patch from scratch on either of them? No.”
Peter: “Basically they didn’t have the version of Pro Tools that spoke, so I was not able to use Pro Tools directly, then.”
  • Theme 5.3: Vision-centric design can result in screen-reader compatibility being overlooked, resulting in access barriers.
Screen-reader compatibility is overlooked when software is designed solely with visual users in mind. This situation can result in unintuitive navigation (through software parameters inadvertently grouped in unusual ways), missing information (such as parameter names) and, in the case of highly visual interfaces (such as those that depict waveforms graphically), a complete lack of functionality.
Conor: “I think the visual people who have the visual access to it, yes, I think they’re getting through stuff [editing tasks] a lot quicker. It’s designed for them, you know, everything’s designed visually for them, you know, what I mean, the way things are laid out and stuff like that. So it’s up to us to find our, you know, the quickest way for me to do it might necessarily not be the quickest way for people who can see to do it. So I think aye definitely we probably are at a bit of a disadvantage, you know, from design wise and away, you know, they like to make things look pretty and you know, things, they like the graphics and they like stuff to look—whereas for me, that’s all kind of irrelevant, like, Can I, can I just get a simple layout, you know, with my buttons or labels and stuff like that, were things make sense. And I can jump from one side to the other side, back and forth as quick as I can and things like that.”
  • Theme 5.4: Vision-centric design creates a discrepancy between sighted and non-sighted workflows.
When access barriers arise, VIB sound creatives often rely on solutions that radically alter their workflow from the typical sighted approach. These approaches are often shared amongst online communities and widely adopted as functional access strategies. For example, one of our participants described how he separates the screen-reader TTS feedback and the audio from Reaper to different sources.
Justin: “[Sighted people] could do whatever they wanted. Absolutely, yeah, they could load stuff up and load this patch and load that patch and find this instrument and manually import this—and make sure this was integrated with the Komplete Kontrol database and just load up this synth and start from scratch and build the—sighted people can do all of that. We can’t. We dive through the presets in alphabetical order very slowly. And there are 1000s of presets. And then we find what we like, and we fiddle with it. And we hope the controls which we would like to fiddle with are exposed. And maybe they are and that’s great. And then that’s how it goes yeah. There’s so much that sighted people can access with the whole Native Instruments deal, and we can’t.”
  • Theme 6: Accessibility is not a VIB sound creative’s only priority.
Many of our participants discussed having priorities that were not limited to issues of accessibility. There are many reasons behind a VIB sound creative’s choice of tools and workflow. Many of these reasons are common across all music-technology users and not unique to VIB people—for example, financial costs.
A common concern among participants was the ability to work as quickly or efficiently as sighted peers. This concern is a particular issue for professional VIB sound creatives, who are at a disadvantage in employment if they cannot compete with their sighted counterparts.
Peter: “That’s another thing where, however much we’d like to be able to say, ‘okay, we can compete on an equal footing with a sighted sound engineer’, it’s not true, because some of the things that you have to do will simply, they just take longer, they just do. And if you’ve got something that has to be done in two hours. And it’s something a sighted person would be able to do in two hours because they can ‘click click click’ and it’s done—we have to use the workaround.”
  • Theme 6.1: VIB sound creatives do not just want things to be possible; they want things to be usable.
While some of our participants reported being able to use several tools, many highlighted frustration with usability issues, particularly in comparison to sighted people’s experience. For example, tools such as custom accessibility scripts and VOCR can compensate for several usability flaws. However, these workarounds still require additional effort and place an extra strain on VIB people.
Justin: “If you asked Scott [Chesworth—Reaper accessibility specialist/tutor] how to use plugin A, B, or C, he would, I believe, be the most likely to be able to make it go to the best possible amount. That’s because he knows all the workarounds, and is best at implementing them. Which doesn’t really mean it’s an accessible thing. It just means he knows how to fight with it, and win. So I don’t know all the workarounds. So it could be that stuff is slightly more doable than I believe, in some cases. But if you don’t know the password to open the door, then the door isn’t going to open. And the point is that for sighted people, there is no password, you load up your plugin, you see what’s on the screen, you click the buttons, you read the manual… [For] blind people interacting in these environments, it’s ‘how much experience have you had with this? How much are you prepared to bang your head against walls? How much are you prepared to trial and error?’”
  • Theme 6.2: VIB sound creatives place value on the use of common tools and workflows.
There are perceived advantages to using the same tools and practices that are popular with sighted sound creatives. The use of common tools allows ease of communication between sighted and non-sighted people, supporting collaboration. It is also important to note that different products come with an attached social capital, for example, Ableton Live software, which is common in electronic music production. While other DAWs may currently be more accessible to VIB people than Live, there is a desire to use this software for its recognisability within a particular genre and the common experiences that this can support:
Trey: “In my subjective experience from production, yes I do feel disadvantaged. I can’t use Ableton for a start. But everyone else is using Ableton. It’s a massive disadvantage in a few ways. Number one, it affects your learning. Number two, it affects your social capital if you turn up and someone goes ‘what are you using? Ableton?.’ And then everyone speaks to you. It’s a good way to break the ice, you know, and it’s a recognised thing.”

5. Discussion

By comparing the themes generated from our interviews with the latest efforts in accessibility and haptic-audio research, we can consider how haptics may play a role in addressing access barriers experienced by VIB sound creatives. It is important to highlight that only one of our participants has used the technology mentioned in our discussion of existing haptics research, and we hypothesise that the general absence of such technology in our findings suggests potentially unexplored opportunities. We now discuss our findings in relation to these opportunities, which could provide a jumping-off point for future accessibility-related haptic-audio research.

5.1. Which Access Barriers Could Haptic Technology Address?

An obvious advantage of haptics in the VIB accessibility domain is that haptic technologies are inherently tactile and tangible, a common attribute of many access tools and strategies employed by VIB people. The most prominent example of tactile, tangible access is in the use of braille, whether reading braille-embossed text, accessing web content via a braille display, navigating public spaces using braille signage or using braille music to read scores.
Some participants described how teachers incorporated tactile and tangible teaching methods into their education. Others discussed how hardware, sometimes with modifications, benefited their workflow. Here, we outline specific areas in which we believe haptic technologies could leverage or support the benefits of tactile and tangible interaction to improve accessibility, based on the common access barriers encountered by our participants.

5.1.1. Effect Parameters and Mixing

Some of our participants with experience in analogue recording studios or who currently use external hardware discussed the difference between tangible, physical devices and modern, highly graphics-based software. As vision-centric music software has proliferated, so too have access barriers for VIB sound creatives.
Before music studios were based around a computer running DAW software, analogue equipment provided tactile means of navigating and manipulating channel strips and effects parameters. Some of our participants continue to use analogue equipment, citing this tactility as an advantage over working in the digital domain. Other participants use outboard gear, such as MIDI controllers, audio interfaces and modular synthesisers. The tactility and ease of navigation of some hardware was mentioned as something VIB sound creatives found helpful in their workflows, with many examples of modified or DIY hardware in use.
The above observation suggests that the predictability and consistency of hardware is something VIB sound creatives value. Despite this, there is a trend towards hardware devices whose physical design is agnostic to the functions they may be assigned, in order to maintain flexibility in mapping. It is common to see hardware that features potentiometers with no tactile markings, end stops or detents so that they can be mapped to any number of functions within a DAW or plugin.
While this is a valuable feature for sighted users, VIB sound creatives frequently need a way of determining the control’s current state and its associated parameter. VIB sound creatives can value predictability and consistency in the structure and function of interfaces, which is easily lost in generic control surfaces with several modes of operation.
A method of working around the limited tactility, predictability and consistency of some commercial hardware is TTS support, which allows VIB people to query the current mapping and value of a given fader, knob or button. Haptic feedback could extend this approach further: for example, by simulating detents on a continuous potentiometer, or by communicating functionality via ‘haptons’, the use of distinct haptic textures to delineate different areas of an interface, as explored by Pesek and De Man (2021).
This approach would bypass the auditory modality, allowing the sound creative to focus on the immediate task. Haptic feedback could also help distinguish between parameter types, using consistent vibration patterns to signify groups of related parameters and thereby indicating a control’s function without TTS feedback.
A hardware project that we believe shows some promise is the Smart Knob, an open-source haptic device developed by Scott Bezek32. The Smart Knob incorporates a brushless motor and magnetic encoder to provide ‘closed-loop torque feedback control, making it possible to dynamically create and adjust the feel of detents and endstops’. The project is not explicitly aimed at music-technology use cases; however, it is possible to imagine a similar implementation of this technology within other hardware.
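As an illustration of how such detents might be rendered in firmware, the following sketch computes a restoring torque for a motorised knob. This is a minimal, hypothetical example in the spirit of the Smart Knob’s closed-loop feedback, not code from the project itself; the function name, constants and units are our own assumptions.

```python
import math

def detent_torque(angle, num_detents=12, strength=0.6,
                  min_angle=0.0, max_angle=2 * math.pi):
    """Virtual detent/endstop torque for a motorised knob (illustrative).

    Returns a restoring torque (arbitrary units) that pulls the knob
    toward the nearest detent centre, with stiff virtual endstops.
    """
    # Stiff virtual endstops: push back hard when out of range.
    if angle < min_angle:
        return 8.0 * (min_angle - angle)
    if angle > max_angle:
        return -8.0 * (angle - max_angle)
    # Spring-like torque toward the nearest detent centre.
    spacing = (max_angle - min_angle) / num_detents
    nearest = min_angle + round((angle - min_angle) / spacing) * spacing
    return -strength * (angle - nearest)
```

Because the detent positions, count and stiffness are all software parameters, a device built this way could present a different tactile layout for each mapped plugin, which is exactly the kind of dynamic, reconfigurable tactility that generic control surfaces currently lack.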

5.1.2. Peak Metering

Several participants cited peak metering as a common task for which few options provided a DAW experience equivalent to that of a sighted user. For sighted people, a glance at an on-screen mixer or channel strip is enough to rapidly determine whether a track is clipping and to give an overview of the relative amplitude levels of a multitrack project. There are few options capable of providing this same overview of volume levels in a project without sight. Our participants gave examples of workarounds for this scenario, for example, via DIY devices or by using 32-bit recording to allow for more headroom to avoid clipping during the recording stage.
Incorporating haptic feedback, potentially via haptic displays, might offer VIB people a means of peak metering, which is more equivalent to the ‘glance’ afforded to sighted people. This approach has already been explored to some extent by the Haptic Wave (Tanaka and Parkinson 2016). However, this relies on the user scrubbing across a single waveform, so it does not provide an overview of multiple tracks concurrently.
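To make the idea concrete, the sketch below maps per-track peak levels to intensities for a hypothetical multi-actuator vibrotactile display, one actuator per track, flagging clipping tracks so they could be rendered with a distinct warning pattern. The function and its parameters are illustrative assumptions, not an existing API.

```python
def haptic_meter_frame(track_peaks, clip_threshold=1.0):
    """Map per-track peak levels to actuator intensities (illustrative).

    Each track drives one vibrotactile actuator at an intensity from
    0.0 to 1.0; tracks at or above the clip threshold are marked so the
    display can render a distinct 'warning' pattern.
    """
    frame = []
    for peak in track_peaks:
        clipping = peak >= clip_threshold
        intensity = min(abs(peak) / clip_threshold, 1.0)
        frame.append({"intensity": intensity, "clipping": clipping})
    return frame

# A four-track 'glance': the third track is clipping.
frame = haptic_meter_frame([0.2, 0.8, 1.3, 0.0])
```

Felt across several actuators at once, a frame like this would approximate the parallel overview a sighted user gets from a bank of on-screen meters, rather than the serial, one-track-at-a-time inspection that screen-reader workflows impose.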

5.1.3. Spatial Audio

We were intrigued by one participant’s comments on the potential of haptic devices to support work with Dolby Atmos; they described a Rubik’s-cube-like device that would allow the user to work with spatial audio in a ‘physical way’. Working with spatial audio currently requires a means of visualising a 3D space on-screen in order to pan audio objects. This scenario could be well served by a tactile device capable of communicating spatial information non-visually. Indeed, haptic rendering of 3D objects and environments is a well-explored topic (Basdogan and Srinivasan 2002), and similar techniques are used in specialist areas, such as technology-enhanced education in medical settings (Escobar-Castillejos et al. 2016).
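As a sketch of the mapping such a tactile device would need, the following converts a 3D position reported by a hypothetical handheld controller (listener at the origin, y pointing forward, z up) into azimuth, elevation and distance for an object panner. Real spatial-audio renderers define their own coordinate conventions, so this is illustrative only.

```python
import math

def position_to_pan(x, y, z):
    """Convert a 3D controller position to panning parameters (illustrative).

    Returns (azimuth_deg, elevation_deg, distance): 0 degrees azimuth is
    straight ahead, positive azimuth is to the right.
    """
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, y))
    elevation = math.degrees(math.asin(z / distance)) if distance > 0 else 0.0
    return azimuth, elevation, distance
```

A tangible controller exposing this mapping would let a VIB sound creative place an audio object by feel, with the device’s own geometry standing in for the on-screen 3D view.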

5.2. How Are New Tools Used and Appropriated by VIB Sound Creatives?

While the above suggestions for developments in haptic-based accessibility tools may prove to be exciting or valuable lines of inquiry, there are perhaps more fundamental and potentially more complex questions to be considered first. As accessibility researchers, we are interested in the political and social implications of technology and the practical benefits that new technologies may offer. Prior to developing new tools, we might ask ourselves the following questions:
  • Do haptic technologies represent the values and needs of VIB sound creatives?
  • Should modern DAWs be accessible without the need for additional specialist tools?
  • Does technology explicitly designed for VIB sound creatives risk further siloing of a community who often aim to be considered as equivalent to their sighted peers?
Many of the people that we spoke with are professionals or people with significant experience interacting with DAW software via screen readers and software extensions. The response was mixed when we discussed whether they felt at a disadvantage compared to their sighted peers. We noted that many VIB sound creatives had developed incredibly efficient workflows with the available tools.
Some participants stated that they worked more effectively than sighted people in certain scenarios, for example, due to their mastery of keyboard shortcuts for rapid navigation and manipulation within DAW software. When discussing the introduction of new technologies into people’s workflows, it is essential to consider that many already have workflows that work well for them. We should be careful not to discard the work already done by people with lived experience of visual impairment or blindness by proposing radically different ways of working as solutions to access barriers.
In contrast, while some of these existing workflows and tools have proven to be highly effective for some people, we also noted that this came after a considerable amount of additional time and effort in learning. In many cases, this effort also involved the development of bespoke tools or paying for custom keyboard scripts and additional software. The current model for DAW accessibility for VIB people appears to be largely based on DIY efforts and open-source, community-led software, although music-technology manufacturers are increasingly demonstrating efforts to improve accessibility out of the box. Either way, we suggest that the onus should not necessarily be on VIB people and unpaid, community-led efforts to work around access barriers.
The observations above reveal a tension: we must acknowledge the existing efforts of VIB people and communities to develop tools and workflows that work for them, while also aiming to lessen this burden in the future and to lower barriers to access for VIB people who have not yet begun their learning journey with a DAW.
An additional factor when considering the use of new, specialist tools within this community is the potential for groups to be siloed or organised around a particular tool or software. Some participants reported the potential social impacts of not having access to a commonly used tool. These inequities are particularly stark when a tool is considered ‘industry standard’, affecting employment prospects, or is strongly associated with a particular genre, which could hinder building social connections with people working in that genre. The design of new tools should consider the importance of compatibility with common tools and workflows as a means of leveraging shared knowledge and supporting social connections around a particular tool.

6. Conclusions

This article combined existing research in the haptic-technology domain with insights provided by VIB sound creatives. These insights reflect our participants’ unique lived experiences and should be considered valuable when considering future design avenues in the AMT domain. As researchers and authors, we acknowledge that we work subjectively and reflexively. However, counter to positivist objections, our subjectivity and reflexivity can add richness and depth to our findings. We both named our participants and quoted them extensively to highlight the relevance and agency of their voices, both individually and collectively.
We identified several access barriers that could be partially or fully overcome through the careful implementation and provision of new haptic prototypes. Whilst these design possibilities present an exciting opportunity for developing new technologies, we urge the community to recognise the importance of the socio-political factors that arose from our thematic analysis. VIB sound creatives often self-organise into internationally supported online communities that emphasise inclusive practice and invest significant time and effort into producing free, open-source access strategies. Preserving these values of inclusivity and availability, as well as the existing collection of accessible VIB workflows, should be at the heart of any design proposals moving forward.

Author Contributions

Conceptualization, J.H., A.L., J.C., A.P.M. and F.S.; methodology, J.H., A.L. and J.C.; formal analysis, J.H., A.L., J.C.; investigation, J.H., A.L. and J.C.; data curation, J.H., A.L. and J.C.; writing—original draft preparation, J.H., A.L. and J.C.; writing—review and editing, J.H., A.L. and J.C.; supervision, A.P.M., F.S.; project administration, A.L., A.P.M. and F.S.; funding acquisition, A.L., A.P.M. and F.S. All authors have read and agreed to the published version of the manuscript.

Funding

The UK Arts and Humanities Research Council funded the research described in this article under grant reference number AH/V011340/1.

Institutional Review Board Statement

The Ethics Committee of the School of Arts, English and Languages at Queen’s University Belfast gave ethical approval to this study on 15 December 2021.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available on request. A publicly available repository of study data will be made available before the end of the project.

Acknowledgments

The authors would like to take this opportunity to thank all participants involved in the Bridging the Gap project to date.

Conflicts of Interest

The authors declare no conflict of interest.

Notes

1.
2.
3.
4. https://www.nvaccess.org/ (accessed on 26 June 2023)
5.
6. https://github.com/chigkim/VOCR (accessed on 26 June 2023)
7. https://flotools.org/ (accessed on 26 June 2023)
8. https://www.avid.com/pro-tools (accessed on 26 June 2023)
9. https://www.stevebaskis.com/ptaccess/ (accessed on 26 June 2023)
10.
11.
12. https://osara.reaperaccessibility.com/ (accessed on 26 June 2023)
13. https://www.reaper.fm/ (accessed on 26 June 2023)
14. https://groups.io/g/rwp (accessed on 26 June 2023)
15. https://groups.google.com/g/ptaccess (accessed on 26 June 2023)
16.
17. https://www.freelists.org/list/midimag (accessed on 26 June 2023)
18. https://tanvas.co/ (accessed on 26 June 2023)
19.
20.
21. https://otter.ai/ (accessed on 26 June 2023)
22. https://www.taguette.org/ (accessed on 26 June 2023)
23. https://lumivero.com/products/nvivo/ (accessed on 26 June 2023)
24.
25. http://access4music.com/en/script/free (accessed on 26 June 2023)
26.
27.
28.
29. Unfortunately, no details of this bespoke device can be found online.
30.
31. https://www.arturia.com/ranges/lab (accessed on 26 June 2023)
32. https://github.com/scottbez1/smartknob (accessed on 26 June 2023)

References

  1. Baker, David, Ann Fomukong-Boden, and Sian Edwards. 2019. ‘Don’t follow them, look at me!’: Contemplating a haptic digital prototype to bridge the conductor and visually impaired performer. Music Education Research 21: 295–314.
  2. Basdogan, Cagatay, and Mandayam A. Srinivasan. 2002. Haptic rendering in virtual environments. In Handbook of Virtual Environments. Boca Raton: CRC Press, pp. 157–74.
  3. Bødker, Susanne, and Clemens Nylandsted Klokmose. 2011. The human–artifact model: An activity theoretical approach to artifact ecologies. Human–Computer Interaction 26: 315–71.
  4. Braun, Virginia, and Victoria Clarke. 2021. Thematic Analysis: A Practical Guide. London: SAGE Publications Ltd.
  5. Escobar-Castillejos, David, Julieta Noguez, Luis Neri, Alejandra Magana, and Bedrich Benes. 2016. A review of simulators with haptic devices for medical training. Journal of Medical Systems 40: 1–22.
  6. Holmes, Andrew Gary Darwin. 2020. Researcher Positionality–A Consideration of Its Influence and Place in Qualitative Research–A New Researcher Guide. Shanlax International Journal of Education 8: 1–10.
  7. Holtzblatt, Karen, and Hugh Beyer. 1997. Contextual Design: Defining Customer-Centered Systems. San Francisco: Morgan Kaufmann Publishers Inc.
  8. Hutchinson, Hilary, Wendy Mackay, Bo Westerlund, Benjamin B. Bederson, Allison Druin, Catherine Plaisant, Michel Beaudouin-Lafon, Stéphane Conversy, Helen Evans, Heiko Hansen, and et al. 2003. Technology probes: Inspiring design for and with families. Paper presented at the SIGCHI Conference on Human Factors in Computing Systems, Fort Lauderdale, FL, USA, April 5–10; pp. 17–24.
  9. ISO. 2011. 26800: 2011: Ergonomics—General Approach, Principles and Concepts. Geneva: International Organization for Standardization.
  10. Karp, Aaron, and Bryan Pardo. 2017. HaptEQ: A collaborative tool for visually impaired audio producers. Paper presented at the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences, London, UK, August 23–26; pp. 1–4.
  11. Kawarazaki, Noriyuki, Yuhei Kaneishi, Nobuyuki Saito, and Takashi Asakawa. 2014. A supporting system of choral singing for visually impaired persons using depth image sensor. Journal of Robotics and Mechatronics 26: 735–42.
  12. Kilian, Jakob, Alexander Neugebauer, Lasse Scherffig, and Siegfried Wahl. 2022. The unfolding space glove: A wearable spatio-visual to haptic sensory substitution device for blind people. Sensors 22: 1859.
  13. Krishna, Sreekar, Shantanu Bala, Troy McDaniel, Stephen McGuire, and Sethuraman Panchanathan. 2010. Vibroglove: An assistive technology aid for conveying facial expressions. Paper presented at the CHI’10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, GA, USA, April 10–15; pp. 3637–42.
  14. McDaniel, Troy, Sreekar Krishna, Vineeth Balasubramanian, Dirk Colbry, and Sethuraman Panchanathan. 2008. Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. Paper presented at the 2008 IEEE International Workshop on Haptic Audio Visual Environments and Games, Ottawa, ON, Canada, October 18–19; pp. 13–18.
  15. Metatla, Oussama, Nick Bryan-Kinns, Tony Stockman, and Fiore Martin. 2015. Designing with and for people living with visual impairments: Audio-tactile mock-ups, audio diaries and participatory prototyping. CoDesign 11: 35–48.
  16. O’Modhrain, M. Sile, and Brent Gillespie. 1997. The moose: A haptic user interface for blind persons. Paper presented at the Third WWW6 Conference, Santa Clara, CA, USA, April 7–11.
  17. O’Modhrain, Sile, Nicholas A. Giudice, John A. Gardner, and Gordon E. Legge. 2015. Designing media for visually-impaired users of refreshable touch displays: Possibilities and pitfalls. IEEE Transactions on Haptics 8: 248–57.
  18. Payne, William Christopher, Alex Yixuan Xu, Fabiha Ahmed, Lisa Ye, and Amy Hurst. 2020. How blind and visually impaired composers, producers, and songwriters leverage and adapt music technology. Paper presented at the 22nd International ACM SIGACCESS Conference on Computers and Accessibility, Virtual Event, Greece, October 26–28; pp. 1–12.
  19. Pesek, Jakub, and Brecht De Man. 2021. TouchEQ: An eyes-free audio equalizer for a surface haptic interface. Paper presented at the 150th Audio Engineering Society Convention, Virtual Event, May 25–28.
  20. Ramloll, Rameshsharma, Wai Yu, Stephen Brewster, Beate Riedel, Mike Burton, and Gisela Dimigen. 2000. Constructing sonified haptic line graphs for the blind student: First steps. Paper presented at the Fourth International ACM Conference on Assistive Technologies, Arlington, VA, USA, November 13–15; pp. 17–25.
  21. Tanaka, Atau, and Adam Parkinson. 2016. Haptic wave: A cross-modal interface for visually impaired audio producers. Paper presented at CHI’16, San Jose, CA, USA, May 7–12; New York: Association for Computing Machinery, pp. 2150–61.
  22. Zelek, John S., Sam Bromley, Daniel Asmar, and David Thompson. 2003. A haptic glove as a tactile-vision sensory substitution for wayfinding. Journal of Visual Impairment & Blindness 97: 621–32.