Challenges and Opportunities of Force Feedback in Music
Abstract
1. Introduction
FF&M
2. Previous Works
2.1. Force-Feedback Devices
2.1.1. 1-DoF Devices
2.1.2. How Many Further DoF Are Necessary for What?
- In percussion playing, the performer holds the stick at one end while the other end is launched in a ballistic gesture toward the target. The rebound force at impact (cf. Bouënard et al. 2010; Van Rooyen et al. 2017) is transmitted along the stick to the player’s hand, where it is experienced as a torque. Torques play an active role in percussion performance, for instance by influencing the timing of subsequent hits.
- In violin bowing, the performer holds the bow at the ‘frog’, while the hair–string contact point moves away from the frog over the course of a down-bow stroke. Several works, e.g., by Nichols (2000), O’Modhrain et al. (2000) and Tache et al. (2012), have simulated bowing interactions, most using three or fewer DoF; the second version of Nichols’s vBow (Nichols 2002) used four. As shown by Schoonderwaldt et al. (2007), forces such as the bow’s weight pull the string orthogonally to the bow, while both the pressure the player applies to the string and the rotation around the strings that selects which string is bowed appear as torques once translated from the bow–hair contact point along the bow to the player’s hand.
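The lever-arm relation underlying both examples can be sketched in a few lines. This is an illustrative sketch of the physics (torque = lever arm × force), not code from any of the cited systems; the function names are ours:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples or lists."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def torque_at_hand(grip, contact, force):
    """Torque felt at the grip when `force` acts at `contact`: tau = r x F."""
    r = tuple(c - g for c, g in zip(contact, grip))  # lever arm, grip -> contact
    return cross(r, force)

# Example: a 1 N rebound force pushing straight up at a stick tip 0.4 m
# from the grip produces a 0.4 N*m torque about the hand.
tau = torque_at_hand((0.0, 0.0, 0.0), (0.4, 0.0, 0.0), (0.0, 0.0, 1.0))
```

Because the torque scales linearly with the lever arm, a contact point that moves away from the grip (as in a down-bow stroke) changes the torque felt at the hand even when the contact force is constant, which is one reason translational DoF alone are not enough to render these interactions.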
2.1.3. 3-DoF and 6-DoF Devices
2.1.4. Multi-DoF Force-Feedback Devices
2.2. Software Environments
2.2.1. Physical Modelling for Audio–Haptic Synthesis
CORDIS-ANIMA
DIMPLE
Synth-A-Modeler
MIPhysics
ForceHost
2.2.2. Force Feedback for Sample-Based Music Creation
3. Challenges
3.1. Modularity
3.2. Replicability
3.3. Affordability
3.4. Usability
4. Opportunities
- embedding audio and haptic software into hardware modules
- networking multiple modules with distributed control
- authoring with audio-inspired and audio-coupled tools
4.1. Embedding
4.2. Networking
4.3. Authoring
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Beamish, Timothy, Karon Maclean, and Sidney Fels. 2004. Manipulating music: Multimodal interaction for DJs. Paper presented at CHI’04: CHI 2004 Conference on Human Factors in Computing Systems, Vienna, Austria, April 24–29; New York: Association for Computing Machinery, pp. 327–34.
- Berdahl, Edgar, and Alexandros Kontogeorgakopoulos. 2013. The FireFader: Simple, Open-Source, and Reconfigurable Haptic Force Feedback for Musicians. Computer Music Journal 37: 23–34.
- Berdahl, Edgar, and Julius O. Smith III. 2012. An introduction to the synth-a-modeler compiler: Modular and open-source sound synthesis using physical models. Paper presented at 10th International Linux Audio Conference (LAC-12), Stanford, CA, USA, April 12–15; p. 7.
- Berdahl, Edgar, Peter Vasil, and Andrew Pfalz. 2016. Automatic Visualization and Graphical Editing of Virtual Modeling Networks for the Open-Source Synth-A-Modeler Compiler. In Haptics: Perception, Devices, Control, and Applications. Lecture Notes in Computer Science. Edited by Fernando Bello, Hiroyuki Kajimoto and Yon Visell. Cham: Springer International Publishing, pp. 490–500.
- Bouënard, Alexandre, Marcelo M. Wanderley, and Sylvie Gibet. 2010. Gesture Control of Sound Synthesis: Analysis and Classification of Percussion Gestures. Acta Acustica United with Acustica 96: 668–77.
- Burdea, Grigore C., and Philippe Coiffet. 2003. Virtual Reality Technology, 2nd ed. Hoboken: John Wiley & Sons.
- Cadoz, Claude, Annie Luciani, and Jean-Loup Florens. 1984. Responsive Input Devices and Sound Synthesis by Simulation of Instrumental Mechanisms: The Cordis System. Computer Music Journal 8: 60–73.
- Cadoz, Claude, Annie Luciani, and Jean-Loup Florens. 1993. CORDIS-ANIMA: A modeling and simulation system for sound and image synthesis: The general formalism. Computer Music Journal 17: 19–29.
- Cadoz, Claude, Annie Luciani, Jean-Loup Florens, and Nicolas Castagné. 2003. ACROE—ICA artistic creation and computer interactive multisensory simulation force feedback gesture transducers. Paper presented at International Conference on New Interfaces for Musical Expression, Montréal, QC, Canada, May 22–24; pp. 235–46.
- Cadoz, Claude, Leszek Lisowski, and Jean-Loup Florens. 1990. A Modular Feedback Keyboard Design. Computer Music Journal 14: 47–56.
- Calegario, Filipe, João Tragtenberg, Christian Frisson, Eduardo Meneses, Joseph Malloch, Vincent Cusson, and Marcelo M. Wanderley. 2021. Documentation and Replicability in the NIME Community. Paper presented at International Conference on New Interfaces for Musical Expression, Online, June 14–18.
- Calegario, Filipe, Marcelo M. Wanderley, João Tragtenberg, Johnty Wang, John Sullivan, Eduardo Meneses, Ivan Franco, Mathias Kirkegaard, Mathias Bredholt, and Josh Rohs. 2020. Probatio 1.0: Collaborative development of a toolkit for functional DMI prototypes. Paper presented at International Conference on New Interfaces for Musical Expression, Birmingham, UK, June 25–27.
- Chu, Lonny L. 2002. Using Haptics for Digital Audio Navigation. Paper presented at International Computer Music Conference, Gothenburg, Sweden, September 16–21; pp. 118–21.
- Conti, François, Federico Barbagli, Dan Morris, and Christopher Sewell. 2005. CHAI: An open-source library for the rapid development of haptic scenes. Paper presented at IEEE World Haptics, Pisa, Italy, March 18–20.
- Covaci, Alexandra, Longhao Zou, Irina Tal, Gabriel-Miro Muntean, and Gheorghita Ghinea. 2018. Is Multimedia Multisensorial?—A Review of Mulsemedia Systems. ACM Computing Surveys 51: 1–35.
- Erkut, Cumhur, Antti Jylhä, Matti Karjalainen, and M. Ercan Altinsoy. 2008. Audio-Tactile Interaction at the Nodes of a Block-based Physical Sound Synthesis Model. Paper presented at 3rd International Haptic and Auditory Interaction Design Workshop (HAID), Jyväskylä, Finland, September 15–16; pp. 25–26.
- Florens, Jean-Loup. 1978. Coupleur Gestuel Retroactif pour la Commande et le Contrôle de Sons Synthétisés en Temps-réel. Ph.D. thesis, Institut National Polytechnique de Grenoble, Grenoble, France.
- Florens, Jean-Loup, Annie Luciani, Claude Cadoz, and Nicolas Castagné. 2004. ERGOS: Multi-degrees of Freedom and Versatile Force-Feedback Panoply. Paper presented at EuroHaptics, Munich, Germany, June 5–7.
- Frisson, Christian, Benoît Macq, Stéphane Dupont, Xavier Siebert, Damien Tardieu, and Thierry Dutoit. 2010. DeviceCycle: Rapid and Reusable Prototyping of Gestural Interfaces, Applied to Audio Browsing by Similarity. Paper presented at 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia, June 15–18.
- Frisson, Christian, François Rocca, Stéphane Dupont, Thierry Dutoit, Damien Grobet, Rudi Giot, Mohammed El Brouzi, Samir Bouaziz, Willy Yvart, and Sylvie Merviel. 2014. Tangible needle, digital haystack: Tangible interfaces for reusing media content organized by similarity. Paper presented at 8th International Conference on Tangible, Embedded and Embodied Interaction, TEI’14, Munich, Germany, February 16–19; New York: Association for Computing Machinery, pp. 37–38.
- Frisson, Christian, Gauthier Keyaerts, Fabien Grisard, Stéphane Dupont, Thierry Ravet, François Zajéga, Laura Colmenares Guerra, Todor Todoroff, and Thierry Dutoit. 2013. MashtaCycle: On-Stage Improvised Audio Collage by Content-Based Similarity and Gesture Recognition. In Intelligent Technologies for Interactive Entertainment. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Edited by Matei Mancas, Nicolas d’Alessandro, Xavier Siebert, Bernard Gosselin, Carlos Valderrama and Thierry Dutoit. Berlin/Heidelberg: Springer International Publishing, pp. 114–23.
- Frisson, Christian, Mathias Kirkegaard, Thomas Pietrzak, and Marcelo M. Wanderley. 2022. ForceHost: An open-source toolchain for generating firmware embedding the authoring and rendering of audio and force-feedback haptics. Paper presented at 22nd Conference on New Interfaces for Musical Expression, NIME’22, Online, June 28–July 1.
- Frisson, Christian. 2013. Designing tangible/free-form applications for navigation in audio/visual collections (by content-based similarity). Paper presented at 7th International Conference on Tangible, Embedded and Embodied Interaction, TEI’13, Barcelona, Spain, February 10–13; New York: Association for Computing Machinery, pp. 345–46.
- Frisson, Christian. 2015. Designing Interaction for Browsing Media Collections (by Similarity). Ph.D. thesis, Université de Mons, Mons, Belgium.
- Ghinea, Gheorghita, Christian Timmerer, Weisi Lin, and Stephen R. Gulliver. 2014. Mulsemedia: State of the Art, Perspectives, and Challenges. ACM Transactions on Multimedia Computing, Communications, and Applications 11: 17:1–17:23.
- Gillespie, Brent. 1992. Touch Back Keyboard. Paper presented at International Computer Music Conference, San Jose, CA, USA, October 14–18; pp. 447–47.
- Halata, Zdenek, and Klaus I. Baumann. 2008. Human Haptic Perception: Basics and Applications. Chapter Anatomy of Receptors. Basel: Birkhäuser Verlag, pp. 85–92.
- Kaltenbrunner, Martin, and Florian Echtler. 2018. The TUIO 2.0 Protocol: An Abstraction Framework for Tangible Interactive Surfaces. Proceedings of the ACM on Human-Computer Interaction 2 (EICS): 1–35.
- Kaltenbrunner, Martin, Till Bovermann, Ross Bencina, and Enrico Costanza. 2005. TUIO: A Protocol for Table-Top Tangible User Interfaces. Paper presented at 6th International Workshop on Gesture in Human-Computer Interaction and Simulation, Berder Island, France, May 18–20.
- Kirkegaard, Mathias, Mathias Bredholt, Christian Frisson, and Marcelo M. Wanderley. 2020. TorqueTuner: A Self Contained Module for Designing Rotary Haptic Force Feedback for Digital Musical Instruments. Paper presented at International Conference on New Interfaces for Musical Expression, Birmingham, UK, July 25–27; pp. 273–78.
- Leonard, James, and Jérôme Villeneuve. 2019. Fast Audio-Haptic Prototyping with Mass-interaction Physics. Paper presented at International Workshop on Haptic and Audio Interaction Design (HAID2019), Lille, France, March 13–15.
- Leonard, James, and Jérôme Villeneuve. 2020. Design and Implementation of Real-time Physically-based Virtual Musical Instruments: A balancing act. Paper presented at Sound and Music Computing Conference, Torino, Italy, June 24–26.
- Leonard, James, Jérôme Villeneuve, and Alexandros Kontogeorgakopoulos. 2020. Multisensory instrumental dynamics as an emergent paradigm for digital musical creation. Journal on Multimodal User Interfaces 14: 235–53.
- Leonard, James, Jérôme Villeneuve, Romain Michon, Yann Orlarey, and Stéphane Letz. 2019. Formalizing Mass-Interaction Physical Modeling in Faust. Paper presented at 17th Linux Audio Conference, LAC, Stanford, CA, USA, March 23–26; p. 7.
- Leonard, James, Nicolas Castagné, Claude Cadoz, and Annie Luciani. 2018. The MSCI Platform: A Framework for the Design and Simulation of Multisensory Virtual Musical Instruments. Cham: Springer International Publishing, pp. 151–69.
- Malloch, Joseph, Stephen Sinclair, and Marcelo M. Wanderley. 2013. Libmapper: A library for connecting things. Paper presented at ACM CHI’13 Extended Abstracts on Human Factors in Computing Systems, Paris, France, April 27–May 2; pp. 3087–90.
- Mattos, Douglas Paulo De, Débora C. Muchaluat-Saade, and Gheorghita Ghinea. 2021. Beyond Multimedia Authoring: On the Need for Mulsemedia Authoring Tools. ACM Computing Surveys 54: 150:1–150:31.
- Morreale, Fabio, and Andrew McPherson. 2017. Design for Longevity: Ongoing Use of Instruments from NIME 2010-14. Paper presented at International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, May 15–18.
- Nichols, Charles. 2000. The vBow: A Haptic Musical Controller Human-Computer Interface. Paper presented at International Computer Music Conference, Berlin, Germany, August 27–September 1.
- Nichols, Charles. 2002. The vBow: Development of a virtual violin bow haptic human-computer interface. Paper presented at International Conference on New Interfaces for Musical Expression, Dublin, Ireland, May 24–26; pp. 133–36.
- Niyonsenga, Albert-Ngabo, Christian Frisson, and Marcelo M. Wanderley. 2022. TorqueTuner: A Case Study for Sustainable Haptic Development. Paper presented at 11th International Workshop on Haptic and Audio Interaction Design, HAID’22, London, UK, August 25–26.
- O’Modhrain, M. Sile, Stefania Serafin, Chris Chafe, and Julius O. Smith III. 2000. Qualitative and Quantitative Assessment of a Virtual Bowed String Instrument. Paper presented at International Computer Music Conference, Berlin, Germany, August 27–September 1.
- Oboe, Roberto. 2006. A Multi-Instrument, Force-Feedback Keyboard. Computer Music Journal 30: 38–52.
- Papetti, Stefano, and Charalampos Saitis, eds. 2018. Musical Haptics. Berlin/Heidelberg: Springer Open.
- Puckette, Miller S. 1997. Pure Data. Paper presented at International Computer Music Conference (ICMC), Thessaloniki, Greece, September 25–30.
- Rahman, H. A., T. P. Hua, R. Yap, C. F. Yeong, and E. L. M. Su. 2012. One degree-of-freedom haptic device. Procedia Engineering 41: 326–32.
- Saleme, Estêvão B., Alexandra Covaci, Gebremariam Mesfin, Celso A. S. Santos, and Gheorghita Ghinea. 2019. Mulsemedia DIY: A Survey of Devices and a Tutorial for Building Your Own Mulsemedia Environment. ACM Computing Surveys 52: 58:1–58:29.
- Schneider, Oliver, Karon MacLean, Colin Swindells, and Kellogg Booth. 2017. Haptic experience design: What hapticians do and where they need help. International Journal of Human-Computer Studies 107: 5–21.
- Schoonderwaldt, Erwin, Stephen Sinclair, and Marcelo M. Wanderley. 2007. Why Do We Need 5-DOF Force Feedback. Paper presented at 4th International Conference on Enactive Interfaces (ENACTIVE’07), Grenoble, France, November 19–22; pp. 397–400.
- Seifi, Hasti, Farimah Fazlollahi, Michael Oppermann, John Andrew Sastrillo, Jessica Ip, Ashutosh Agrawal, Gunhyuk Park, Katherine J. Kuchenbecker, and Karon E. MacLean. 2019. Haptipedia: Accelerating Haptic Device Discovery to Support Interaction & Engineering Design. Paper presented at 2019 CHI Conference on Human Factors in Computing Systems, Scotland, UK, May 4–9.
- Seifi, Hasti, Matthew Chun, Colin Gallacher, Oliver Stirling Schneider, and Karon E. MacLean. 2020. How Do Novice Hapticians Design? A Case Study in Creating Haptic Learning Environments. IEEE Transactions on Haptics 13: 791–805.
- Sheffield, Eric, Edgar Berdahl, and Andrew Pfalz. 2016. The Haptic Capstans: Rotational Force Feedback for Music using a FireFader Derivative Device. Paper presented at International Conference on New Interfaces for Musical Expression, Brisbane, Australia, July 11–15; pp. 1–2.
- Sinclair, Stephen, and Marcelo M. Wanderley. 2008. A Run-time Programmable Simulator to Enable Multimodal Interaction with Rigid Body Systems. Interacting with Computers 21: 54–63.
- Sinclair, Stephen, Gary Scavone, and Marcelo M. Wanderley. 2009. Audio-Haptic Interaction with the Digital Waveguide Bowed String. Paper presented at International Computer Music Conference, Montreal, QC, Canada, August 16–21; pp. 275–78.
- Tache, Olivier, Stephen Sinclair, Jean-Loup Florens, and Marcelo M. Wanderley. 2012. Exploring Audio and Tactile Qualities of Instrumentality with Bowed String Simulations. Paper presented at International Conference on New Interfaces for Musical Expression, Ann Arbor, MI, USA, May 21–23.
- Van Rooyen, Robert, Andrew Schloss, and George Tzanetakis. 2017. Voice coil actuators for percussion robotics. Paper presented at International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, May 15–18; pp. 1–6.
- Verplank, Bill, and Francesco Georg. 2011. Can Haptics Make New Music?—Fader and Plank Demos. Paper presented at International Conference on New Interfaces for Musical Expression, Oslo, Norway, May 30–June 1; pp. 539–40.
- Verplank, Bill, Michael Gurevich, and Max Mathews. 2002. THE PLANK: Designing a Simple Haptic Controller. Paper presented at International Conference on New Interfaces for Musical Expression, Dublin, Ireland, May 24–26; pp. 177–80.
- Villeneuve, Jérôme, Claude Cadoz, and Nicolas Castagné. 2015. Visual Representation in GENESIS as a tool for Physical Modeling, Sound Synthesis and Musical Composition. Paper presented at New Interfaces for Musical Expression, NIME, Baton Rouge, LA, USA, May 31–June 3; pp. 195–200.
- Wang, Johnty, Joseph Malloch, Stephen Sinclair, Jonathan Wilansky, Aaron Krajeski, and Marcelo M. Wanderley. 2019. Webmapper: A Tool for Visualizing and Manipulating Mappings in Digital Musical Instruments. Paper presented at 14th International Conference on Computer Music Multidisciplinary Research (CMMR), Marseille, France, October 14–18.
- Wright, Matthew, and Adrian Freed. 1997. Open SoundControl: A New Protocol for Communicating with Sound Synthesizers. Paper presented at International Computer Music Conference, Thessaloniki, Greece, September 25–30; pp. 101–4.
Frisson, C.; Wanderley, M.M. Challenges and Opportunities of Force Feedback in Music. Arts 2023, 12, 147. https://doi.org/10.3390/arts12040147