Eluding the Physical Constraints in a Nonlinear Interaction Sound Synthesis Model for Gesture Guidance

Abstract
In this paper, a flexible control strategy is proposed for a synthesis model dedicated to nonlinear friction phenomena. This model makes it possible to synthesize different types of sound sources, such as creaky doors, singing glasses, squeaking wet plates, or bowed strings. Based on the perceptual stance that a sound is perceived as the result of an action on an object, we propose a genuine source/filter synthesis approach that eludes the physical constraints induced by the coupling between the interacting objects. This approach allows the action and the object to be controlled independently and combined freely. Different implementations and applications related to computer animation, gesture learning for rehabilitation, and expert gestures are presented at the end of this paper.
Thoret, E.; Aramaki, M.; Gondre, C.; Ystad, S.; Kronland-Martinet, R. Eluding the Physical Constraints in a Nonlinear Interaction Sound Synthesis Model for Gesture Guidance. Appl. Sci. 2016, 6(7), 192.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.