Appl. Sci. 2018, 8(1), 135; https://doi.org/10.3390/app8010135
- How do people present melodic contour as shape?
- How important is vertical movement in the representation of melodic contour in sound-tracing?
- Are there any differences in sound-tracings between genders, age groups and levels of musical experience?
- How can we understand the metaphors in sound-tracings and quantify them from the data obtained?
2.1. Melody, Prosody and Memory
2.2. Melodic Contour
2.3. Analyzing Melodic Contour
2.4. Pitch and the Vertical Dimension
One problem concerns the musical referents of these terms. As metaphoric depictions, most of them relate more closely to visual and graphic representations of music than to its acoustical and auditory characteristics. Indeed, word-list typologies of melodic contour are frequently accompanied by ‘explanatory’ graphics.
2.5. Embodiment and Music
3. Sound-Tracing of Melodic Phrases
4.1. Feature Selection from Motion Capture Data
4.2. Feature Selection from Melodic Phrases
5. Analysis of Overall Trends
5.1. General Motion Contours
5.2. Relationship between Vertical Movement and Melodic Pitch Distribution
5.3. Direction Differences
5.4. Individual Subject Differences
5.5. Social Box
5.6. An Imagined Canvas
6. Mapping Strategies
- One outstretched hand, changing the height of the palm
- Two hands stretching or compressing an “object”
- Two hands symmetrically moving away from the center of the body in the horizontal plane
- Two hands moving together to represent holding and manipulating an object
- Two hands drawing arcs along an imaginary circle
- Two hands following each other in a percussive pattern
- Feature selection: Reduce the motion capture data to a six-column feature vector containing the (x,y,z) coordinates of the right palm and the left palm, respectively.
- Calculate quantity of motion (QoM): Compute the magnitude of the per-sample velocity vectors and average these magnitudes over each sample window.
- Segmentation: Trim the data using a sliding window of 1100 samples (5.5 s), which accommodates the 4.5 s average duration of the melodic phrases. The hop size is 10 samples, yielding a large set of windowed segments. The segments with the maximum mean QoM are then separated out to obtain a set of sound-tracings.
- Feature analysis: Calculate features from Table 4 for each segment.
- Thresholding: Minimize the six numerical criteria by thresholding the segments at two times the standard deviation of each of the computed features.
- Labeling and separation: Obtain tracings that can be classified as dominantly belonging to one of the six strategy types.
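The QoM-based segmentation steps above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, and a sampling rate of 200 Hz is inferred from the stated correspondence of 1100 samples to 5.5 s.

```python
import numpy as np

WIN, HOP = 1100, 10  # window (5.5 s at the assumed 200 Hz) and hop sizes


def quantity_of_motion(frames):
    """Mean per-sample velocity magnitude across both palms.

    frames: (n_samples, 6) array holding (x, y, z) for the right
    and left palm, as in the six-column feature vector above.
    """
    vel = np.diff(frames, axis=0)                         # per-sample displacement
    mags = np.linalg.norm(vel.reshape(-1, 2, 3), axis=2)  # speed of each hand
    return mags.mean()


def best_segment(frames):
    """Return the WIN-sample window with the highest mean QoM."""
    starts = range(0, len(frames) - WIN + 1, HOP)
    qoms = [quantity_of_motion(frames[s:s + WIN]) for s in starts]
    best = starts[int(np.argmax(qoms))]
    return frames[best:best + WIN]
```

Thresholding and strategy labeling would then operate on the windows returned by `best_segment`.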
- There is a clear arch shape in the averages of the motion capture data, regardless of the general shape of the melody itself. This may support the motor constraint hypothesis that has been used to explain the similar arch-like shape of sung melodies.
- The subjects chose between different strategies in their sound-tracings. We have qualitatively identified six such strategies and have created a set of heuristics to quantify and test their reliability.
- There is a clear gender difference for some of the strategies. This was most evident for Strategy 4 (representing small objects), which women performed more often than men.
- The ‘obscure’ strategy of representing melodies in terms of a small object, as is typical in Hindustani music, was also found in participants who had little or no exposure to this musical genre.
- The data show a tendency of moving within a shared ‘social box’. This may be thought of as an invisible space that people constrain their movements to, even without any exposure to the other participants’ tracings. In future studies, it would be interesting to explore how constant such a space is, for example by comparing multiple recordings of the same participants over a period of time.
Conflicts of Interest
| # | Feature | Description |
|---|---------|-------------|
| 1 | VerticalMotion | z-axis coordinates of each hand at each instant |
| 2 | Range | (Min, Max) tuple for each hand |
| 3 | HandDistance | Euclidean distance between the 2D coordinates of the hands |
| 4 | QuantityOfMotion | Sum of the absolute velocities of all the markers |
| 5 | DistanceTraveled | Cumulative Euclidean distance traveled by each hand per sample |
| 6 | AbsoluteVelocity | Uniform linear velocity across all dimensions |
| 7 | AbsoluteAcceleration | Derivative of the absolute velocity |
| 8 | Smoothness | Number of knots of a quadratic spline interpolation fitted to each tracing |
| 9 | VerticalVelocity | First derivative of the z-axis in each participant’s tracing |
| 10 | CubicSpline10Knots | 10 knots fitted to a quadratic spline for each tracing |
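Several of these motion features reduce to a few lines of NumPy. The sketch below is illustrative only, assuming per-hand trajectories sampled as (x, y, z) rows; the function and dictionary keys are hypothetical names, not the authors' code.

```python
import numpy as np


def motion_features(rh, lh):
    """A few of the per-tracing motion features listed above.

    rh, lh: (n_samples, 3) arrays of right- and left-palm positions.
    """
    feats = {}
    # Range: (min, max) of the vertical (z) axis, shown for the right hand
    feats["vertical_range"] = (rh[:, 2].min(), rh[:, 2].max())
    # HandDistance: Euclidean distance between the hands' 2D (x, y) coordinates
    feats["hand_distance"] = np.linalg.norm(rh[:, :2] - lh[:, :2], axis=1)
    # VerticalVelocity: first derivative of the z-axis
    feats["vertical_velocity"] = np.diff(rh[:, 2])
    # DistanceTraveled: cumulative path length per sample
    steps = np.linalg.norm(np.diff(rh, axis=0), axis=1)
    feats["distance_traveled"] = np.cumsum(steps)
    # QuantityOfMotion: sum of absolute velocities over both hands
    feats["qom"] = steps.sum() + np.linalg.norm(np.diff(lh, axis=0), axis=1).sum()
    return feats
```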
| # | Feature | Description |
|---|---------|-------------|
| 1 | SignedIntervalDirection | Interval direction (up/down) calculated for each note |
| 2 | InitialFinalHighestLowest | Four essential notes of a melody: initial, final, highest, lowest |
| 3 | SignedRelativeDistances | Feature 1 combined with the relative distance of each successive note from the next, counted in semitones |
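Extracting these three melodic features from a note sequence is straightforward. The sketch below assumes the melody is given as MIDI note numbers; the function name is hypothetical.

```python
import numpy as np


def melodic_features(midi_notes):
    """Contour features from a sequence of MIDI note numbers."""
    notes = np.asarray(midi_notes)
    intervals = np.diff(notes)      # semitone steps between successive notes
    # Feature 1: interval direction, +1 up, -1 down, 0 repeated note
    direction = np.sign(intervals)
    # Feature 2: the four essential notes of the melody
    landmarks = {
        "initial": notes[0], "final": notes[-1],
        "highest": notes.max(), "lowest": notes.min(),
    }
    # Feature 3: signed relative distances in semitones
    signed_distances = intervals
    return direction, landmarks, signed_distances
```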
| # | Strategy | Description | Heuristic | Mean | SD |
|---|----------|-------------|-----------|------|----|
| 1 | Dominant hand as needle | Right-hand QoM much greater than left-hand QoM | QoM(RH) ≫ QoM(LH) | 0.50 | 0.06 |
| 2 | Changing inter-palm distance | Root-mean-square difference of the left and right hands in x | RMS(LHX) − RMS(RHX) | 0.64 | 0.12 |
| 3 | Lateral symmetry between hands | Nearly constant difference between the left and right hands | RHX − LHX = C | 0.34 | 0.11 |
| 4 | Manipulating a small object | Right and left hands follow similar trajectories in x | RH(x,y,z) = LH(x,y,z) + C | 0.72 | 0.07 |
| 5 | Drawing arcs along circles | Fit of (x,y,z) for the left and right hands to a sphere | — | 0.17 | 0.04 |
| 6 | Percussive asymmetry | Dynamic time warp of (x,y,z) of the left and right hands | dtw(RH(x,y,z), LH(x,y,z)) | 0.56 | 0.07 |
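Two of the simpler strategy heuristics can be sketched as scalar scores that a threshold would then turn into labels. This is a minimal, assumed formulation (hypothetical function names), not the authors' implementation, and the threshold values themselves are not reproduced here.

```python
import numpy as np


def qom(hand):
    """Sum of per-sample speed for one hand's (n, 3) trajectory."""
    return np.linalg.norm(np.diff(hand, axis=0), axis=1).sum()


def strategy_1_score(rh, lh):
    """Dominant-hand heuristic: ratio of right- to left-hand QoM.

    A large ratio suggests the right hand moved as a 'needle'
    while the left hand stayed still.
    """
    return qom(rh) / max(qom(lh), 1e-9)  # guard against a motionless left hand


def strategy_2_score(rh, lh):
    """Inter-palm heuristic: RMS difference of the hands' x coordinates."""
    return np.sqrt(np.mean((rh[:, 0] - lh[:, 0]) ** 2))
```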
| Comparison | Value |
|------------|-------|
| Strategy 1 vs. rest | 0.003 |
| Strategy 2 vs. rest | 0.011 |
| Strategy 3 vs. rest | 0.005 |
| Strategy 4 vs. rest | 0.487 |
| Strategy 5 vs. rest | 0.003 |
| Strategy 6 vs. rest | 0.006 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).