Abstract
Two studies investigated the development of the perception of emotion in music. In Study 1, preschool children and adults matched nine pieces of music to five photographed facial expressions (happy, sad, angry, fearful and neutral). Although children did not agree with the adult majority interpretation for most pieces, their pattern of responding to the music, in both photograph choices and spontaneous verbal labels, was similar to the adults'. Important methodological differences between this and previous research could explain the inconsistencies. Study 2 used happy and sad music along with a dynamic visual display in an intermodal matching experiment with 5- to 9-month-old infants. Infants preferred the affectively concordant happy display but, contrary to prediction, did not look longer at the affectively concordant sad display. Taken together, these results suggest that emotional perception of music may arise from innate perceptual predispositions together with learned associations that develop in childhood.