Abstract
Music is a prevalent part of everyday life and there has been a great deal of interest in the possibility that music facilitates cognition, including memory. Listening to background music modulates internal mood and arousal states, which may shift listeners toward the levels needed to enhance memory performance. However, there has been little research on how music-induced mood and arousal influence other aspects of cognition, in particular attention. The aim of the current study was to examine the effect of background music on visual attention. Participants rated an assortment of music clips on mood and arousal levels. The clips that participants rated most positive or negative in mood and highest or lowest in arousal were used during an adaptation of the Posner cueing task (Posner, 1980). This visual attention task was performed either in silence or while listening to background music. A significant interaction between mood and arousal was observed. Participants were fastest when listening to high arousal positive music and slowest when listening to high arousal negative music. Intermediate performance occurred for low arousal negative and low arousal positive music. Thus, changes in music-induced mood and arousal can indeed alter reaction times, with opposite effects observed for high arousal music depending on whether it is perceived as positive or negative in mood. However, we found no evidence that musical mood and arousal affect attention: mood and arousal levels did not alter the effect of congruency on either reaction times or accuracy. Thus, although reaction times were faster in the presence of high arousal positive music, this appears unrelated to effects on attention.
Introduction
Music is a ubiquitous part of the human experience. Many everyday tasks such as studying, working, or driving are carried out in the presence of background music. However, it is unclear how the cognitive processes underlying these tasks are affected by background music. Several studies have found that listening to music can facilitate cognitive processes such as learning and memory (Greene et al., 2010; Husain et al., 2002; Thompson et al., 2001), but only a couple of studies have investigated other cognitive processes, such as attention (Chen et al., 2013; Jefferies et al., 2008). Therefore, the current study set out to examine the effect of background music on visual attention.
One potential mechanism that may explain how background music influences attention is that background music modulates the listener’s internal mood and arousal levels, which may then enhance performance (Aquino & Arnell, 2007; Husain et al., 2002; Jefferies et al., 2008). Mood refers to a positive or negative emotional state (e.g., happy or sad), whereas arousal refers to the level of physiological activation (e.g., energetic or tired). Investigating how background music affects attention is especially relevant because attention is a crucial requirement for many complex cognitive tasks, such as driving. Therefore, the current study used musical stimuli to modulate mood and arousal and measured the subsequent impact on task-related performance.
Mood and Arousal
Mood and arousal represent different but related aspects of emotions. According to the circumplex model of affect (Russell, 1980), emotional states can be characterized along two dimensions: valence, ranging from negative to positive, and arousal, ranging from low to high. Although the measures used in this study fit well into the arousal/valence terminology of Russell’s (1980) circumplex model, the terms “mood” and “arousal” are used here, consistent with previous research on music and cognition (e.g., Husain et al., 2002).
In the current study, musical mood and arousal were manipulated in order to examine their effects on attention. To our knowledge, only a few studies have examined the combined effects of music-induced mood and arousal on cognition generally (Greene et al., 2010; Husain et al., 2002; Thompson et al., 2001), and even fewer specifically on attentional processes (Jefferies et al., 2008). It is important to study these constructs together given that all music falls on a spectrum relative to the level of induced mood and arousal, which are not mutually exclusive (Russell, 1980). Thus, studying them in isolation is not representative of their rich modulatory and interactional effects on cognition, which most likely occur in real-world settings. For example, modulating these constructs separately does not significantly affect memory performance (Husain et al., 2002) whereas manipulating them together does (Greene et al., 2010; Husain et al., 2002; Thompson et al., 2001). Consequently, the effects of mood and arousal are examined independently and in conjunction here.
Mood and Attention
In the literature, the effects of mood and arousal on attention are typically manipulated independently (holding either mood or arousal constant). When mood is manipulated and arousal is held constant, participants in a positive mood frequently display enhanced attention, or a broadening of attentional breadth, compared with those in a negative mood (Ashby et al., 1999; Chen et al., 2013; Rowe et al., 2007). More specifically, participants shown positive pictures from the International Affective Picture System (IAPS) performed significantly better on an attentional blink task than those shown negative pictures (Olivers & Nieuwenhuis, 2006). The attentional blink task is a temporal attention paradigm that requires detecting particular targets, presented over time, within a series of distractors. Notably, people tend to find it difficult to identify a second target when it closely follows an earlier one, particularly when the time between the first and second stimulus is between 0 and 300 milliseconds (Shapiro et al., 1997). In this case, positive mood reduced the ‘attentional blink’ effect, or the difficulty of identifying a second target when it is temporally close to the first, by dispersing attentional resources outside of the scope of the main task (Fredrickson & Branigan, 2005). That being said, positive mood induction can also prove detrimental to performance, as individuals in a positive mood are more susceptible to distraction by irrelevant information (Biss et al., 2010; Dreisbach & Goschke, 2004; Schmitz et al., 2009). Positive mood promotes exploration of new information (Fredrickson, 1998, 2001), which can either benefit or impair performance depending on the task at hand.
On the other hand, negative mood has often been linked to poorer performance on a variety of cognitive tasks (Kovacs & Beck, 1977; Salovey, 1992; Sedikides, 1992). However, temporary negative mood inductions have yielded conflicting results (Isen, 1987, 1990; Olivers & Nieuwenhuis, 2006). Negative moods have been shown to narrow attentional breadth (Schmitz et al., 2009) and, in some cases, this narrowing can lead to enhanced performance when compared to positive or neutral moods (Dreisbach & Goschke, 2004; Rowe et al., 2007; Schmitz et al., 2009). For example, although negative mood appears to be less effective as a retrieval cue than positive mood (Olivers & Nieuwenhuis, 2006), certain kinds of negative moods have been linked to more specific attentional focus and cue utilization (Fenske & Eastwood, 2003; Isen, 1990; Schmitz et al., 2009). In the classic flanker task (Eriksen & Eriksen, 1974), participants who were induced to be in a positive mood displayed greater distractibility in response to the flankers than people in a negative or neutral mood (Rowe et al., 2007). Therefore, a negative mood may be beneficial to performance when external and/or irrelevant information needs to be ignored.
In the majority of studies that have examined the effect of mood on attention, music was not used to induce mood. The use of background music as a method of inducing mood and/or arousal, however, is highly beneficial and well established within the literature (Gabrielsson, 2001; Krumhansl, 1997; Peretz, 2001). In fact, in a meta-analysis conducted using principal component analysis, the regulation of mood and arousal was found to be the most important benefit of music, rated higher than even self-awareness and social relatedness, or the idea that music offers a means of social connectedness (Schäfer et al., 2013). In particular, when music is described as happy and fast (in tempo), mood improves and arousal increases, whereas music described as sad and slow worsens mood and decreases arousal (Husain et al., 2002). Furthermore, mood and arousal have an impact on cognition. For example, short-term memory performance is improved, in both children and adults, in the presence of pleasant and calming background music and diminished when listening to unpleasant and arousing background music (Greene et al., 2010; Hallam et al., 2002; Mammarella et al., 2007). Using musical stimuli to modify mood and arousal in the current study is therefore well motivated.
Arousal and Attention
In general, arousal is important in regulating many cognitive processes (Cahill & McGaugh, 1998). When a stimulus or an event is perceived as important, arousal increases to prepare the body for action (Yoon et al., 1999).
It seems that different tasks require different levels of arousal for optimal performance. Lower arousal is required for difficult or cognitively demanding tasks to facilitate attention and concentration, whereas higher arousal is required to increase motivation for tasks demanding endurance or persistence (Diamond et al., 2007). For example, on a visual search task, low arousal aided performance whereas high arousal proved detrimental (Smilek et al., 2006). High arousal stimuli also seem to capture attention better than low arousal stimuli, regardless of mood (Keil & Ihssen, 2004; Lang et al., 1993; Vogt et al., 2008). For example, participants chose to look longer at images rated high in arousal than at images rated low in arousal (Lang et al., 1993). In sum, the effects of arousal on attention appear to be task-dependent: high arousal can be beneficial or detrimental to performance depending on the task. We examined these differences further using music because it is an effective way to manipulate arousal (Schäfer et al., 2013).
Interaction of Mood and Arousal on Attention
Most existing research on mood and arousal has examined these constructs separately; however, little has been done with respect to their interaction. Evidence supports the importance of their interplay, given that these properties are highly interrelated.
In previous work, mood and arousal were altered, using visual stimuli, to examine their effects on visual attention (Aquino & Arnell, 2007; Schimmack & Derryberry, 2005). Results indicated that the more arousing the visual stimulus, the more detrimental it was to performance, regardless of mood (Aquino & Arnell, 2007; Schimmack & Derryberry, 2005). In the Schimmack and Derryberry (2005) study, participants were shown images, varying in mood and arousal, while simultaneously solving mathematical problems or attempting to detect the location of a line. Overall, highly arousing pictures proved distracting to performance of the tasks at hand, regardless of mood (positive or negative) (Schimmack & Derryberry, 2005). However, given the use of images that could be personally relevant (fears and sexual themes), these results should be interpreted with caution. This was highlighted in a study conducted by Fernandes et al. (2011), which showed that the highest levels of interference on digit parity tasks occurred in the high arousal negative and low arousal positive conditions. Participants were significantly slower in the high arousal negative and low arousal positive conditions, and the slower reaction times were accompanied by lower accuracy in the high arousal negative condition (Fernandes et al., 2011). However, there may be other factors to consider when examining attentional capture, particularly in the low arousal positive condition (Fernandes et al., 2011).
To our knowledge, only one research study to date has used musical stimuli to induce mood and arousal and examine the effect on attention (Jefferies et al., 2008). Musical excerpts were used to induce short-term emotional states: sad (low arousal negative mood), calm (low arousal positive mood), anxious (high arousal negative mood), and happy (high arousal positive mood). Next, participants performed an attentional blink task, where they were asked to identify letters and ignore digits. Participants in the anxious (HAN) group had the worst performance, consistent with the research described above (Fernandes et al., 2011). Meanwhile, participants in the sad (LAN) group had the highest level of performance, inconsistent with the Fernandes et al. (2011) study, which found that the high arousal positive group had the best reaction times and accuracy. The interpretation for this inconsistency was that low arousal negative mood has a finer temporal resolution (Derryberry & Tucker, 1994; Gasper & Clore, 2002), which either leads to an enhanced ability to switch between tasks or to filter external information more efficiently by ignoring irrelevant events (Jefferies et al., 2008; Olivers & Nieuwenhuis, 2006). Participants in the calm (LAP) and happy (HAP) groups had intermediate levels of performance. This study therefore highlighted the importance of assessing mood and arousal in combination, given that the effect of mood on attentional performance depended on the level of arousal, and vice versa.
The Current Study
Here, we examined how mood and arousal affected visual attention. To distinguish from past research (Jefferies et al., 2008), music clips were selected based on individual ratings of perceived mood and arousal. Music is a very subjective experience (Christenson & Peterson, 1988); therefore, these ratings enabled us to account for individual differences in the perception of these musical properties. The current study used the ratings to create subject-specific databases for each participant, with music rated as: high arousal positive mood (HAP), high arousal negative mood (HAN), low arousal positive mood (LAP), and low arousal negative mood (LAN). Participants then completed a visual attention task, an adaptation of the Posner cueing paradigm (Posner, 1980). In previous research using musical stimuli, participants performed an attentional blink task (Jefferies et al., 2008), which may reflect a filtering mechanism or a consolidation bottleneck (Broadbent, 1958; Jefferies et al., 2008; Shapiro et al., 1997). Attentional blink tasks measure temporal attention, whereas Posner (1980) cueing tasks are specifically associated with the shifting of spatial attention; the current experiment therefore examines a different aspect of attention. The ability to shift attention is relevant to the real-world deployment of attention based on cues and may have practical implications for complex cognitive tasks, such as driving. Lastly, Jefferies et al. (2008) played music before participants performed the attention task; in contrast, participants in the current study performed the task while listening to music, which we believe is a more naturalistic representation of how we experience music.
Based on previous literature, performance (both reaction time and accuracy) was predicted to be best when participants listened to low arousal negative music (Jefferies et al., 2008) and worst when participants listened to high arousal negative music (Fernandes et al., 2011; Jefferies et al., 2008). The effects of positive mood regardless of arousal are less certain, but we predicted intermediate performance for low arousal positive and high arousal positive music, with low arousal positive music potentially affecting reaction times but not accuracy (Fernandes et al., 2011). The effects of low arousal positive mood on the attentional field are of particular interest due to inconsistency in the literature (Fernandes et al., 2011) and their potential bearing on previous theories of attention.
Method
Participants
Fifty students (34 females) between the ages of 18 and 30 years participated in the study.
Materials
Stimuli were presented using E-Prime 2.0 software (Psychology Software Tools, 2002) on a 14.0” DELL Vostro 3400 laptop and through Sennheiser HD 280 headphones.
Task and Procedure
Music Rating and Selection
During the first part of the study, participants listened to and rated 50 instrumental music excerpts (for a complete list of stimuli, see Appendix A), taken from a database validated in a prior study (Nguyen & Grahn, 2017). For the original database, experimenters selected music without lyrics from an assortment of musical genres (i.e., jazz, rock, metal, classical, and blues) (Nguyen & Grahn, 2017). Furthermore, they selected music that would most likely be unfamiliar to the participants, had high inter-rater agreement for both mood and arousal, and had mood and arousal levels that remained fairly stable over the duration of the clip (Nguyen & Grahn, 2017). Although the clips were 90 seconds each, participants heard only the first 10 seconds prior to making their rating. The excerpts were presented through headphones at a comfortable listening volume. All clips were normalized using version 1.3.4 of the Audacity® recording and editing software (Audacity Team, 2007; http://audacity.sourceforge.net) to ensure that they were similar in loudness.
After each clip, participants saw one of two rating scales on the screen: (1) a mood scale or (2) an arousal scale. The mood scale ranged from -3.00 (very negative) to +3.00 (very positive), with 0 designated as neutral mood. The arousal scale ranged from 1.00 (very low arousal) to 7.00 (very high arousal) with 4.00 being designated as moderate arousal. Participants were told to rate each musical excerpt according to what they thought the music expressed or conveyed and not what they personally felt when listening to the music. As a reference, participants were told that mood referred to the positive or negative nature of the music. For example, a positive piece could be associated with words such as “happy” or “calming”, and a negative piece with words such as “sad” or “angry”. Arousal referred to the energy level of the music, with high arousal pieces being associated with words such as “distressing” or “thrilling”, and low arousal pieces with words such as “relaxing” or “dull.”
Both the mood and arousal scales were displayed horizontally, left to right, in the middle of the laptop screen. Participants were instructed to rate each excerpt based on what the music expressed or conveyed. Furthermore, to ensure that the ratings were not confounded, the order of the ratings was counterbalanced across participants: they either rated the mood of all the excerpts, then the arousal, or vice versa. Therefore, participants heard each of the 50 excerpts twice: once to rate for mood and once to rate for arousal.
Prior to the analysis, responses from the arousal scale were converted to the -3.00 to +3.00 scale. Each participant’s individual ratings were used to assign music for that participant to the four music conditions: (1) high arousal positive (HAP), (2) high arousal negative (HAN), (3) low arousal positive (LAP), and (4) low arousal negative (LAN). The highest and lowest rated excerpts were taken for use in the attention task. Initially, a criterion of greater than +2.00 or less than -2.00 on the mood and arousal scales was used. If the initial criterion did not yield three stimuli for each of the four conditions, it was lowered or raised in increments of 0.25 points until each participant had a set of 12 music clips.
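The rating conversion and clip-selection procedure described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual code (the experiment ran in E-Prime); the function names and data layout are hypothetical, and for simplicity the sketch only lowers the criterion, whereas the original procedure could adjust it in either direction.

```python
def to_bipolar(arousal_rating):
    """Convert a 1.00-7.00 arousal rating to the -3.00..+3.00 scale."""
    return arousal_rating - 4.0

def select_clips(ratings, n_per_condition=3, start_criterion=2.0, step=0.25):
    """Assign one participant's clips to the four music conditions.

    `ratings` maps clip_id -> (mood, arousal), both on the -3..+3 scale.
    The |rating| >= 2.00 criterion is relaxed in 0.25 steps until every
    condition (HAP, HAN, LAP, LAN) has at least `n_per_condition` clips.
    """
    conditions = {
        "HAP": lambda m, a: m > 0 and a > 0,  # high arousal, positive mood
        "HAN": lambda m, a: m < 0 and a > 0,  # high arousal, negative mood
        "LAP": lambda m, a: m > 0 and a < 0,  # low arousal, positive mood
        "LAN": lambda m, a: m < 0 and a < 0,  # low arousal, negative mood
    }
    criterion = start_criterion
    while criterion > 0:
        selected = {
            name: [clip for clip, (m, a) in ratings.items()
                   if rule(m, a) and abs(m) >= criterion and abs(a) >= criterion]
            for name, rule in conditions.items()
        }
        if all(len(clips) >= n_per_condition for clips in selected.values()):
            # If more clips qualify than needed, keep the most extreme ones.
            return {
                name: sorted(clips,
                             key=lambda c: abs(ratings[c][0]) + abs(ratings[c][1]),
                             reverse=True)[:n_per_condition]
                for name, clips in selected.items()
            }
        criterion -= step
    raise ValueError("could not fill all four conditions")
```

In this sketch, a participant whose ratings never reach ±2.00 in one quadrant still receives 12 clips, because the threshold is relaxed until each quadrant is filled.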
Attention Task
Following the music rating and selection process, participants completed a simple visual attention task. The task consisted of 15 blocks with 20 trials in each block and participants were able to rest between blocks, if necessary. Each trial comprised four stages: fixation, cue, target, and response. During the fixation stage, participants focused on a black cross, in the middle of a white screen, situated between two empty boxes outlined in black, for 1500 milliseconds (ms). In the cue stage, a black arrow pointing either to the right or to the left box appeared above the black cross. To reduce expectancy effects, the duration of the cue stage varied, in 100 ms steps, between 1000 and 4000 ms. Next, at the target stage, the black arrow disappeared and a target (a black star) appeared in either the right or left box for 1500 ms. After the target stimulus appeared, participants indicated as quickly as possible whether the trial was a congruent trial or ‘match’ (the target appeared in the box indicated by the direction of the cueing arrow) or an incongruent trial or ‘mismatch’ (the target appeared in the box that was not indicated by the cueing arrow) by pressing the appropriately marked key on the keyboard (Figure 1).

Figure 1. One (congruent) trial of the Posner cueing task, adapted from Posner (1980). Each trial consists of four stages: fixation, cue, target, and response.
Half of the trials in each block were congruent and half of the trials were incongruent. The trial order in each block was randomized. Finally, 12 of the 15 blocks were accompanied by music that was selected in the music rating and selection task (one musical excerpt per block), whereas three of the 15 blocks were completed in silence. There were four music conditions (HAP, LAP, HAN, and LAN), so there were three excerpts for each condition for a total of 12 excerpts. The music began at the start of each block and the excerpt continued to play (on loop if necessary) until the end of the experimental block. The music clips never looped more than once and were repeated, on average, 1.33 times (with the range between 1.02 and 1.69 times). Therefore, any effects due to the repetition of the music should be negligible. The order of the conditions was randomized. The experiment consisted of one session of approximately one hour, with ∼15 minutes for the music rating and selection task, ∼35 minutes for the attentional task and ∼10 minutes for set-up and debriefing.
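The block and trial structure above can be summarized in a short sketch. This is a hypothetical reconstruction for clarity (the actual experiment was implemented in E-Prime); the function name and dictionary fields are assumptions, but the counts and timings follow the description: 3 silent blocks plus 12 music blocks in random order, 20 trials per block with half congruent, and a cue duration jittered between 1000 and 4000 ms in 100 ms steps.

```python
import random

def build_session(selected_clips, n_silence=3, trials_per_block=20, seed=None):
    """Build the randomized block/trial structure for one participant.

    `selected_clips` maps condition name (HAP, HAN, LAP, LAN) to the
    three clips chosen for that participant during rating and selection.
    """
    rng = random.Random(seed)
    # 3 silent blocks + 12 music blocks (one clip per block).
    blocks = [("silence", None)] * n_silence
    blocks += [(cond, clip)
               for cond, clips in selected_clips.items() for clip in clips]
    rng.shuffle(blocks)  # condition order is randomized

    session = []
    for condition, clip in blocks:
        # Half congruent, half incongruent, in random order within the block.
        congruencies = (["congruent"] * (trials_per_block // 2) +
                        ["incongruent"] * (trials_per_block // 2))
        rng.shuffle(congruencies)
        trials = [{"congruency": c,
                   "fixation_ms": 1500,
                   # Jittered cue duration reduces expectancy effects.
                   "cue_duration_ms": rng.randrange(1000, 4001, 100),
                   "target_ms": 1500}
                  for c in congruencies]
        session.append({"condition": condition, "clip": clip, "trials": trials})
    return session
```

Because the cue is non-predictive (the target appears at the cued location on only half of the trials), participants cannot use it strategically, which is central to the task's logic.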
Statistical Analysis
Response accuracy and reaction times were analyzed for each participant. Response accuracy was calculated as the percentage of correct responses. Any participants who performed at or below chance on more than half of the experimental blocks were excluded from the analysis. Reaction time was defined as the time between the target’s appearance and the participant’s response. Any responses under 100 ms or over 1000 ms were excluded from the final analysis because reaction times under 100 ms were assumed to be anticipatory responses (Thorpe et al., 1996) and reaction times over 1000 ms were not indicative of spontaneous responses (Hayward & Ristic, 2013). The analysis of reaction times included only trials with correct responses. Reaction time was calculated by taking the average median reaction time for a given condition for a given participant. Response accuracy and reaction times were each analyzed with a 2 (mood: positive, negative) × 2 (arousal: high, low) × 2 (congruency: congruent, incongruent) repeated-measures ANOVA. Performance in each music condition was also compared to the silence condition using paired samples t-tests.
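The trial-level exclusion and summary rules can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' analysis code: the function name and input format are hypothetical, and the sketch computes a single median per condition rather than averaging per-block medians as the text describes.

```python
from statistics import median

def clean_and_summarize(trials):
    """Summarize one participant's accuracy and reaction times.

    `trials` is a list of dicts with keys 'condition', 'congruency',
    'rt_ms', and 'correct'. Accuracy is the percentage of correct
    responses. For reaction times, incorrect trials and RTs under
    100 ms (anticipatory) or over 1000 ms (not spontaneous) are
    dropped, and the per-condition score is the median surviving RT.
    """
    acc, rts = {}, {}
    for t in trials:
        key = (t["condition"], t["congruency"])
        acc.setdefault(key, []).append(t["correct"])
        if t["correct"] and 100 <= t["rt_ms"] <= 1000:
            rts.setdefault(key, []).append(t["rt_ms"])
    accuracy = {k: 100.0 * sum(v) / len(v) for k, v in acc.items()}
    median_rt = {k: median(v) for k, v in rts.items()}
    return accuracy, median_rt
```

The resulting per-participant, per-condition values are what would then enter the 2 × 2 × 2 ANOVA and the paired comparisons against silence.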
Results
Response Accuracy
Data from three participants were excluded from the analysis: two data sets were excluded because of technical issues and one data set was excluded because the overall response accuracy was less than chance for more than half the experimental blocks. The 2 × 2 × 2 ANOVA for response accuracy yielded a significant main effect of congruency: participants were more accurate on congruent trials than on incongruent trials. No other main effects or interactions reached significance.
Paired samples t-tests revealed that response accuracy did not differ significantly between any music condition and silence.
Reaction Times
The 2 × 2 × 2 ANOVA for reaction times also yielded a significant main effect of congruency: participants responded faster on congruent trials than on incongruent trials. Neither mood nor arousal produced a significant main effect; however, the interaction between mood and arousal was significant.

Figure 2. Mood and arousal interaction for reaction times. Reaction times were significantly faster for HAP music than for HAN music, but LAP and LAN music did not differ, resulting in a significant interaction between mood and arousal. Error bars indicate the standard error of the mean (Cousineau, 2005).
The paired samples t-tests showed that reaction times were significantly faster in the HAP music condition than in silence; no other music condition differed significantly from silence.
Discussion
The current study investigated how background music, varying in mood and arousal, played simultaneously during an adaptation of the Posner cueing task (Posner, 1980) affected task performance and visual attention. This is in contrast to previous studies that used music to alter mood and arousal prior to the performance of a cognitive task (Greene et al., 2010; Husain et al., 2002; Thompson et al., 2001). We predicted that background music should affect visual attention, by modulating mood and arousal. In particular, best performance should occur in the presence of low arousal negative music and worst performance in the presence of high arousal negative music (Fernandes et al., 2011; Jefferies et al., 2008). Meanwhile, positive mood, regardless of arousal, should produce intermediate levels of performance (Jefferies et al., 2008).
First, accuracy did not significantly differ between any music condition and silence. In contrast, reaction times were significantly faster in the high arousal positive music condition compared to silence but did not differ between any other music condition and silence. For congruency, as expected, there was a significant main effect: participants were faster and more accurate on congruent trials than incongruent ones. Neither the mood nor arousal manipulations produced any significant main effects, however, the interaction between mood and arousal for reaction times was significant, driven by the fact that high arousal positive music reaction times were fastest, while high arousal negative music reaction times were slowest. Thus, high arousal positive music significantly decreased reaction times relative to high arousal negative music and also relative to silence.
Reaction time and accuracy may reflect different aspects of voluntary and involuntary attention (Prinzmetal et al., 2005). Voluntary attention requires strategic resource allocation, which enhances both the perceptual representation of the stimulus (hence enhancing accuracy) and the allocation of attention to the cued location (hence producing faster reaction times) (Prinzmetal et al., 2005). Voluntary attention therefore influences both response accuracy and reaction times. In contrast, involuntary attention is thought to reflect reflexive orienting and to affect the decision to respond to the cued location in space, without affecting the perceptual representation of the stimulus (Prinzmetal et al., 2005). Thus, involuntary attention affects reaction times, but not response accuracy. The current version of the Posner cueing task tapped involuntary attention by making the cue uninformative and non-predictive (50% of trials were congruent and 50% were incongruent), in contrast to the classic Posner paradigm, which uses an informative cue (80% predictive of the target location). This adaptation results in involuntary, reflexive responses to the cue: participants still respond to the cue by shifting attention, but are aware that the cue is uninformative, so they do not need to allocate voluntary attention (Eimer, 1997; Hommel et al., 2001; Pratt & Hommel, 2003; Ristic et al., 2002; Tipples, 2002). Given that mood and arousal did not significantly affect response accuracy, we appear not to be tapping voluntary attention. In contrast, reaction times were affected by an interaction of mood and arousal, suggesting that involuntary attention may be altered by mood and arousal states.
However, to fully support this conclusion, a three-way interaction of mood, arousal, and congruency would need to be observed: larger congruency/incongruency differences imply greater attention is being deployed, and in the current study, the congruency/incongruency differences did not significantly differ across mood and arousal levels. Therefore, the effects of mood and arousal on performance likely arise from non-attentional factors, such as motor circuits or motor readiness that may be implicated in reaction time responses.
Consistent with past literature, a congruency effect was found on both accuracy and reaction times (Posner, 1980). Specifically, in support of previous findings, on congruent trials there was a significant increase in perceptual accuracy (Carrasco, 2011) and a decrease in reaction times (Coull & Nobre, 1998). Covert attention, as modulated by cue direction, to a particular location in space facilitates processing in this area and thus decreases reaction time (Posner et al., 1978). Furthermore, the presence of covert cues also leads to more intense processing of stimuli leading to increased detection rates and by extension perceptual accuracy (Prinzmetal et al., 2005). The results are therefore consistent with the literature in regards to congruency effects on performance. In terms of music conditions, there were no differences in accuracy related to mood or arousal, suggesting that the effects of music, when present, were restricted to enhanced processing of a given spatial location rather than stimulus-processing itself.
Past literature has used different types of attentional paradigms to assess the effect of music on attention. When participants were asked to identify two targets in a rapid sequence (an attentional blink task) in the presence of background music and silence, they were better at identifying the second target during the music condition than during the silent condition (Olivers & Nieuwenhuis, 2006). In another attentional blink study that manipulated mood and arousal, as in the current study, listening to low arousal negative music produced the best performance on detecting the second target and high arousal negative music produced the worst performance (Jefferies et al., 2008). Meanwhile, both types of positive music produced intermediate performance (Jefferies et al., 2008). In the current study, we predicted that mood and arousal would interact to affect visual attention, and this prediction was supported: there was a significant interaction between mood and arousal on reaction times. Reaction times were shortest in the presence of HAP and LAN music and longest in the presence of LAP and HAN music. Furthermore, the decrease in reaction time in the presence of HAP music showed a benefit of music over silence. Thus, the fact that high arousal music produced both the fastest (positive mood) and slowest (negative mood) reaction times indicates that the effect of arousal on performance depends on mood.
In general, however, it appears that high arousal negative music may hinder attentional processing, regardless of task. In the attentional blink task, second-target accuracy was lowest for participants in a high arousal negative mood (Jefferies et al., 2008). Similarly, in the current cueing task, reaction times were slowest when participants were listening to high arousal negative music. Therefore, the combination of high arousal and negative mood appears to have a pronounced effect on the control of attention.
In previous studies, attentional performance was not compared to a silent condition (Jefferies et al., 2008). Therefore, a silent condition was included here. Overall, background music only improved reaction times when the music was high arousal and positive, otherwise background music had no effect compared to silence. Moreover, background music had no effect on accuracy compared to silence. Listening to music may be cognitively demanding (Nguyen & Grahn, 2017), and these demands may negate potentially positive benefits associated with particular mood and arousal levels. Another issue to consider is that transfer effects from music blocks to subsequent silent blocks may have occurred. The blocks were randomized in order, so that transfer effects of music with particular mood and arousal levels should have averaged out over the session, but any general effects of music (rather than specific effects of different combinations of mood and arousal levels) could have persisted.
An important factor to keep in mind when attempting to induce a particular mood or arousal state is the time required to effectively do so. In general, most studies suggest that one minute of music (Robinson et al., 2012) is enough to induce a given mood or arousal state (for review see Västfjäll, 2002). Our timing of the mood and arousal inductions was within these constraints, but it is possible that even ∼2 minutes may not have been enough (Eich & Metcalfe, 1989), or that the music did not provide a strong enough effect (Västfjäll, 2002). Furthermore, the mood or arousal state itself may vary over time (Västfjäll, 2002). Therefore, the music may not have significantly affected task-related performance because the strength of the induction was too weak, or participants required more exposure time. However, we believe that this is unlikely because music was played throughout the task and not simply prior to execution. In addition, participants rated the musical pieces prior to the task to measure induced mood and arousal and only the most effective pieces were selected for use in the experimental portion.
Finally, with regard to the generalizability of these results, we focused in particular on the findings of Jefferies et al. (2008), the only other study of which we are aware that explored the interaction of mood and arousal on attention. The current study used participant ratings to select music, accounting in part for the subjectivity of musical experience. The use of individually tailored music would be expected to strengthen the mood and arousal manipulation; however, we did not observe stronger effects than those reported in previous studies. Another important difference was that Jefferies et al. (2008) had participants ruminate on thoughts consistent with the mood and arousal of the music they were listening to, potentially strengthening the effects of the manipulations.
In conclusion, we explored the interaction between musical mood and arousal on cueing of visual attention and found that mood modulated the effect of high arousal music, with high arousal positive music yielding faster reaction times than high arousal negative music. Although we did not find a reliable effect of music on visual attention, high arousal positive music improved overall reaction times compared to silence. These findings suggest that listening to high arousal positive music may benefit reflexive reaction times, with potential implications for everyday tasks that require rapid reactions, such as driving.
Supplemental Material
Supplemental Material, Appendix for Keep Calm and Pump Up the Jams: How Musical Mood and Arousal Affect Visual Attention by Angela Marti-Marca, Tram Nguyen and Jessica A. Grahn in Music & Science
Footnotes
Contributorship
AMM and TN conducted the literature review. AMM, TN, and JG designed the study. AMM and TN performed the data analysis. AMM prepared the documents for ethics approval (revised by TN), recruited participants, and collected data. AMM wrote the first draft of the manuscript. AMM, TN, and JG reviewed and edited the manuscript and approved the final version.
Declaration of conflicting interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Natural Sciences and Engineering Research Council of Canada.
Action Editor
Ian Cross, Faculty of Music, University of Cambridge.
Peer Review
Diana Omigie, Max-Planck-Institut für empirische Ästhetik.
Two anonymous reviewers.
Appendix A: Music Stimuli
A complete list of the musical excerpts used in the rating portion of this study, with the mean (M) and standard deviation (SD) of their arousal and mood ratings. Condition abbreviations: HAP = high arousal positive; HAN = high arousal negative; LAP = low arousal positive; LAN = low arousal negative.
| Music | Genre | Condition | Arousal rating (M) | Arousal rating (SD) | Mood rating (M) | Mood rating (SD) |
|---|---|---|---|---|---|---|
| 1. TemptingTime.wav | metal | HAN | 1.66667 | 1.356 | −1.53333 | 2.093 |
| 2. Bittersweet.wav | classical | LAN | −0.93333 | 1.407 | −1.86667 | 1.831 |
| 3. Burn1.wav | metal | HAN | 1.93333 | 1.486 | −1.26667 | 1.534 |
| 4. Burn2.wav | metal | LAN | 0.86667 | 1.521 | −1.20000 | 1.685 |
| 5. WorldsCollide.wav | metal | HAN | 1.73333 | 1.988 | −0.66667 | 1.438 |
| 6. Escape.wav | electronica | HAP | 2.06667 | 1.163 | 1.73333 | 1.335 |
| 7. KillerJoe.wav | jazz | LAP | −1.13333 | 0.990 | 0.86667 | 1.407 |
| 8. NotAPressure.wav | jazz | HAP | 1.66667 | 1.242 | 1.60000 | 1.047 |
| 9. Oleazinho.wav | pop | LAP | −0.33333 | 1.907 | 0.93333 | 1.397 |
| 10. PromQueen.wav | alternative/rock | LAP | −1.73333 | 1.792 | −1.06667 | 1.033 |
| 11. CliffsOfDover.wav | alternative/rock | HAP | 1.46667 | 1.373 | 2.20000 | 1.356 |
| 12. ComeHomeTo.wav | jazz | LAP | −1.00000 | 1.580 | 1.06667 | 1.414 |
| 13. Conclusion.wav | classical | LAN | −1.13333 | 1.922 | −1.53333 | 1.552 |
| 14. Ghosts.wav | electronica | HAP | 2.46667 | 1.125 | 2.13333 | 0.990 |
| 15. Consciousness.wav | alternative/rock | LAN | 1.53333 | 1.175 | −0.66667 | 1.506 |
| 16. HelloMyLovely.wav | jazz | LAP | −1.26667 | 1.521 | 0.80000 | 1.486 |
| 17. TheKingdomWithin.wav | jazz | HAP | −0.46667 | 1.163 | 0.93333 | 0.990 |
| 18. AllegrettoVivace.wav | classical | HAP | −0.33333 | 1.407 | 1.13333 | 1.839 |
| 19. MegaSnake.wav | alternative/rock | HAN | 1.53333 | 1.897 | −0.80000 | 1.598 |
| 20. BlessedSpirits.wav | classical | LAP | −1.46667 | 1.633 | 0.66667 | 1.767 |
| 21. Akiko.wav | electronica | LAP | −0.53333 | 1.033 | 0.73333 | 1.187 |
| 22. WhatAmI2.wav | alternative/rock | LAN | −2.06667 | 1.280 | −1.06667 | 1.223 |
| 23. SatchBoogie.wav | alternative/rock | HAP | 2.26667 | 1.280 | 1.73333 | 1.335 |
| 24. SurfingAlien.wav | alternative/rock | HAN | 2.06667 | 1.685 | 1.53333 | 1.534 |
| 25. Kellot.wav | metal | HAN | 1.33333 | 1.100 | −1.26667 | 1.877 |
| 26. ResurrectionII2.wav | classical | HAN | −0.33333 | 1.183 | 1.66667 | 1.589 |
| 27. OrbitalElements.wav | metal | HAN | 1.80000 | 1.642 | −0.86667 | 1.699 |
| 28. QuasiAdagio.wav | classical | LAN | −1.40000 | 1.699 | −1.80000 | 1.882 |
| 29. ClubbedToDeath1.wav | piano | HAP | −1.20000 | 1.407 | −1.13333 | 1.474 |
| 30. SceneAuxChamps.wav | classical | LAP | −0.73333 | 1.543 | 0.33333 | 1.387 |
| 31. SFX.wav | electronica | HAN | 1.53333 | 2.042 | −1.20000 | 1.506 |
| 32. SymphonyG.wav | classical | LAP | −0.46667 | 1.242 | 1.60000 | 1.552 |
| 33. GiveYouAway.wav | alternative/rock | LAP | −0.53333 | 0.845 | 1.00000 | 1.506 |
| 34. Appalachian.wav | alternative/rock | HAP | 1.53333 | 1.060 | 2.06667 | 0.704 |
| 35. CAFO.wav | metal | HAN | 2.60000 | 0.632 | −0.73333 | 1.751 |
| 36. Carnival.wav | classical | HAP | 1.86667 | 1.187 | 1.73333 | 1.438 |
| 37. ChopinPreludeE.wav | classical | LAN | −1.73333 | 0.704 | −1.93333 | 1.335 |
| 38. DernierJour.wav | piano | LAN | −1.73333 | 0.961 | −0.93333 | 1.100 |
| 39. Enchanted.wav | pop | HAP | 1.33333 | 1.175 | 2.40000 | 0.828 |
| 40. FlikMachine.wav | pop | HAP | 1.86667 | 0.915 | 2.33333 | 0.617 |
| 41. HappySong.wav | pop | HAP | 1.13333 | 0.990 | 2.46667 | 0.640 |
| 42. ImWishing.wav | piano | LAP | −1.73333 | 0.961 | −0.86667 | 1.356 |
| 43. LionelRichie1.wav | metal | HAN | 1.26667 | 1.163 | −0.86667 | 1.598 |
| 44. Montenegro.wav | electronica | HAP | 2.33333 | 0.617 | 1.66667 | 1.047 |
| 45. PanicAttack.wav | metal | HAN | 2.86667 | 1.496 | −0.66667 | 0.352 |
| 46. PoliceFire.wav | classical | LAN | −2.46667 | 0.900 | −1.33333 | 0.743 |
| 47. SadPiano.wav | piano | LAN | −2.20000 | 1.767 | −1.13333 | 0.862 |
| 48. SensesComeAlive.wav | piano | LAP | −1.73333 | 1.291 | 0.33333 | 0.799 |
| 49. TarzanFight.wav | classical | HAN | 1.33333 | 1.335 | −1.73333 | 0.900 |
| 50. TigerDragon.wav | world | LAN | −2.33333 | 1.302 | −1.13333 | 0.488 |
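The Condition column above largely follows the signs of the mean ratings: positive versus negative mean arousal separates high from low arousal, and positive versus negative mean mood separates positive from negative mood. The sketch below (not the authors' code; clip values are taken from the table) illustrates this sign-based rule. Note that it reproduces most, though not all, of the listed labels, since a few excerpts (e.g., ClubbedToDeath1.wav) carry labels that do not match the sign of their mean ratings.

```python
# Hypothetical sketch: classify an excerpt into a mood/arousal quadrant
# from the sign of its mean arousal and mood ratings.
def classify(arousal_mean, mood_mean):
    """Return a quadrant label: HA/LA (arousal) crossed with P/N (mood)."""
    arousal = "HA" if arousal_mean > 0 else "LA"
    mood = "P" if mood_mean > 0 else "N"
    return arousal + mood

# Mean (arousal, mood) ratings for two excerpts from the appendix table.
clips = {
    "Escape.wav": (2.06667, 1.73333),        # listed as HAP
    "Bittersweet.wav": (-0.93333, -1.86667),  # listed as LAN
}

for name, (a, m) in clips.items():
    print(name, classify(a, m))
```

This is only an illustration of how the quadrant labels relate to the ratings; the actual stimulus selection used the most extreme-rated excerpts, not every excerpt with the matching sign.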
