Abstract
When ‘The dopamine theory of attention deficit hyperactivity disorder (ADHD)’ was published in the Australian and New Zealand Journal of Psychiatry [1], the predominant theory of ADHD was noradrenergic, possibly because of the limited efficacy of l-dopa and dopamine agonists for symptoms of ADHD. Despite this, the animal data indicated a significant role for dopaminergic systems as reviewed in 1991 [1].
The intervening years have seen a number of advances in genetic, neuropsychological, pharmacological and brain imaging investigations of ADHD. The present review aims to outline these advances and examine questions raised by them, such as the involvement of dopaminergic and other neurotransmitter systems in ADHD. Also, ‘hyper’ versus ‘hypo’ dopaminergic theories, and the relationship between anterior and posterior attention systems are discussed. Recent reviews of the neurobiology of ADHD [2, 3] concluded that while there is no single pathophysiological profile of ADHD, many data implicate dysfunction in fronto-subcortical pathways, which control attention and motor behaviour. The effectiveness of stimulants, along with animal models, point to catecholamine dysregulation as at least one source of ADHD brain dysfunction [3].
Barkley [4] proposed a unitary theory of ADHD, which involves a core central deficit in inhibition. This deficit is secondarily linked to five neuropsychological functions, namely prolongation/working memory, internalization of speech, self-regulation of affect, reconstitution and motor control/fluency. These functions are thought to be broadly representative of the more general concept of executive function (EF). Barkley [4] has suggested that the inattentive (ADHD-I) subtype may represent a separate distinct disorder, with more problems with selective attention, sluggishness and memory retrieval, as well as problems with maths, language and reading.
A related phenomenon is that of inhibitory control, described by Schachar and Logan [5]. They used a ‘stop-signal’ paradigm as a laboratory analogue of a situation requiring inhibitory control. Subjects engaged in a primary task (e.g. forced-choice letter discrimination) are presented with an occasional stop-signal stimulus (e.g. a tone) instructing them to inhibit their response to the primary task stimulus (i.e. the ‘stop’ task). Variation in the delay between the onset of the stop signal and the subject's expected time of response to the ‘go’ task affects the probability of inhibition on that task. For children diagnosed with attention deficit disorder with hyperactivity (ADHD), the inhibiting mechanism was triggered less frequently, was substantially more variable and was slower. The authors suggested that a specific deficit in inhibitory control might underlie the impulsivity of pervasively hyperactive children.
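The ‘race’ between go and stop processes that this paradigm assumes can be illustrated with a small simulation. This is a generic sketch of the independent horse-race model, not Schachar and Logan's exact procedure; the timing parameters are invented for illustration only.

```python
import random

def p_inhibit(ssd, go_mu=450.0, go_sd=100.0, ssrt_mu=250.0, ssrt_sd=50.0,
              trials=100_000):
    """Estimate the probability of successful inhibition at a given
    stop-signal delay (SSD, ms). Inhibition succeeds when the stop
    process, launched SSD ms after the go stimulus, finishes before
    the go response would have been emitted."""
    random.seed(0)  # deterministic for illustration
    wins = 0
    for _ in range(trials):
        go_finish = random.gauss(go_mu, go_sd)            # go-process finishing time
        stop_finish = ssd + random.gauss(ssrt_mu, ssrt_sd)  # stop-process finishing time
        if stop_finish < go_finish:
            wins += 1
    return wins / trials

# Longer delays leave the stop process less time to win the race,
# so the probability of inhibition falls as SSD grows.
print(p_inhibit(100), p_inhibit(300))
```

Slower or more variable stop processes (as reported for ADHD children) shift this curve downward, which is how the paradigm quantifies a deficit in inhibitory control.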
Casey et al. [6] showed significant differences in performance by children with ADHD and controls on three response inhibition tasks. Task performance was correlated with magnetic resonance imaging measures of right fronto-striatal circuitry. They suggested a role for the right prefrontal cortex in suppressing responses to salient but otherwise irrelevant stimuli, while the basal ganglia appeared to be involved in execution of responses.
Heilman et al. [7] proposed that children with ADHD have a right-sided frontal-striatal dysfunction due to impairment of the mesocortical dopamine system. They proposed that impulsiveness and hyperactivity might reflect a pathologically low threshold for gating behaviour, defined largely by exteroceptive stimuli, and that methylphenidate might redress the imbalance in favour of behaviour defined largely by interoceptive stimuli relevant to the task at hand.
Palumbo et al. [8] hypothesized a range of genetic and environmental conditions that interfere with normal basal ganglia development, giving rise to a developmental basal ganglia syndrome resembling Tourette's syndrome, including ADHD, and emphasizing the importance of the basal ganglia in control and inhibitory phenomena.
Thus prefrontal/striatal systems would appear to have important inhibitory functions, while the basal ganglia may also be involved in response execution.
Dopamine theories (‘hyper’ vs ‘hypo’)
For the clinician, it has sometimes seemed paradoxical that while observations of hyperactive ADHD children show them to move rapidly, laboratory measures show their reaction times to be slower than those of controls [9–11]. While a number of areas of ADHD research might cast some light on this apparent contradiction, it has not been directly addressed. For example, if dopamine is associated with motor activity, why would a dopaminergic deficit result in hyperactivity, rather than hypoactivity?
Neuroimaging studies [6, 11, 12] have directed attention to morphological and physiological differences in right-sided, prefrontal-striatal systems, which are rich in dopaminergic innervation. Genetic studies have suggested associations of ADHD with both the dopamine transporter (DAT) gene [13, 14] and the dopamine DRD4 receptor 7-repeat allele [15–17].
Castellanos [18] and Swanson and Castellanos [19] proposed that presynaptic effects may predominate in D2-rich subcortical regions, where presynaptic receptors are abundant, producing decreased synaptic dopamine, while postsynaptic effects may predominate in D4-rich cortical regions, which lack presynaptic receptors, producing increased synaptic dopamine.
Castellanos [18] suggested a model whereby dopamine neurones originating in the ventral tegmental area (VTA) diffusely innervate the frontal cortex, forming the mesocortical dopamine system, which largely lacks inhibitory autoreceptors. He suggested that these dopaminergic terminals were ideally positioned to regulate cortical inputs, thus improving signal-to-noise ratio for biologically valuable signals. Conversely, symptoms of hyperactivity/impulsivity in children with ADHD were hypothesized to be associated with relative overactivity of the nigral-striatal circuit, which is tightly regulated by inhibitory autoreceptors, as well as by long-distance feedback from the cortex. These differences were thought to explain differential dose–response effects of stimulant medications, where low doses, acting at striatal level might produce therapeutic inhibition of dopaminergic neurotransmission, while non-therapeutic high doses, especially if delivered intravenously or intranasally, might overwhelm this inhibitory effect.
Gray et al. [20] have elaborated an important model, describing interconnections between the basal ganglia and limbic system. They postulate that the accumbens system operates in tandem with the caudate system to permit switching from one step to the next in a motor program. The activities of caudate, accumbens and septohippocampal systems are thought to be coordinated and kept in step by the prefrontal cortex, acting by way of its connections with: (i) the cortical components of the caudate system and the superior colliculus; (ii) the nucleus accumbens, dorsomedial thalamus and amygdala; and (iii) the entorhinal and cingulate cortex. The maintenance of excitatory activity in a subset of striatal, thalamic and cortical neurones is periodically interrupted by dopaminergic inputs to the striatum, with the timing of activity coordinated between a septohippocampal monitoring system and a basal ganglia motor programming system.
Gray et al. [20] attribute the positive symptoms of schizophrenia to a hyperdopaminergic state, which interferes with the above predominantly anterior systems. This raises the question of a possible dopamine-related deficit in the same systems in ADHD, possibly a hypodopaminergic state, which responds positively to stimulant administration. The Gray et al. [20] hypothesis that the accumbens and caudate motor systems are responsible for monitoring and timing of the sequential steps of motor programs implies a basic subcortical deficit in schizophrenia. Animal and human evidence on the abolition of latent inhibition, Kamin's blocking effect and the partial reinforcement extinction effect (all three are instances of ‘the influence of stored memories of regularities of previous input, on current perception’ [21]) by dexamphetamine is seen as modelling the positive symptoms of schizophrenia. While converse animal data are not available for ADHD, Levy and Hobbes [22, 23] have shown that haloperidol blocks the normalizing effect of methylphenidate in ADHD children tested on a continuous performance test, supporting a hypodopaminergic hypothesis.
According to Le Moal and Hervé [24], it is likely that low concentrations of extrasynaptic dopamine are involved in temporal integration of external cues and motor performance, including spatial short-term memory. They state that dopamine neurones have a homeostatic and regulatory role, in that they allow forebrain and cortical neurone systems to function normally, by activating the final common pathway of several integrative processes. Dopamine projections of the ventral and dorsal striatum may act as filtering and gating mechanisms for signals from the neocortex, which have to be synchronized and eventually translated into motor acts.
Seeman and Madras [25] have reviewed the mechanisms of antihyperactivity medications (methylphenidate and amphetamine). They have proposed that clinically relevant doses of stimulants might increase extracellular background levels of dopamine above that of action potential released dopamine, and thus reduce psychomotor activity.
This effect is consistent with a model developed by Grace [26], who found that when dopamine (DA) is released into the striatal synaptic cleft in response to action potentials, it is rapidly removed from the synapse by a highly efficient re-uptake system into the terminal. According to Grace, tonic DA is also present in the extrasynaptic fluid of the striatum, causing a steady-state partial activation of the very sensitive D2 DA receptors located on DA neurone terminals. The tonic DA level is thought to be mediated by stimulation of presynaptic heteroreceptors on DA terminals by corticostriatal glutamatergic projections. Tonic DA receptor stimulation is able to activate homeostatic changes by exerting a suppressive influence on subcortical DA systems. The rapid response of children with ADHD to stimulant medications could be explained by the release of dopamine altering tonic/phasic dopamine relationships to achieve more optimal regulatory levels.
Grace [27] has postulated that the hyperactivity and impulsivity of ADHD result from abnormally low tonic DA activity within the ventral striatum/nucleus accumbens, leading to abnormally high phasic DA responses. The model is supported by the proposed regulation of tonic DA by frontal cortical afferents to the nucleus accumbens. Also, the dorsolateral prefrontal cortex, which projects to the striatum, is the cortical area believed to be responsible for the proper function of working memory (Goldman-Rakic [28]). Thus a decrease in prefrontal cortical function in ADHD would be consistent, according to Grace, with the working memory deficits postulated in ADHD [29] (i.e. a ‘relative’ hypodopaminergic hypothesis). The recent important text edited by Solanto, Arnsten and Castellanos [30], which includes the above hypothesis, comprehensively reviews the basic and clinical neuroscience of stimulant drug actions, and implications for theories of ADHD.
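The inverse relationship at the heart of Grace's tonic/phasic hypothesis can be caricatured numerically: the lower the tonic extracellular DA level, the weaker the autoreceptor-mediated damping of spike-dependent release, and hence the larger the phasic response. The following toy model is purely illustrative; the saturating inhibition term and all constants are assumptions, not parameters from Grace's work.

```python
def phasic_response(tonic, burst=1.0, gain=2.0):
    """Toy rendering of the tonic/phasic idea: tonic extracellular DA
    stimulates autoreceptors that damp spike-dependent (phasic) release.
    Higher tonic level -> stronger autoreceptor inhibition -> smaller
    phasic response. Units and constants are arbitrary."""
    autoreceptor_inhibition = tonic / (tonic + 0.5)  # saturating, in [0, 1)
    return gain * burst * (1.0 - autoreceptor_inhibition)

low_tonic = phasic_response(0.1)   # ADHD-like state: exaggerated phasic DA
normal = phasic_response(0.5)
stimulant = phasic_response(1.0)   # raised tonic DA damps phasic release
print(low_tonic, normal, stimulant)
```

On this caricature, a stimulant that raises tonic DA moves the system from the exaggerated-phasic regime towards the damped one, consistent with the therapeutic account given above.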
Vaidya and Gabrieli [31] have described a functional magnetic resonance imaging (fMRI) study of ADHD subjects and controls, during two Go/No Go tasks. They found atypical frontal-striatal function and different effects of methylphenidate in ADHD versus healthy subjects. They postulated atypical dopaminergic modulation of the striatum, but also reported differential effects depending on task difficulty.
Volkow et al. [32] used positron emission tomography (PET) and [11C]-methylphenidate to define the distribution of methylphenidate in the human brain. The highest concentration was observed in the striatum, the brain area with the highest concentration of dopamine transporter.
Thus dopaminergic transmitters appear to be important in the anterior fronto-striatal systems described above, where a ‘relative’ hypodopaminergic deficit may affect inhibition and working memory. The Levy [1] and Castellanos [18] model based on differential dose effects of stimulants is consistent with the work of Grace [26, 27] on differential tonic/phasic effects of dopamine at striatal/accumbens synaptic levels.
Noradrenergic theories
Oades [33] reviewed the role of noradrenaline in tuning (biasing of signal-to-noise ratio) and dopamine in switching (between inputs and outputs to specific brain regions) between signals in the CNS. He suggested that low doses of amphetamine appear to promote the probability of switching (alternate influences) in rat behaviour, but if DA activity is raised for a longer period, the system may be biased to a normally suppressed input, in the presence of particular stimuli, giving rise to stereotypy.
The work of Arnsten and colleagues [34, 35] has clarified the role of α2-noradrenergic mechanisms in ADHD. They have hypothesized reciprocal connections between the prefrontal cortex (PFC) and locus coeruleus (LC). Previous studies had focused on the significance of DA input to the PFC, but research in monkeys indicated beneficial effects of α2 NE agonists on delayed-response performance in aged monkeys. As distinct from α2B and α2C agonists, which have sedative and hypotensive actions, guanfacine, a more specific α2A agonist, was shown to enhance cognitive performance in monkeys, possibly by inhibiting the response of auditory cortex neurones to irrelevant tones. Their model suggests that α2 stimulation may benefit cognitive function by actions on reciprocal circuits between the PFC and LC. They postulate direct effects on LC neurones, decreasing spontaneous firing relative to stimulus-evoked activity, and indirect effects by improving PFC regulation of the LC.
Arnsten et al. [35] have described research in rodents and primates which indicates that noradrenaline has an important influence on spatial working memory and attentional functions of the PFC. They have shown beneficial effects of guanfacine on cognitive tasks, such as delayed response, delayed alternation and delayed match to sample with repeated stimuli, and believe these effects are mediated via the postsynaptic α2A receptor. Furthermore, the beneficial effect of α2 agonists on cognitive performance is reversed by the α2A/B/C antagonist yohimbine, but not by the α1, α2B/C antagonist prazosin. When a selective α1-adrenergic agonist, phenylephrine, was infused into the PFC of rats performing a spatial working memory task, performance was impaired at longer delays. Their data were thought to suggest that α1 and α2 receptors might have opposing roles in the prefrontal cortex, as they do in the thalamus in regulating arousal. Arnsten et al. [36] proposed that postsynaptic α2A receptor stimulation inhibits irrelevant and distracting sensory processing through effects on pyramidal cells that project to sensory association cortices. This suggests that guanfacine, an α2A agonist, may be of benefit to ADHD children, and may indicate an adrenergic role in working memory.
Other neurotransmitters
Gainetdinov et al. [37] reported that serotonergic agents potentiated the calming effect of psychostimulants in a knockout mouse model of hyperactivity, in which the dopamine transporter (DAT) gene was disrupted.
Anterior and posterior attention systems
According to Pardo et al. [38], the vigilance system in humans encompasses both right prefrontal and right superior parietal cortices, and can operate independently of midline (anterior cingulate) attentional systems. Pardo et al. [38] described the localization of a human system for sustained attention by PET, using measurements of brain blood flow in healthy subjects to identify changes in regional brain activity during simple visual and somatosensory tasks of sustained attention or vigilance. They found localized increases in blood flow in the right prefrontal and right superior parietal cortex, regardless of the modality or laterality of sensory input. The neural system activated during vigilance tasks is related to the ‘on-line’ analysis of the stimuli for relevant target properties. The authors point out that work on the monkey [39, 40] has identified somatosensory and visual cortical fields specialized for attentive analyses of sensory features, but whether monkeys and humans have homologous neural circuits is still unknown. Thus while some investigators describe an anterior striatal/cingulate/prefrontal system, others describe a lateralized right posterior system.
Goldman-Rakic [39] described the role of the primate prefrontal cortex in spatial cognition, from the point of view of its multiple levels of connectivity with major neurological centres. Most importantly, the prefrontal cortex was found to be integral to delayed-response tasks, which require behaviour to be guided by representations of discrimination stimuli rather than by those stimuli themselves (i.e. working memory). Lesions of the principal prefrontal sulcus in humans and monkeys resulted in a loss of this previously acquired ability to guide behaviour by internal representations. (This ability is similar to that tested by Piaget's Stage IV A-not-B object permanence test [41].)
Goldman-Rakic [39] proposed that the ability to guide behaviour by representation required mechanisms for selecting pertinent information, for holding that information ‘on-line’ for the temporal interval over which a decision or operation is to be performed (i.e. working memory), and for executing motor commands. Thus mechanisms are required for response initiation and inhibition (projections to striatum, tectum, thalamus and premotor cortex) as well as modulatory mechanisms (brain stem catecholamine projections). It was also proposed that connections between posterior parietal and prefrontal cortex are particularly relevant for spatial-mnemonic processing of the type required in spatial delayed-response tasks or spatial working memory. Anatomical studies revealed a precise, topographically organized network of connections between particular sectors of the parietal cortex and the principal prefrontal sulcus [39].
Parietal-prefrontal projections are reciprocated by prefrontal-parietal pathways, forming feed-forward and feedback loops that represent a reverberating circuit for the short-term maintenance of the visuospatial representation needed in delayed-response performance. Sawaguchi and Goldman-Rakic [40] showed that D1-dopamine receptors in the dorsolateral prefrontal cortex of monkeys participated in the maintenance of internalized visuospatial representations and/or in the control of eye movements governed by internal cues. They suggest that the D1 family of dopamine receptors may have a critical role in PFC-mediated working memory deficits.
While the work of Goldman-Rakic [39] suggests that the principal prefrontal sulcus, in association with prefrontal/parietal networks, is central for spatial working memory, Arnsten et al. [34–36] also implicate PFC/LC circuits in working memory. Primate experiments reported by Wilson and Goldman-Rakic [42] indicate that the prefrontal cortex is segregated into object and spatial domains, with separate connections: the posterior parietal cortex is concerned with spatial perception and the inferior temporal cortex with object recognition.
Pliszka et al. [43] described a posterior attention system, including the superior parietal cortex, superior colliculus and pulvinar, receiving dense noradrenergic innervation from the LC, and an anterior system, consisting of the prefrontal cortex and anterior cingulate gyrus. They postulated that the posterior system orientated to and engaged novel stimuli, while the anterior system subserved executive functions. According to Pliszka et al. [43], inability of noradrenaline to prime the posterior system could account for attentional problems, while loss of dopamine's ability to gate inputs to the anterior system might be linked to deficits in executive function in ADHD children (see Pliszka et al. [43], p. 266, and Himelstein et al. [44] for anterior and posterior system diagrams). In particular, deficits in inhibition and working memory would appear to involve anterior systems, while deficits in orienting are related to posterior systems. However, there is evidence from the work of Goldman-Rakic that prefrontal-parietal projections, involving both anterior and posterior systems, are important in working memory, which may require integration of spatial and temporal information.
Neuropsychological studies
Evidence from studies utilizing the Covert Orienting of Visuospatial Attention Task (COVAT), described by Posner and colleagues [45–47], also implicates posterior attention systems in ADHD. Swanson et al. [48] described a cued reaction time (RT) task, in which visual orienting and detection of targets in the peripheral left (LVF) and right (RVF) visual fields were cued by valid, invalid (positioned in the opposite direction) and null cues. Cues were presented at 100 and 800 ms before the target. Both ADHD and normal children performed a similar covert shift of visual-spatial attention in response to the cue in the 100 ms condition. For the 800 ms cue, RT decreased in normal subjects for all conditions relative to the 100 ms condition. However, in the ADHD subjects, RT decreased for all LVF presentations, but only for RVF target presentations after the valid cue. Swanson and colleagues speculated that their data reflected a sustained attention deficit, which minimized the cost associated with an invalid RVF cue, and might be related to an anterior attentional network deficit.
Nigg et al. [49], in a study of a group of ADHD boys who were slower to respond to targets in the left than in the right visual field, found slowness for non-cued trials. They concluded that the results suggested a hypoarousal dysfunction of the noradrenergic system of the right hemisphere, giving rise to a rightward ‘bias’ of covert orienting. A low dose of methylphenidate equalized the lateral difference by enhancing the ‘no-cue’ response to targets in the left visual field more than in the right.
McDonald et al. [50] attempted to clarify discrepancies in COVAT findings by designing a study in which responses were given to cued targets at valid and invalid locations at the 800 ms interval only. They found that ADHD children showed bilaterally greater ‘benefits’ from having attention directed to a cued location and greater ‘costs’ in having to relocate attentional focus than controls. The finding of normal COVAT performance at 100 ms intervals, and unreliable performance at intervals over 200 ms, in ADHD children has been replicated [51, 52].
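The ‘costs’ and ‘benefits’ referred to in these COVAT studies are conventionally computed as differences between mean reaction times across cue conditions. A minimal sketch, using hypothetical RT values rather than data from the cited studies:

```python
def covat_effects(rt_valid, rt_invalid, rt_neutral):
    """Standard cost/benefit decomposition for a cued reaction-time
    (COVAT) task. Benefit: speeding from a correctly directing cue;
    cost: slowing from an invalid cue that forces attention to be
    relocated. All times in ms."""
    benefit = rt_neutral - rt_valid
    cost = rt_invalid - rt_neutral
    validity_effect = rt_invalid - rt_valid  # equals cost + benefit
    return benefit, cost, validity_effect

# Hypothetical mean RTs for illustration only:
print(covat_effects(rt_valid=380.0, rt_invalid=450.0, rt_neutral=410.0))
```

On this decomposition, the McDonald et al. finding amounts to both the benefit and the cost terms being larger in ADHD children than in controls at the 800 ms interval.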
Some of the above findings may be explained by a series of experiments described by Jonides and Somers [53] to investigate voluntary versus automatic control over the ‘mind's eye’ movement. Experiments [53–56] supported the hypothesis that attention shifts can be guided by two mechanisms. On the one hand, certain salient stimuli have reflexive control over attention allocation, such that when one of these stimuli occurs, a shift of attention to the stimulus is automatically elicited. On the other hand, subjects have internal control over the spatial allocation of attention, so that when motivated, they can voluntarily shift attention from one part of the field to another. In the current context, orienting of attention to peripheral cues at longer cue-to-target intervals requires more voluntary control over attentional systems. A difficulty in suppressing reflexive shifts of attention, induced by peripheral cues, may also help explain why the observed behaviour of ADHD children appears to be controlled by random peripheral stimuli in the environment.
Carte et al. [9] found that when tasks requiring automatic processing were paired with tasks requiring greater use of selective attention, the latter controlled processing tasks differentiated ADHD children better than automated tasks.
Corbetta et al. [57] carried out a PET study to identify neural systems involved in shifting spatial attention to visual stimuli in left or right visual fields, utilizing valid and invalid cues. Positron emission tomography evidence showed that shifts of visuospatial attention selectively activated regions of the superior parietal and frontal cortex. Both superior parietal and frontal activations encoded the visual field, rather than the direction of an attention shift. The parietal region was active when attention was shifted on the basis of cognitive or sensory cues, independent of the execution of an overt response, while the frontal region was active only when selected lateralized stimuli were overtly detected.
The importance of right temporo-parietal regions for spatial attention is also shown by studies of the neglect syndrome. A review by Heilman et al. [58] found that observations of neglect were compatible with the hypothesis that the right hemisphere (parietal lobe) dominates comparator or attentional processes. The right parietal lobe was found to desynchronize equally to right or left sided stimuli, whereas the left parietal lobe desynchronized mainly to right side stimuli.
Posner and Dehaene [59] have pointed out that attention is neither the property of a single brain area, nor of the entire brain. Attentional effects are mediated through enhancement of attended, or suppression of unattended, items, depending on the task or brain area studied. The origins of amplification effects are found in specialized cortical areas of the frontal and parietal lobes. Posner and Raichle [61] describe ‘networks of attention’ related to orienting, inhibition and vigilance. They suggest a role for the parietal lobe (particularly right parietal) in orienting of attention. On the other hand, ‘executive control’ is maintained by frontal systems, including the anterior cingulate and the prefrontal areas described by Goldman-Rakic [39]. These areas are involved in inhibition and working memory. Posner and Raichle [61] also describe a vigilance network, whose task is the maintenance of a sustained state of alertness, related to noradrenergic activity from the LC to the right frontal lobe. Anatomically the LC has strong connections with the parietal lobe, pulvinar and colliculus: all part of the visual orienting system. Posner and Raichle [61] suggest that when a target is detected by the executive network, information other than the target may be processed at a high level, but the attention networks are not available to provide high-priority access to non-targets. They believe the executive network is related to subjective experience of awareness or consciousness. Thus, neuropsychological studies indicate the importance of right parietal systems in orienting attention, while executive control is maintained by frontal systems.
Discussion
Tucker and Williamson [60] postulated that the noradrenergic system, originating in the LC and projecting rostrally via the medial forebrain bundle to the limbic system, thalamus and neocortex, was suitable for a general regulatory system. This system, which responds to novel external stimuli, modulates its own activity by a decline in activity after stimulus repetition. Conversely, the dopaminergic pathway is a more lateralized system, which manifests increased redundancy, or restriction of unnecessary information. According to Tucker and Williamson, dopaminergic frontal redundancy modulation is balanced by arousal, which is primarily noradrenergic. The recent studies of Arnsten on LC/PFC noradrenergic systems and of Goldman-Rakic on dopaminergic systems cited above are consistent with this formulation. In particular, the role of dopaminergic systems in working memory and inhibitory processes is of central importance in ADHD, but noradrenergic transmitters are also involved in orientation to novel stimuli, and may also be involved in working memory. The present review indicates an important role for the right parietal cortex in spatial orienting of attention, and separates the concept of executive function (awareness) from the additional networks involved in the automatic processing of attention. In other words, executive function and attention are related, but not identical.
The present review also indicates a more central role for working memory deficits in ADHD than has previously been suggested [29], which may require integration of both spatial and temporal information. This raises questions in relation to Barkley's [4] unitary theory of ADHD, which attributes general regulation to inhibitory systems. As outlined above these appear to be frontostriatal and dopaminergic, while orienting networks appear to be noradrenergic. While executive functions may be mediated via frontal networks, the symptoms and subtypes of ADHD [62] may reflect broader deficits, including the integration of spatial and temporal information.
The above models provide a rationale for recent molecular genetic associations of ADHD with dopamine transporter and receptor genes, and also direct attention to noradrenergic mechanisms.
At a practical level, the review also raises the question of whether guanfacine (an α2A agonist not currently available in Australia) might have a useful role in ADHD, compared with the less selective α2 agonist clonidine. A comparison with a stimulant such as methylphenidate might help clarify differential dopaminergic versus noradrenergic effects. Modern brain imaging techniques, such as PET, have shown receptor distribution [32], while steady-state visually evoked potentials (SSVEP) [63] and functional magnetic resonance imaging (fMRI) [31], in association with neuropsychological measures, promise to provide further insights into central nervous system networks involved in activation, orienting and controlled processing in ADHD.
