Global connectivity and local excitability changes underlie antidepressant effects of repetitive transcranial magnetic stimulation.
Neuropsychopharmacology: official publication of the American College of Neuropsychopharmacology
Repetitive transcranial magnetic stimulation (rTMS) is a commonly used treatment for major depressive disorder (MDD). However, our understanding of the mechanism by which TMS exerts its antidepressant effect is minimal. Furthermore, we lack brain signals that can be used to predict and track clinical outcome. Such signals would allow for treatment stratification and optimization. Here, we performed a randomized, sham-controlled clinical trial and measured electrophysiological, neuroimaging, and clinical changes before and after rTMS. Patients (N = 36) were randomized to receive either active or sham rTMS to the left dorsolateral prefrontal cortex (dlPFC) for 20 consecutive weekdays. To capture the rTMS-driven changes in connectivity and causal excitability, resting fMRI and TMS/EEG were performed before and after the treatment. Baseline causal connectivity differences between depressed patients and healthy controls were also evaluated with concurrent TMS/fMRI. We found that active, but not sham rTMS elicited (1) an increase in dlPFC global connectivity, (2) induction of negative dlPFC-amygdala connectivity, and (3) local and distributed changes in TMS/EEG potentials. Global connectivity changes predicted clinical outcome, while both global connectivity and TMS/EEG changes tracked clinical outcome. In patients but not healthy participants, we observed a perturbed inhibitory effect of the dlPFC on the amygdala. Taken together, rTMS induced lasting connectivity and excitability changes from the site of stimulation, such that after active treatment, the dlPFC appeared better able to engage in top-down control of the amygdala. These measures of network functioning both predicted and tracked clinical outcome, potentially opening the door to treatment optimization.
View details for DOI 10.1038/s41386-020-0633-z
View details for PubMedID 32053828
- Intervene or Innovate: a Dilemma for Psychiatrists-in-Training. Academic psychiatry: the journal of the American Association of Directors of Psychiatric Residency Training and the Association for Academic Psychiatry 2020
Neural Correlates of Anger Expression in Patients With PTSD
NATURE PUBLISHING GROUP. 2019: 87
View details for Web of Science ID 000509665600180
- Anger Expression in Patients With PTSD: Clinical, Cognitive, and Neural Correlates ELSEVIER SCIENCE INC. 2019: S137
- New Frontiers in Irritability Research-From Cradle to Grave and Bench to Bedside. JAMA psychiatry 2019
- Physicians Talking With Their Partners About Patients. JAMA 2019
Learning what to approach.
PLOS BIOLOGY 2018; 16 (10): e3000043
Most decisions share a common goal: maximize reward and minimize punishment. Achieving this goal requires learning which choices are likely to lead to favorable outcomes. Dopamine is essential for this process, enabling learning by signaling the difference between what we expect to get and what we actually get. Although all animals appear to use this dopamine prediction error circuit, some do so more than others, and this neural heterogeneity correlates with individual variability in behavior. In this issue of PLOS Biology, Lee and colleagues show that manipulating a simple task parameter can bias the animals' behavioral strategy and modulate dopamine release, implying that how we learn is just as flexible as what we learn.
View details for PubMedID 30307969
The Neural Basis of Aversive Pavlovian Guidance during Planning
JOURNAL OF NEUROSCIENCE
2017; 37 (42): 10215–29
Important real-world decisions are often arduous as they frequently involve sequences of choices, with initial selections affecting future options. Evaluating every possible combination of choices is computationally intractable, particularly for longer multistep decisions. Therefore, humans frequently use heuristics to reduce the complexity of decisions. We recently used a goal-directed planning task to demonstrate the profound behavioral influence and ubiquity of one such shortcut, namely aversive pruning, a reflexive Pavlovian process that involves neglecting parts of the decision space residing beyond salient negative outcomes. However, how the brain implements this important decision heuristic and what underlies individual differences have hitherto remained unanswered. Therefore, we administered an adapted version of the same planning task to healthy male and female volunteers undergoing functional magnetic resonance imaging (fMRI) to determine the neural basis of aversive pruning. Through both computational and standard categorical fMRI analyses, we show that when planning was influenced by aversive pruning, the subgenual cingulate cortex was robustly recruited. This neural signature was distinct from those associated with general planning and valuation, two fundamental cognitive components elicited by our task but which are complementary to aversive pruning. Furthermore, we found that individual variation in levels of aversive pruning was associated with the responses of insula and dorsolateral prefrontal cortices to the receipt of large monetary losses, and also with subclinical levels of anxiety. In summary, our data reveal the neural signatures of an important reflexive Pavlovian process that shapes goal-directed evaluations and thereby determines the outcome of high-level sequential cognitive processes.
SIGNIFICANCE STATEMENT Multistep decisions are complex because initial choices constrain future options.
Evaluating every path for long decision sequences is often impractical; thus, cognitive shortcuts are often essential. One pervasive and powerful heuristic is aversive pruning, in which potential decision-making avenues are curtailed at immediate negative outcomes. We used neuroimaging to examine how humans implement such pruning. We found it to be associated with activity in the subgenual cingulate cortex, with neural signatures that were distinguishable from those covarying with planning and valuation. Individual variations in aversive pruning levels related to subclinical anxiety levels and insular cortex activation. These findings reveal the neural mechanisms by which basic negative Pavlovian influences guide decision-making during planning, with implications for disrupted decision-making in psychiatric disorders.
View details for PubMedID 28924006
- Curricular Time, Patient Exposure, and Comfort Caring for Lesbian, Gay, Bisexual, and Transgender Patients Among Recent Medical Graduates. LGBT health 2017; 4 (3): 237-239
- Effect of rTMS on Resting-State Functional Connectivity in Patients with Major Depression ELSEVIER SCIENCE INC. 2017: S259
Neural Circuitry of Reward Prediction Error.
Annual review of neuroscience
Dopamine neurons facilitate learning by calculating reward prediction error, or the difference between expected and actual reward. Despite two decades of research, it remains unclear how dopamine neurons make this calculation. Here we review studies that tackle this problem from a diverse set of approaches, from anatomy to electrophysiology to computational modeling and behavior. Several patterns emerge from this synthesis: that dopamine neurons themselves calculate reward prediction error, rather than inherit it passively from upstream regions; that they combine multiple separate and redundant inputs, which are themselves interconnected in a dense recurrent network; and that despite the complexity of inputs, the output from dopamine neurons is remarkably homogeneous and robust. The more we study this simple arithmetic computation, the knottier it appears to be, suggesting a daunting (but stimulating) path ahead for neuroscience more generally.
View details for DOI 10.1146/annurev-neuro-072116-031109
View details for PubMedID 28441114
Trial and error.
2016; 354 (6316): 1108-1109
View details for PubMedID 27934726
Dopamine neurons share common response function for reward prediction error
2016; 19 (3): 479
Dopamine neurons are thought to signal reward prediction error, or the difference between actual and predicted reward. How dopamine neurons jointly encode this information, however, remains unclear. One possibility is that different neurons specialize in different aspects of prediction error; another is that each neuron calculates prediction error in the same way. We recorded from optogenetically identified dopamine neurons in the lateral ventral tegmental area (VTA) while mice performed classical conditioning tasks. Our tasks allowed us to determine the full prediction error functions of dopamine neurons and compare them to each other. We found marked homogeneity among individual dopamine neurons: their responses to both unexpected and expected rewards followed the same function, just scaled up or down. As a result, we were able to describe both individual and population responses using just two parameters. Such uniformity ensures robust information coding, allowing each dopamine neuron to contribute fully to the prediction error signal.
View details for DOI 10.1038/nn.4239
View details for Web of Science ID 000370822200020
View details for PubMedID 26854803
View details for PubMedCentralID PMC4767554
- Psychiatric Consultations in Less-Than-Private Places: Challenges and Unexpected Benefits of Hospital Roommates PSYCHOSOMATICS 2016; 57 (1): 97-101
Arithmetic and local circuitry underlying dopamine prediction errors
2015; 525 (7568): 243
Dopamine neurons are thought to facilitate learning by comparing actual and expected reward. Despite two decades of investigation, little is known about how this comparison is made. To determine how dopamine neurons calculate prediction error, we combined optogenetic manipulations with extracellular recordings in the ventral tegmental area while mice engaged in classical conditioning. Here we demonstrate, by manipulating the temporal expectation of reward, that dopamine neurons perform subtraction, a computation that is ideal for reinforcement learning but rarely observed in the brain. Furthermore, selectively exciting and inhibiting neighbouring GABA (γ-aminobutyric acid) neurons in the ventral tegmental area reveals that these neurons are a source of subtraction: they inhibit dopamine neurons when reward is expected, causally contributing to prediction-error calculations. Finally, bilaterally stimulating ventral tegmental area GABA neurons dramatically reduces anticipatory licking to conditioned odours, consistent with an important role for these neurons in reinforcement learning. Together, our results uncover the arithmetic and local circuitry underlying dopamine prediction errors.
View details for DOI 10.1038/nature14855
View details for Web of Science ID 000360927400037
View details for PubMedID 26322583
View details for PubMedCentralID PMC4567485
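The subtraction this paper identifies (dopamine signal = actual reward minus expected reward) can be sketched as a minimal Rescorla-Wagner-style value update. This is an illustrative toy, not the paper's model or analysis code; the function names and the learning rate `alpha` are invented for the example.

```python
# Illustrative sketch (not from the paper): a subtractive reward
# prediction error driving value learning. delta = r - V is the
# subtraction the VTA circuit is proposed to implement, with local
# GABA neurons supplying the expectation term that is subtracted.

def prediction_error(reward, expected):
    """Subtractive prediction error: actual minus expected reward."""
    return reward - expected

def learn_value(rewards, alpha=0.1):
    """Update a value estimate from a sequence of observed rewards."""
    v = 0.0
    for r in rewards:
        delta = prediction_error(r, v)  # dopamine-like teaching signal
        v += alpha * delta              # expectation moves toward r
    return v
```

With a constant reward, the expectation converges and the prediction error decays toward zero, mirroring the dampened dopamine response to fully expected rewards.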
Interplay of approximate planning strategies
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
2015; 112 (10): 3098-3103
Humans routinely formulate plans in domains so complex that even the most powerful computers are taxed. To do so, they seem to avail themselves of many strategies and heuristics that efficiently simplify, approximate, and hierarchically decompose hard tasks into simpler subtasks. Theoretical and cognitive research has revealed several such strategies; however, little is known about their establishment, interaction, and efficiency. Here, we use model-based behavioral analysis to provide a detailed examination of the performance of human subjects in a moderately deep planning task. We find that subjects exploit the structure of the domain to establish subgoals in a way that achieves a nearly maximal reduction in the cost of computing values of choices, but then combine partial searches with greedy local steps to solve subtasks, and maladaptively prune the decision trees of subtasks in a reflexive manner upon encountering salient losses. Subjects come idiosyncratically to favor particular sequences of actions to achieve subgoals, creating novel complex actions or "options."
View details for DOI 10.1073/pnas.1414219112
View details for Web of Science ID 000350646500054
View details for PubMedID 25675480
View details for PubMedCentralID PMC4364207
Dopamine gates sensory representations in cortex
JOURNAL OF NEUROPHYSIOLOGY
2014; 111 (11): 2161-2163
The prefrontal cortex (PFC) maintains information about relevant sensory stimuli, in a process thought to rely on dopamine release. In a recent paper, Jacob et al. (J Neurosci 33: 13724-13734, 2013) demonstrated one way in which dopamine might facilitate this process. The authors recorded from PFC neurons in monkeys during local application of dopamine. They found that dopamine increases the gain of sensory-evoked responses in putative pyramidal neurons in PFC, potentially by inhibiting local interneurons.
View details for DOI 10.1152/jn.00795.2013
View details for Web of Science ID 000339171000001
View details for PubMedID 24401705
View details for PubMedCentralID PMC4097866
Division of Labor for Division: Inhibitory Interneurons with Different Spatial Landscapes in the Olfactory System
2013; 80 (5): 1106-1109
Normalizing neural responses by the sum of population activity allows the nervous system to adjust its sensitivity according to task demands, facilitating intensity-invariant information processing. In this issue of Neuron, two studies, Kato et al. (2013) and Miyamichi et al. (2013), suggest that parvalbumin-positive interneurons in the olfactory bulb play a role in this process.
View details for DOI 10.1016/j.neuron.2013.11.013
View details for Web of Science ID 000327919500002
View details for PubMedID 24314722
View details for PubMedCentralID PMC4175561
Opening the black box: dopamine, predictions, and learning
TRENDS IN COGNITIVE SCIENCES
2013; 17 (9): 430-431
Dopamine neurons are thought to promote learning by signaling prediction errors, that is, the difference between actual and expected outcomes. Whether these signals are sufficient for associative learning, however, remains untested. A recent study used optogenetics in a classic behavioral paradigm to confirm the role of dopamine prediction errors in learning.
View details for DOI 10.1016/j.tics.2013.06.010
View details for Web of Science ID 000324784000002
View details for PubMedID 23830895
View details for PubMedCentralID PMC3811049
Role of prefrontal cortex and the midbrain dopamine system in working memory updating
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
2012; 109 (49): 19900-19909
Humans are adept at switching between goal-directed behaviors quickly and effectively. The prefrontal cortex (PFC) is thought to play a critical role by encoding, updating, and maintaining internal representations of task context in working memory. It has also been hypothesized that the encoding of context representations in PFC is regulated by phasic dopamine gating signals. Here we use multimodal methods to test these hypotheses. First we used functional MRI (fMRI) to identify regions of PFC associated with the representation of context in a working memory task. Next we used single-pulse transcranial magnetic stimulation (TMS), guided spatially by our fMRI findings and temporally by previous event-related EEG recordings, to disrupt context encoding while participants performed the same working memory task. We found that TMS pulses to the right dorsolateral PFC (DLPFC) immediately after context presentation, and well in advance of the response, adversely impacted context-dependent relative to context-independent responses. This finding causally implicates right DLPFC function in context encoding. Finally, using the same paradigm, we conducted high-resolution fMRI measurements in brainstem dopaminergic nuclei (ventral tegmental area and substantia nigra) and found phasic responses after presentation of context stimuli relative to other stimuli, consistent with the timing of a gating signal that regulates the encoding of representations in PFC. Furthermore, these responses were positively correlated with behavior, as well as with responses in the same region of right DLPFC targeted in the TMS experiment, lending support to the hypothesis that dopamine phasic signals regulate encoding, and thereby the updating, of context representations in PFC.
View details for DOI 10.1073/pnas.1116727109
View details for Web of Science ID 000312347200018
View details for PubMedID 23086162
View details for PubMedCentralID PMC3523834
Bonsai Trees in Your Head: How the Pavlovian System Sculpts Goal-Directed Choices by Pruning Decision Trees
PLOS COMPUTATIONAL BIOLOGY
2012; 8 (3)
When planning a series of actions, it is usually infeasible to consider all potential future sequences; instead, one must prune the decision tree. Provably optimal pruning is, however, still computationally ruinous and the specific approximations humans employ remain unknown. We designed a new sequential reinforcement-based task and showed that human subjects adopted a simple pruning strategy: during mental evaluation of a sequence of choices, they curtailed any further evaluation of a sequence as soon as they encountered a large loss. This pruning strategy was Pavlovian: it was reflexively evoked by large losses and persisted even when overwhelmingly counterproductive. It was also evident above and beyond loss aversion. We found that the tendency towards Pavlovian pruning was selectively predicted by the degree to which subjects exhibited sub-clinical mood disturbance, in accordance with theories that ascribe Pavlovian behavioural inhibition, via serotonin, a role in mood disorders. We conclude that Pavlovian behavioural inhibition shapes highly flexible, goal-directed choices in a manner that may be important for theories of decision-making in mood disorders.
View details for DOI 10.1371/journal.pcbi.1002410
View details for Web of Science ID 000302244000018
View details for PubMedID 22412360
View details for PubMedCentralID PMC3297555
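The pruning strategy described above can be caricatured in a few lines of tree search: the planner simply refuses to evaluate anything beyond a branch that begins with a large loss, even when the best path lies behind it. This is a hedged illustration, not the paper's computational model; the tuple tree encoding and the `threshold` parameter are invented for the sketch.

```python
# Illustrative sketch (not the paper's model): "aversive pruning"
# during planning. Branches whose immediate outcome is a large loss
# are cut off, regardless of what rewards lie beyond them.

def best_return(tree, threshold=-70):
    """Max total reward over root-to-leaf paths, pruning at big losses.

    `tree` is (reward, [children]); any child whose immediate reward
    falls at or below `threshold` is never explored further.
    """
    reward, children = tree
    if not children:
        return reward
    kept = [c for c in children if c[0] > threshold]
    if not kept:                 # every continuation looks aversive
        return reward
    return reward + max(best_return(c, threshold) for c in kept)

# A large loss (-100) guarding a big reward (+200) is never explored,
# so the pruning planner settles for the safer, smaller payoff.
tree = (0, [(-100, [(200, [])]),   # optimal path, hidden behind a loss
            (10,   [(5,   [])])])  # safe path the pruner prefers
```

Lowering `threshold` to negative infinity recovers exhaustive search, which finds the better path through the loss; the gap between the two answers is the cost of the Pavlovian shortcut.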
Effects of parietal TMS on somatosensory judgments challenge interhemispheric rivalry accounts
2010; 48 (12): 3470-3481
Interplay between the cerebral hemispheres is vital for coordinating perception and behavior. One influential account holds that the hemispheres engage in rivalry, each inhibiting the other. In the somatosensory domain, a seminal paper claimed to demonstrate such interhemispheric rivalry, reporting improved tactile detection sensitivity on the right hand after transcranial magnetic stimulation (TMS) to the right parietal lobe (Seyal, Ro, & Rafal, 1995). Such improvement in tactile detection ipsilateral to TMS could follow from interhemispheric rivalry, if one assumes that TMS disrupted cortical processing under the coil and thereby released the other hemisphere from inhibition. Here we extended the study by Seyal et al. (1995) to determine the effects of right parietal TMS on tactile processing for either hand, rather than only the ipsilateral hand. We performed two experiments applying TMS in the context of median-nerve stimulation; one experiment required somatosensory detection, the second somatosensory intensity discrimination. We found different TMS effects on detection versus discrimination, but neither set of results followed the prediction from hemispheric rivalry that enhanced performance for one hand should invariably be associated with impaired performance for the other hand, and vice-versa. Our results argue against a strict rivalry interpretation, instead suggesting that parietal TMS can provide a pedestal-like increment in somatosensory response.
View details for DOI 10.1016/j.neuropsychologia.2010.07.031
View details for Web of Science ID 000284017300011
View details for PubMedID 20678510
View details for PubMedCentralID PMC2956832
Reward and Punishment Processing in Depression
2010; 68 (2): 118-124
Depression is a complex and heterogeneous disorder whose cause is poorly understood. Theories on the mechanisms of the disease have often focused on either its neurobiology or its cognitive and behavioral manifestations. Recently, studies exploring how depressed patients process reward and punishment have linked these two facets together. It has been suggested that individuals with a dysfunction in a specialized network of brain regions are unable to exploit affective information to guide behavior. Deficits in this ability might predispose such individuals to develop depression, whereas subsequent restoration of this ability--whether through pharmacological or behavioral treatments--might enable recovery from the disorder. Here we review behavioral, neuroimaging, and computational findings relevant to this hypothesis. There is good evidence that depressed patients exhibit abnormal behavioral responses to rewards and punishments and that these tendencies correspond to aberrant function in frontostriatal systems modulated by the monoamine systems. Furthermore, computational studies have generated testable predictions for how these neural signaling and neurochemical abnormalities might contribute to the symptoms of depression. Combining these approaches--as well as molecular and behavioral work in animals--provides great promise for furthering our understanding of this common and debilitating disease.
View details for DOI 10.1016/j.biopsych.2010.01.027
View details for Web of Science ID 000279900100002
View details for PubMedID 20303067
Interhemispheric Effect of Parietal TMS on Somatosensory Response Confirmed Directly with Concurrent TMS-fMRI
JOURNAL OF NEUROSCIENCE
2008; 28 (49): 13202-13208
Transcranial magnetic stimulation (TMS) has been used to document some apparent interhemispheric influences behaviorally, with TMS over the right parietal cortex reported to enhance processing of touch for the ipsilateral right hand (Seyal et al., 1995). However, the neural bases of such apparent interhemispheric influences from TMS remain unknown. Here, we studied this directly by combining TMS with concurrent functional magnetic resonance imaging (fMRI). We applied bursts of 10 Hz TMS over right parietal cortex, at a high or low intensity, during two sensory contexts: either without any other stimulation, or while participants received median nerve stimulation to the right wrist, which projects to left primary somatosensory cortex (SI). TMS to right parietal cortex affected the blood oxygenation level-dependent signal in left SI, with high- versus low-intensity TMS increasing the left SI signal during right-wrist somatosensory input, but decreasing this in the absence of somatosensory input. This state-dependent modulation of SI by parietal TMS over the other hemisphere was accompanied by a related pattern of TMS-induced influences in the thalamus, as revealed by region-of-interest analyses. A behavioral experiment confirmed that the same right parietal TMS protocol of 10 Hz bursts led to enhanced detection of perithreshold electrical stimulation of the right median nerve, which is initially processed in left SI. Our results confirm directly that TMS over right parietal cortex can affect processing in left SI of the other hemisphere, with rivalrous effects (possibly transcallosal) arising in the absence of somatosensory input, but facilitatory effects (possibly involving thalamic circuitry) in the presence of driving somatosensory input.
View details for DOI 10.1523/JNEUROSCI.3043-08.2008
View details for Web of Science ID 000261378100019
View details for PubMedID 19052211
View details for PubMedCentralID PMC2600426
Neural substrates of choice selection in adults and adolescents: Development of the ventrolateral prefrontal and anterior cingulate cortices
2007; 45 (6): 1270-1279
A heightened propensity for risk-taking and poor decision-making underlies the peak morbidity and mortality rates reported during adolescence. Delayed maturation of cortical structures during the adolescent years has been proposed as a possible explanation for this observation. Here, we test the hypothesis of adolescent delayed maturation by using fMRI during a monetary decision-making task that directly examines risk-taking behavior during choice selection. Orbitofrontal/ventrolateral prefrontal cortex (OFC/VLPFC) and dorsal anterior cingulate cortex (ACC) were examined selectively since both have been implicated in reward-related processes, cognitive control, and resolution of conflicting decisions. Group comparisons revealed greater activation in the OFC/VLPFC (BA 47) and dorsal ACC (BA 32) in adults than adolescents when making risky selections. Furthermore, reduced activity in these areas correlated with greater risk-taking performance in adolescents and in the combined group. Consistent with predictions, these results suggest that adolescents engage prefrontal regulatory structures to a lesser extent than adults when making risky economic choices.
View details for DOI 10.1016/j.neuropsychologia.2006.10.004
View details for Web of Science ID 000245130900013
View details for PubMedID 17118409
View details for PubMedCentralID PMC2700731
Responsive parenting: interventions and outcomes
BULLETIN OF THE WORLD HEALTH ORGANIZATION
2006; 84 (12): 991-998
In addition to food, sanitation, and access to health facilities, children require adequate care at home for survival and optimal development. Responsiveness, a mother's/caregiver's prompt, contingent and appropriate interaction with the child, is a vital parenting tool with wide-ranging benefits for the child, from better cognitive and psychosocial development to protection from disease and mortality. We examined two facets of responsive parenting -- its role in child health and development and the effectiveness of interventions to enhance it -- by conducting a systematic review of literature from both developed and developing countries. Our results revealed that interventions are effective in enhancing maternal responsiveness, resulting in better child health and development, especially for the neediest populations. Since these interventions were feasible even in poor settings, they have great potential in helping us achieve the Millennium Development Goals. We suggest that responsiveness interventions be integrated into child survival strategies.
View details for Web of Science ID 000242431900014
View details for PubMedID 17242836
View details for PubMedCentralID PMC2627571
Behavioral predictors of substance-use initiation in adolescents with and without attention-deficit/hyperactivity disorder
2006; 117 (6): 2030-2039
Our goal was to examine substance-use initiation in healthy adolescents and in adolescents who have been diagnosed with attention-deficit/hyperactivity disorder.
Seventy-eight adolescents (28 healthy and 50 with attention-deficit/hyperactivity disorder) participated in an ongoing longitudinal study of predictors of substance use. The substances most commonly reported were tobacco, alcohol, and marijuana. Aggression, conduct problems, hyperactivity, impulsivity, inattention, anxiety/depression, social difficulties, and somatic complaints were assessed at study entry and tested as predictors for later substance use.
With an average of 4 years into the study, 37 adolescents had not used any substances, 41 had experimented with at least 1 substance, and 29 experimented with >1 substance. Psychiatric diagnoses (attention-deficit/hyperactivity disorder, attention-deficit/hyperactivity disorder and conduct disorder, and attention-deficit/hyperactivity disorder and depression/anxiety) did not influence reports of substance use. Distinct behavioral measures collected at study entry predicted use of different substances. In a multivariate analysis, aggression had the greatest association with tobacco smoking and marijuana use. Impulsivity was associated with alcohol use. Severity of drug exposure, indexed by the number of substances used, was predicted by aggression.
This 4-year longitudinal study captured the onset of substance use, not abuse. Behavioral predictors differed with the type of substance used. These behavioral characteristics may raise suspicion among pediatricians for enhanced risk for substance-use initiation.
View details for DOI 10.1542/peds.2005-0704
View details for Web of Science ID 000237979000021
View details for PubMedID 16740845
Reward-related processes in pediatric bipolar disorder: a pilot study
JOURNAL OF AFFECTIVE DISORDERS
2004; 82: S89-S101
Neuropsychological research on children with bipolar disorder (BPD) is scarce. Here, we examine reward-related behaviors in children with BPD using a Wheel of Fortune task in which subjects could win or lose money depending on their decisions. The intent of this work was to investigate performance differences between BPD and healthy children on a task that could be used in an fMRI environment to inform the neural substrates of reward processes in BPD. This study has no direct clinical implications. We hypothesized that relative to healthy children, children with BPD would select risky options more frequently, be less confident in a favorable outcome, and report stronger emotional responses to outcomes.
Forty-four children (22 BPD; 22 control) were compared on (i) decision-making with varying levels of risk, (ii) level of confidence in favorable outcomes, and (iii) responses to feedback. The task included a win-no win version and a lose-no lose version.
Patterns of selection did not differ between groups. In the lose-no lose task, BPD patients were less confident than controls in favorable outcomes. BPD patients expressed greater dissatisfaction than controls at not winning in win-no win, and greater satisfaction than controls at not losing in lose-no lose.
Limitations of this study included that the children with BPD were mostly in a depressed state, were medicated, and had co-morbid disorders.
This is the first experimental study to examine associations between pediatric BPD and reward-related behaviors. Although we failed to detect abnormalities in risky decision-making in children with BPD, we found significant differences between groups in both confidence ratings and response to feedback, consistent with our predictions. Our ultimate goal is to use this task in the fMRI environment to gain a better understanding of the neural correlates of reward-related processes in pediatric BPD.
View details for DOI 10.1016/j.jad.2004.05.022
View details for Web of Science ID 000226117600010
View details for PubMedID 15571794
Choice selection and reward anticipation: an fMRI study
2004; 42 (12): 1585-1597
We examined neural activations during decision-making using fMRI paired with the wheel of fortune task, a newly developed two-choice decision-making task with probabilistic monetary gains. In particular, we assessed the impact of high-reward/risk events relative to low-reward/risk events on neural activations during choice selection and during reward anticipation. Seventeen healthy adults completed the study. We found, in line with predictions, that (i) the selection phase predominantly recruited regions involved in visuo-spatial attention (occipito-parietal pathway), conflict (anterior cingulate), manipulation of quantities (parietal cortex), and preparation for action (premotor area), whereas the anticipation phase prominently recruited regions engaged in reward processes (ventral striatum); and (ii) high-reward/risk conditions relative to low-reward/risk conditions were associated with a greater neural response in ventral striatum during selection, though not during anticipation. Following an a priori ROI analysis focused on orbitofrontal cortex, we observed orbitofrontal cortex activation (BA 11 and 47) during selection (particularly to high-risk/reward options), and to a more limited degree, during anticipation. These findings support the notion that (1) distinct, although overlapping, pathways subserve the processes of selection and anticipation in a two-choice task of probabilistic monetary reward; (2) taking a risk and awaiting the consequence of a risky decision seem to affect neural activity differently in selection and anticipation; and thus (3) common structures, including the ventral striatum, are modulated differently by risk/reward during selection and anticipation.
View details for DOI 10.1016/j.neuropsychologia.2004.05.011
View details for Web of Science ID 000224047600001
View details for PubMedID 15327927