Running Title: ENCODING-RETRIEVAL SIMILARITY AND MEMORY

Neural similarity between encoding and retrieval is related to memory via hippocampal interactions

Maureen Ritchey, Erik A. Wing, Kevin S. LaBar, and Roberto Cabeza
Center for Cognitive Neuroscience and Department of Psychology and Neuroscience, Duke University, Box 90999, Durham, NC 27708

Cerebral Cortex, in press

Corresponding Author: Maureen Ritchey
[email protected]
Phone: 530-757-8865
Fax: 530-757-8640
Center for Neuroscience, 1544 Newton Ct., Davis, CA 95618

Abstract

A fundamental principle in memory research is that memory is a function of the similarity between encoding and retrieval operations. Consistent with this principle, many neurobiological models of declarative memory assume that memory traces are stored in cortical regions and the hippocampus facilitates reactivation of these traces during retrieval. The present investigation tested the novel prediction that encoding-retrieval similarity can be observed and related to memory at the level of individual items. Multivariate representational similarity analysis was applied to functional magnetic resonance imaging (fMRI) data collected during encoding and retrieval of emotional and neutral scenes. Memory success tracked fluctuations in encoding-retrieval similarity across frontal and posterior cortices. Importantly, memory effects in posterior regions reflected increased similarity between item-specific representations during successful recognition. Mediation analyses revealed that the hippocampus mediated the link between cortical similarity and memory success, providing crucial evidence for hippocampal-cortical interactions during retrieval. Finally, because emotional arousal is known to modulate both perceptual and memory processes, similarity effects were compared for emotional and neutral scenes. Emotional arousal was associated with enhanced similarity between encoding and retrieval patterns. These findings speak to the promise of pattern similarity measures for evaluating memory representations and hippocampal-cortical interactions.

Keywords: emotional memory, episodic memory, functional neuroimaging, multivariate pattern analysis

Memory retrieval involves the reactivation of neural states similar to those experienced during initial encoding. The strength of these memories is thought to vary as a function of encoding-retrieval match (Tulving and Thomson 1973), with stronger memories being associated with greater correspondence. This relationship has been formalized in models linking recognition memory success to the quantitative similarity of event features sampled during encoding and retrieval (Bower 1972), as well as in the principle of transfer appropriate processing, which proposes that memory will be enhanced when the cognitive operations engaged during encoding are related to those supporting memory discrimination at retrieval (Morris et al. 1977). Encoding-retrieval similarity also bears special relevance to neurocomputational models that posit a role for the hippocampus in guiding replay of prior learning events across neocortex (Alvarez and Squire 1994; McClelland et al. 1995; Nadel et al. 2000; Sutherland and McNaughton 2000; Norman and O'Reilly 2003). In these models, the hippocampus binds cortical representations associated with the initial learning experience, and then facilitates their reactivation during the time of retrieval. Prior attempts to measure encoding-retrieval match have focused on identifying differences in hemodynamic responses to retrieval sets that vary only in their encoding history, such as prior association with a task, context, or stimulus (reviewed by Danker and Anderson 2010). For example, when retrieval cues are held constant, words studied with visual images elicit greater activity in visual cortex at retrieval than words studied with sounds (Nyberg et al. 2000; Wheeler et al. 2000), consistent with the reactivation of their associates. However, memory often involves discriminating between old and new stimuli of the same kind. 
Within the context of recognition memory, transfer-appropriate processing will be most useful when it involves recapitulation of cognitive and perceptual processes that are uniquely linked to individual items and differentiate them from lures (Morris et al. 1977). Because previous studies have compared sets of items with a shared encoding history, their measures of processing overlap have overlooked those operations that are idiosyncratic to each individual stimulus. The current experiment makes the important advance of measuring the neural similarity between encoding and recognition of

individual scenes (e.g., an image of a mountain lake versus other scene images). By linking similarity to memory performance, we aim to identify regions in which neural pattern similarity to encoding is associated with retrieval success, likely arising from their participation in operations whose recapitulation benefits memory. These operations may be limited to perceptual processes supporting scene and object recognition, evident along occipitotemporal pathways, or may include higher-order processes reliant on frontal and parietal cortices. Item-specific estimates also facilitate new opportunities for linking encoding-retrieval overlap to the function of the hippocampus. One possibility is that the hippocampus mediates the link between neural similarity and behavioral expressions of memory. The relation between the hippocampus and cortical reactivation is central to neurocomputational models of memory, which predict enhanced hippocampal-neocortical coupling during successful memory retrieval (Sutherland and McNaughton 2000; Wiltgen et al. 2004; O'Neill et al. 2010). Although these ideas have been supported by neurophysiological data (Pennartz et al. 2004; Ji and Wilson 2007), it has been a challenge to test this hypothesis in humans. To this end, the present approach newly enables analysis of how the hippocampus mediates the relationship between encoding-retrieval pattern similarity and memory retrieval on a trial-to-trial basis. Finally, the neural similarity between encoding and retrieval should be sensitive to experimental manipulations that modulate perception and hippocampal memory function. There is extensive evidence that emotional arousal increases the strength and vividness of declarative memories, thought to arise from its influence on encoding and consolidation processes (LaBar and Cabeza 2006; Kensinger 2009). Although the neural effects of emotion on encoding (Murty et al.
2010) and retrieval (Buchanan 2007) phases have been characterized separately, they have seldom been directly compared (but see Hofstetter et al. 2012). Emotional stimuli elicit superior perceptual processing during encoding (Dolan and Vuilleumier 2003), which may result in perceptually rich memory traces that can be effectively

recaptured during item recognition. Furthermore, arousal-related noradrenergic and glucocorticoid responses modulate memory consolidation processes (McGaugh 2004; LaBar and Cabeza 2006), which are thought to rely on hippocampal-cortical interactions similar to those supporting memory reactivation during retrieval (Sutherland and McNaughton 2000; Dupret et al. 2010; O'Neill et al. 2010; Carr et al. 2011). One possibility is that the influence of emotion on consolidation is paralleled by changes in encoding-retrieval similarity during retrieval. We test the novel hypothesis that emotion, through its influence on perceptual encoding and/or hippocampal-cortical interactions, may be associated with increased pattern similarity during retrieval. Here, we use event-related functional magnetic resonance imaging (fMRI), in combination with multivariate pattern similarity analysis, to evaluate the neural similarity between individual scenes at encoding and retrieval. Across several regions of interest, we calculated the neural pattern similarity between encoding and retrieval trials, matching individual trials to their identical counterparts (item-level pairs) or to other stimuli drawn from the same emotional valence, encoding condition, and memory status (set-level pairs; Figure 1). To assess evidence for item-specific fluctuations in neural similarity, we specifically sought regions showing a difference between remembered and forgotten items that were augmented for item-level pairs relative to set-level pairs. Given the recognition design employed for retrieval, we anticipated that evidence for item-specific similarity would be especially pronounced in regions typically associated with visual processing. These techniques yielded estimates of neural pattern similarity for each individual trial, newly enabling a direct test of the hypothesis that the hippocampus mediates the relationship between cortical similarity and behavioral expressions of memory. 
Finally, we tested the novel prediction that emotion enhances cortical similarity by comparing these effects for emotionally arousing versus neutral scenes.

Methods

Participants

Twenty-one participants completed the experiment. Two participants were excluded from analysis: one due to excessive motion and one due to image artifacts affecting the retrieval scans. This resulted in 19 participants (9 female), ranging in age from 18 to 29 (M = 23.3, s.d. = 3.1). Participants were healthy, right-handed, native English speakers, with no disclosed history of neurological or psychiatric episodes. Participants gave written informed consent for a protocol approved by the Duke University Institutional Review Board.

Experimental Design

Participants were scanned during separate memory encoding and recognition sessions, scheduled 2 days apart. During the first session, they viewed 420 complex visual scenes for 2 seconds each. Following each scene, they made an emotional arousal rating on a 4-point scale and answered a question related to the semantic or perceptual features of the image. During the second session, participants saw all of the old scenes randomly intermixed with 210 new scenes for 3 seconds each. For each trial, they rated whether the image was old or new on a 5-point scale, with response options for “definitely new,” “probably new,” “probably old,” “definitely old,” and “recollected.” The 5th response option referred to those instances in which they were able to recall a specific detail from when they had seen that image before. For all analyses, memory success was assessed by collapsing the 4th and 5th responses, defining those items as “remembered,” and comparing them to the other responses, referred to here as “forgotten.” This division ensured that sufficient numbers of trials (i.e., more than 15) were included as remembered and forgotten. In both sessions, trials were separated by a jittered fixation interval, exponentially distributed with a mean of 2 s. The stimuli consisted of a heterogeneous set of complex visual scenes drawn from the International Affective Picture System (Lang et al. 2001) as well as in-house sources.
They included 210 emotionally negative images (low in valence and high in arousal, based on normative ratings), 210 emotionally positive images (high in valence and high in arousal), and 210 neutral images (midlevel in

valence and low in arousal). The influence of arousal was assessed by comparing both negative and positive images to neutral. Additional details about the encoding design and stimulus set are described in Ritchey et al. (2011), which reports subsequent memory analyses of the encoding data alone.

FMRI Acquisition & Pre-Processing

Images were collected using a 4T GE scanner, with separate sessions for encoding and retrieval. Stimuli were presented using liquid crystal display goggles, and behavioral responses were recorded using a four-button fiber optic response box. Scanner noise was reduced with earplugs and head motion was minimized using foam pads and a headband. Anatomical scanning started with a T2-weighted sagittal localizer series. The anterior (AC) and posterior commissures (PC) were identified in the midsagittal slice, and 34 contiguous oblique slices were prescribed parallel to the AC-PC plane. Functional images were acquired using an inverse spiral sequence with a 2-sec TR, a 31-msec TE, a 24-cm FOV, a 64 × 64 matrix, and a 60° flip angle. Slice thickness was 3.8 mm, resulting in 3.75 x 3.75 x 3.8 mm voxels. Preprocessing and data analyses were performed using SPM5 software implemented in Matlab (www.fil.ion.ucl.ac.uk/spm). After discarding the first 6 volumes, the functional images were slice-timing corrected, motion-corrected, and spatially normalized to the Montreal Neurological Institute (MNI) template. Functional data from the retrieval session were aligned with data from the encoding session, and normalization parameters were derived from the first functional image of the encoding session. Data were spatially smoothed with an 8-mm isotropic Gaussian kernel for univariate analyses and left unsmoothed for multivariate pattern analyses.

FMRI Analysis

General Linear Model. All multivariate pattern analyses were based on a general linear model that estimated individual trials.
General linear models with regressors for each individual trial were estimated separately for encoding and retrieval, yielding one model with a beta image corresponding to

each encoding trial and another model with a beta image corresponding to each retrieval trial. The post-image encoding ratings were also modeled and combined into a single regressor; thus, all analyses reflect the picture presentation time only. Regressors indexing head motion were also included in the model. The validity of modeling individual trials has been demonstrated previously (Rissman et al. 2004), and the comparability of these models to more traditional methods was likewise confirmed within this dataset. Additional general linear models were estimated for univariate analysis of the retrieval phase, which were used to define functional regions of interest (ROIs) for the mediation analyses (described below). The models were similar to that described above, but collapsed the individual trials into separate regressors for each emotion type, as in standard analysis approaches. These models also included parametric regressors indexing the 5-point subsequent memory response for old items, as well as a single regressor for all new items. Contrasts corresponding to the effects of memory success, emotion, and the emotion by memory interaction were generated for each individual. Univariate analysis procedures and results are described in greater detail in the Supplementary Materials.

Multivariate encoding-retrieval similarity analysis. A series of 31 bilateral anatomical ROIs were generated from the AAL system (Tzourio-Mazoyer et al. 2002) implemented in WFU Pickatlas (Maldjian et al. 2003). We expected that encoding-retrieval similarity effects could arise not only from the reactivation of perceptual processes, likely to be observed in occipital and temporal cortices, but also from higher-order cognitive processes involving the lateral frontal and parietal cortices. Thus, the ROIs included all regions within the occipital, temporal, lateral prefrontal, and parietal cortices (Table 1).
Regions within the medial temporal lobe were also included due to their known roles in memory and emotional processing. Two of these regions (olfactory cortex and Heschl’s gyrus) were chosen as primary sensory regions that are unlikely to be responsive to experimental factors, and thus they serve as conceptual controls. Bilateral ROIs were chosen to limit the number of comparisons while observing effects across a broad set of regions, and reported effects were corrected for multiple comparisons.
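As a minimal illustration of the kind of multiple-comparisons correction applied across these ROIs, the sketch below implements a simple Bonferroni criterion. The function name and p-values are illustrative, not taken from the paper.

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Flag p-values that survive Bonferroni correction for len(p_values) tests."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

# Hypothetical uncorrected p-values for three ROIs; with 31 ROIs, as in this
# study, the per-test threshold would be 0.05 / 31, roughly 0.0016.
print(bonferroni_significant([0.001, 0.02, 0.2]))  # → [True, False, False]
```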

The full pattern of voxel data from a given ROI was extracted from each beta image corresponding to each trial, yielding a vector for each trial Ei at encoding and each trial Ri at retrieval (Figure 1). Each of the vectors was normalized with respect to the mean activity and standard deviation within each trial, thus isolating the relative pattern of activity for each trial and ROI. Euclidean distance was computed for all possible pair-wise combinations of encoding and retrieval target trials, resulting in a single distance value for each EiRj pair. This distance value measures the degree of dissimilarity between activity patterns at encoding and those at retrieval—literally the numerical distance between the patterns plotted in n-dimensional space, where n refers to the number of included voxels (Kriegeskorte 2008). An alternative metric for pattern similarity is the Pearson’s correlation coefficient, which is highly anticorrelated with Euclidean distance; not surprisingly, analyses using this measure replicate the findings reported here. For ease of understanding, figures report the inverse of Euclidean distance as a similarity metric; regression analyses likewise flip the sign of distance metric z-scores to reflect similarity. Pairwise distances were summarized according to whether or not the encoding-retrieval pair corresponded to an item-level match between encoding and retrieval (i.e., trial Ei refers to the same picture as trial Ri). In comparison, set-level pairs were matched on the basis of emotion type, encoding task, and memory status, but excluded these item-level pairs. These experimental factors were controlled in the set-level pairs to mitigate the influence of task engagement or success. Any differences observed between the item- and set-level pair distances can be attributed to reactivation of memories or processes specific to the individual stimulus. 
On the other hand, any memory effects observed across both the item- and set-level pair distances may reflect processes general to encoding and recognizing scene stimuli. To assess neural similarity, mean distances for the item- and set-level match pairs were entered into a three-way repeated-measures ANOVA with match level (item-level, set-level), memory status (remembered, forgotten), and emotion (negative, neutral, positive) as factors. We first considered the effects of match and memory irrespective of emotion. Reported results were significant at a one-tailed threshold of p < .05.
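The normalization and distance computations described above can be sketched as follows. This is a simplified illustration with simulated data, not the authors' actual pipeline; the trial counts, voxel counts, and variable names are all made up.

```python
import numpy as np

def zscore(pattern):
    """Normalize a trial's voxel pattern to mean 0, SD 1, isolating its relative shape."""
    return (pattern - pattern.mean()) / pattern.std()

def er_distance(enc_beta, ret_beta):
    """Euclidean distance between normalized encoding and retrieval patterns."""
    return np.linalg.norm(zscore(enc_beta) - zscore(ret_beta))

rng = np.random.default_rng(0)
n_trials, n_voxels = 8, 50
enc = rng.normal(size=(n_trials, n_voxels))        # simulated encoding betas for one ROI
ret = enc + rng.normal(scale=0.5, size=enc.shape)  # retrieval patterns that partly recapitulate encoding

# Item-level pairs: the same picture at encoding and retrieval (E_i, R_i).
item_d = np.array([er_distance(enc[i], ret[i]) for i in range(n_trials)])

# Set-level pairs: different pictures matched on condition (E_i, R_j, i != j).
set_d = np.array([er_distance(enc[i], ret[j])
                  for i in range(n_trials) for j in range(n_trials) if i != j])

# Item-specific reactivation appears as smaller distances (greater similarity)
# for item-level than for set-level pairs.
print(item_d.mean() < set_d.mean())  # → True in this toy construction
```

For patterns z-scored across voxels, squared Euclidean distance equals 2n(1 − r), where r is the Pearson correlation across the n voxels, which is why the two metrics are highly anticorrelated and lead to the same conclusions.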

Mediation analysis showed that amygdala activity did not mediate the relationship between encoding-retrieval similarity and memory, or vice versa (Table 3), indicating that the hippocampus remains the primary link between cortical similarity and memory. Altogether, these findings suggest that emotional arousal is associated with heightened encoding-retrieval similarity, particularly in posterior cortical regions, and recapitulation of negative information tends to engage the amygdala during successful retrieval.

Discussion

These results provide novel evidence that successful memory is associated with superior match between encoding and retrieval at the level of individual items. Furthermore, hippocampal involvement during retrieval mediates this relationship on a trial-to-trial basis, lending empirical support to theories positing dynamic interactions between the hippocampus and neocortex during retrieval. These findings generalize to memories associated with emotional arousal, which magnifies the similarity between encoding and retrieval patterns.

Neural evidence for encoding-retrieval similarity

Prior reports have highlighted the reactivation of set-level information from the initial learning episode during memory retrieval, marked by spatial overlap between task-related hemodynamic responses at each phase (reviewed by Rugg et al. 2008; Danker and Anderson 2010). Multivariate pattern analysis (MVPA) methods have advanced these findings by demonstrating that neural network classifiers, trained to associate experimental conditions with hemodynamic patterns from encoding, can generate predictions about which type of information is reactivated at retrieval (reviewed by Rissman and Wagner 2012), including patterns associated with the memorandum’s category (Polyn et al. 2005), its encoding task (Johnson et al. 2009), or its paired associate (Kuhl et al. 2011).
Importantly, despite these methodological improvements, evidence thus far has been limited to broad set-level distinctions tied to the encoding


manipulation. Overlaps between encoding and retrieval will most benefit discrimination between old and new items when they carry information specific to individual stimuli. The present study advances this line of research by providing essential evidence that the similarity between encoding and retrieval operations can be tracked at the level of individual items. For some regions, especially in occipitotemporal cortices, encoding-retrieval similarity is most affected by memory success when trials are matched at the level of individual items. The presence of this interaction provides direct evidence for the theory of encoding-retrieval match, in that increased similarity is associated with superior memory performance. Item similarity may reflect cognitive and perceptual operations particular to the individual stimulus that occur during both encoding and retrieval or the reactivation of processes or information associated with initial encoding. Both forms of information may serve to induce brain states at retrieval that more closely resemble the brain states of their encoding counterparts. Unlike previously reported results, these measures are not constrained by category or task and are therefore ideal for flexibly capturing instances of encoding-retrieval match tailored to each individual person and trial. In many regions spanning frontal, parietal, and occipital cortices, encoding-retrieval similarity predicted memory success when mean activation did not, suggesting that in these regions, pattern effects may be more informative than univariate estimates of activation. Recent investigations using pattern similarity analysis have related variation across encoding trials to memory success, exploiting the ability of similarity measures to tap into the representational structure underlying neural activity (Kriegeskorte 2008). Jenkins and Ranganath (2010) used pattern distance to link contextual shifts during encoding to successful memory formation.
Another set of experiments demonstrated that increased similarity between encoding repetitions, calculated separately for each individual stimulus, predicts memory success, suggesting that memories benefit from consistency across learning trials (Xue et al. 2010). In contrast, the present study focuses on the similarities between processing the same mnemonic stimuli at two separate phases of memory—namely,


the overlap between encoding and explicit recognition processes. Because a different task is required at each phase, the similarity between encoding and retrieval trials likely arises from either perceptual attributes, which are common between remembered and forgotten trials, or from information that aids or arises from mnemonic recovery, which differs between remembered and forgotten trials. Thus, any differences between remembered and forgotten items in the item-level pairs should reflect item-specific processes that benefit memory or the reactivation of encoding-related information during the recognition period.

Hippocampal-cortical interactions during successful memory retrieval

Because pattern similarity measures yield item-specific estimates of encoding-retrieval overlap, they offer new opportunities to investigate the role of the hippocampus in supporting neocortical memory representations. Novel analyses relating trial-by-trial fluctuations in cortical pattern similarity to hippocampal retrieval responses revealed that the hippocampus partially mediates the link between encoding-retrieval similarity and retrieval success across a number of neocortical ROIs. Many theories of memory have posited dynamic interactions between the hippocampus and neocortex during episodic retrieval (Alvarez and Squire 1994; McClelland et al. 1995; Sutherland and McNaughton 2000; Norman and O'Reilly 2003), such that hippocampal activation may be triggered by the overlap between encoding and retrieval representations or may itself promote neocortical pattern completion processes. The mediation models tested here suggest some combination of both mechanisms. However, evidence for the hippocampal mediation account tended to be stronger and distributed across more regions, possibly stemming from our use of a recognition design.
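The path logic behind these mediation models (paths a, b, c, and c′, with the indirect effect a*b) can be sketched with simulated data. This deliberately simplified linear version is only illustrative: the paper's trial-level analyses involve binary memory outcomes and formal significance testing, and all variable names and effect sizes below are made up.

```python
import numpy as np

def ols_slope(x, y, control=None):
    """Slope of x from an OLS regression of y on an intercept, x, and an optional control."""
    cols = [np.ones_like(x), x]
    if control is not None:
        cols.append(control)
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return coef[1]

rng = np.random.default_rng(1)
n = 500
similarity = rng.normal(size=n)                               # X: encoding-retrieval similarity
hippo = 0.6 * similarity + rng.normal(size=n)                 # M: hippocampal retrieval response
memory = 0.5 * hippo + 0.2 * similarity + rng.normal(size=n)  # Y: memory strength

a = ols_slope(similarity, hippo)                        # path a: X -> M
b = ols_slope(hippo, memory, control=similarity)        # path b: M -> Y, controlling X
c = ols_slope(similarity, memory)                       # path c: total effect of X on Y
c_prime = ols_slope(similarity, memory, control=hippo)  # path c': direct effect, controlling M

indirect = a * b  # the mediation term reported per region in Table 3
# For OLS with a single mediator, a*b equals c - c' exactly.
print(np.isclose(indirect, c - c_prime))  # → True
```

Partial mediation corresponds to the case sketched here: the indirect effect a*b is nonzero, but the direct path c′ remains nonzero as well.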
Interestingly, in our mediation models hippocampal activation did not fully account for the link between encoding-retrieval similarity and memory, implying that both hippocampal and neocortical representations are related to memory retrieval (McClelland et al. 1995; Wiltgen et al. 2004).

Implications of emotional modulation of cortical pattern similarity


Emotion is known to modulate factors relevant to encoding-retrieval similarity, such as perceptual encoding (Dolan and Vuilleumier 2003) and memory consolidation (McGaugh 2004), in addition to enhancing hippocampal responses during both encoding (Murty et al. 2010) and retrieval (Buchanan 2007). However, its influence on encoding-retrieval match has never before been investigated. Here we present novel evidence that, across all trials, emotion heightens the similarity between encoding and retrieval patterns in several regions in the occipital, temporal, and parietal cortices. The relationship between encoding-retrieval similarity and memory was consistent across levels of emotion, suggesting that memory-related pattern similarity is affected by emotion in an additive rather than interactive way. One possible explanation for these findings is that emotion might generally increase the likelihood or strength of hippocampal-cortical interactions that contribute to both consolidation and retrieval (Carr et al. 2011). Another possible explanation is that emotion might facilitate perceptual processing during both encoding and retrieval, thus resulting in heightened similarity between encoding and retrieval of emotional items, especially among regions devoted to visual perception. Consistent with the latter interpretation, the amygdala appears to modulate emotional scene processing in extrastriate regions including middle occipital gyrus but not in early visual areas like the calcarine sulcus (Sabatinelli et al. 2009). Functional connectivity analyses revealed that neural similarity in occipital cortex correlated with amygdala activity during successfully retrieved negative trials. Negative memories are thought to be particularly sensitive to perceptual information (reviewed by Kensinger 2009), and enhanced memory for negative visual details has been tied to the amygdala and its functional connectivity with late visual regions (Kensinger et al. 2007).
Increased reliance on perceptual information, along with observed memory advantages for negative relative to positive stimuli, may account for these valence-specific effects. Unlike the hippocampus, however, the amygdala did not mediate the link between similarity and memory, suggesting that amygdala responses during rapid scene recognition may be driven by the recovery and processing of item-specific details. These observations stand in contrast to results obtained

Table 3. Mediation analysis results

E-R similarity (X) to memory (Y), mediated by left HC activity at retrieval (M)

| Region | X->M (a) | M->Y (b) | X->Y (c') | X->Y (c) | Mediation term (a*b) | Mean X-M corr., rem | Mean X-M corr., forg |
| Mid Occipital | .104* | .115* | .091* | .102* | .009* | .132* | .069* |
| Sup Occipital | .089* | .119* | .064* | .073* | .008* | .100* | .084* |
| Inf Temporal | .049 | .120* | .054* | .058* | .004 | .051 | .050 |
| Mid Temporal | .022 | .122* | .073* | .074* | .002 | .031 | .005 |
| Inf Frontal Opercularis | .087* | .117* | .061* | .070* | .007* | .099* | .065* |
| Inf Frontal Triangularis | .090* | .114* | .077* | .086* | .007* | .111* | .064* |
| Inf Parietal | .040* | .122* | .082* | .085* | .004* | .047 | .011 |
| Supramarginal | -.005 | .123* | .070* | .069* | -.010 | .003 | -.033 |

Left HC activity at retrieval (X) to memory (Y), mediated by E-R similarity (M)

| Region | X->M (a) | M->Y (b) | X->Y (c') | X->Y (c) | Mediation term (a*b) | Mean X-M corr., rem | Mean X-M corr., forg |
| Mid Occipital | .110* | .091* | .115* | .124* | .005* | - | - |
| Sup Occipital | .093* | .064* | .119* | .124* | .003* | - | - |
| Inf Temporal | .051 | .054* | .120* | .123* | .000 | - | - |
| Mid Temporal | .023 | .073* | .122* | .123* | .000 | - | - |
| Inf Frontal Opercularis | .091* | .061* | .117* | .124* | .003 | - | - |
| Inf Frontal Triangularis | .095* | .077* | .114* | .123* | .005 | - | - |
| Inf Parietal | .040* | .082* | .122* | .123* | .001 | - | - |
| Supramarginal | -.005 | .070* | .123* | .123* | .000 | - | - |

E-R similarity (X) to memory (Y), mediated by right amygdala activity at retrieval (M)

| Region | X->M (a) | M->Y (b) | X->Y (c') | X->Y (c) | Mediation term (a*b) | Mean X-M corr., rem | Mean X-M corr., forg |
| Mid Occipital | .044 | .036* | .103* | .103* | .000 | .070 | .046 |

Right amygdala activity at retrieval (X) to memory (Y), mediated by E-R similarity (M)

| Region | X->M (a) | M->Y (b) | X->Y (c') | X->Y (c) | Mediation term (a*b) | Mean X-M corr., rem | Mean X-M corr., forg |
| Mid Occipital | .046 | .103* | .036* | .039* | .002 | - | - |

Note: Asterisks denote effects significant at the Bonferroni-corrected threshold. Inf = Inferior, Mid = Middle, Sup = Superior, rem = remembered, forg = forgotten, E-R = encoding-retrieval, HC = hippocampus.


Table 4. Regions showing emotional modulation of encoding-retrieval similarity (Match x Memory x Emotion ANOVA)

| Effect | Region | F(2, 36) | p |
| Main effect of Emotion | Mid Occipital | 19.45 | 0.000* |
| | Inf Temporal | 17.61 | 0.000* |
| | Inf Occipital | 16.46 | 0.000* |
| | Angular | 15.38 | 0.000* |
| | Inf Parietal | 12.43 | 0.000* |
| | Supramarginal | 9.05 | 0.000* |
| | Mid Temporal | 8.29 | 0.001* |
| | Sup Occipital | 6.51 | 0.002* |
| | Sup Temporal | 5.11 | 0.006 |
| | Sup Parietal | 4.95 | 0.006 |
| | Inf Frontal Opercularis | 4.92 | 0.006 |
| | Fusiform | 4.26 | 0.011 |
| | Sup Frontal | 4.13 | 0.012 |
| | Inf Frontal Triangularis | 3.88 | 0.015 |
| Emotion x Match Interaction | Mid Occipital | 6.31 | 0.002* |
| | Fusiform | 3.41 | 0.022 |
| Emotion x Memory Interaction | Inf Frontal Opercularis | 3.85 | 0.015 |
| Emotion x Match x Memory Interaction | Fusiform | 3.52 | 0.020 |

Note: Asterisks denote regions significant at Bonferroni-corrected threshold. Regions showing marginal effects (p < .025) are also shown. Main effects and interactions not involving emotion are presented separately in Table 1. Inf = Inferior, Mid = Middle, Sup = Superior.
Supplementary Methods: Univariate Analyses

Univariate fMRI analyses were based on general linear models implemented in SPM5 (www.fil.ion.ucl.ac.uk/spm). For each subject, evoked hemodynamic responses to event types were modeled with a stick function corresponding to stimulus presentation convolved with a canonical hemodynamic response function. Encoding and retrieval phases were modeled separately. Both models included six main event types, representing all possible combinations of emotion (negative, neutral, positive) and encoding task (deep, shallow). In the encoding model, the post-image encoding ratings were also modeled, combined into a single regressor; thus, all analyses reflect the picture presentation time only. In the retrieval model, new items were also modeled with a single regressor. Confounding factors (head motion, magnetic field drift) were included in each model. In each model, linear parametric regressors indexing the 5-point memory response (1 = definitely new, 2 = probably new, 3 = probably old, 4 = definitely old, and 5 = recollected) were included for each of the 6 main event types. These regressors were used to identify regions whose activity increased as a function of recognition response; since the main event types included only studied items, this can be taken as encoding success (ES) or retrieval success (RS) activity. Estimates for the ES and RS regressors were generated for each participant, collapsing across deep and shallow encoding tasks, and then entered into a group-level, repeated-measures ANOVA with factors for memory phase (encoding, retrieval) and emotion (negative, neutral, positive). The main effects of memory success (all ES and RS regressors versus implicit baseline), phase, and emotion were evaluated, as well as the interaction of emotion and phase.
The main effect of memory success was exclusively masked with all other effects at p < .05 to ensure that it was not qualified by higher-order effects; likewise, the main effects of phase and emotion were exclusively masked with their interaction. Whole-brain results are reported at p < .001 with a cluster extent of 10 voxels (Table S1). For the main effect of emotion, results within the amygdala are reported at p < .005, with an extent of 5 voxels.
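The modeling approach described above can be illustrated with a minimal sketch. This is not the authors' SPM5 pipeline; the function names, HRF parameters, and variables below are simplified assumptions chosen for illustration. It builds an onset ("stick") regressor convolved with a double-gamma canonical HRF, plus a parametric modulator carrying the mean-centered 5-point memory response:

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(tr, duration=32.0):
    """Double-gamma canonical HRF sampled at the scan TR (SPM-like parameters)."""
    t = np.arange(0, duration, tr)
    peak = gamma.pdf(t, 6)           # positive response peaking around 5-6 s
    undershoot = gamma.pdf(t, 16)    # late post-stimulus undershoot
    hrf = peak - undershoot / 6.0
    return hrf / hrf.sum()

def build_regressors(onsets, ratings, n_scans, tr):
    """Stick function at stimulus onsets, convolved with the HRF, plus a
    parametric modulator weighting each stick by the centered memory rating."""
    sticks = np.zeros(n_scans)
    modulated = np.zeros(n_scans)
    centered = np.asarray(ratings, float) - np.mean(ratings)  # mean-center
    for onset, weight in zip(onsets, centered):
        scan = int(round(onset / tr))
        sticks[scan] += 1.0
        modulated[scan] += weight
    hrf = canonical_hrf(tr)
    main = np.convolve(sticks, hrf)[:n_scans]
    parametric = np.convolve(modulated, hrf)[:n_scans]
    return main, parametric
```

In an actual analysis, one such regressor pair would be created per event type and entered into the design matrix alongside the confound regressors; the parametric estimate then indexes memory-success (ES or RS) activity.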

Supplementary Results: Univariate Analyses

Consistent with previous findings that ES and RS are supported by overlapping networks of regions (Spaniol et al. 2009), the main effect of memory success yielded a broad set of regions, including clusters spanning the medial temporal lobes, amygdala, ventral visual stream, and bilateral inferior frontal gyrus (Table S1). A number of regions also showed a negative relationship with memory success (denoted in the direction column of Table S1), including precuneus, lateral parietal, and dorsolateral prefrontal cortices. Because this effect was exclusively masked with the other main effects and interactions, memory-related activity in these regions was not significantly affected by emotion or phase.

Several regions exhibited a main effect of phase, including medial prefrontal cortex, posterior midline regions, and lateral parietal cortex; these regions were associated with memory success during retrieval more so than during encoding. These regions, particularly the posterior midline, show reliable reversals in their activation effects during encoding versus retrieval (Daselaar et al. 2009; Huijbers et al. 2011), perhaps due to their role in orienting to mnemonic information that aids retrieval but disrupts encoding. The present results extend those findings by showing that these flips are relatively stable across negative, neutral, and positive emotion types. Interestingly, although the overall magnitude of activation in these regions predicted memory differently during encoding versus retrieval, the multivariate pattern results suggest a more complex story. In some of these ROIs, including the supramarginal gyrus, inferior parietal cortex, and precuneus, the local patterns were relatively similar between encoding and retrieval trials, and this similarity also predicted memory. Thus, although overall activation effects appeared rather different between encoding and retrieval in these regions, multivariate patterns at the individual trial level suggest a stability of representation across both phases that predicted memory.

Not surprisingly, the amygdala showed a main effect of emotion, in that activation in this region (along with the middle temporal gyrus) was stronger for negative and positive relative to neutral items. This effect did not interact with phase, indicating that the amygdala is associated with both ES and RS. Although similar findings have been reported separately for encoding (Murty et al. 2010) and retrieval (Buchanan 2007), evidence for consistent amygdala participation across both phases has been sparse. This finding verifies that the amygdala promotes memory success during both encoding and retrieval and that the loci of these effects are overlapping. In fact, in the present study, no regions showed a significant interaction of emotion and phase, even at the more liberal threshold of p < .005, suggesting that emotion effects on memory are relatively stable from encoding to retrieval.

Supplementary Results: "Recollected" versus "Definitely old"

To evaluate whether item-level similarity varied by whether participants gave a "recollected" or "definitely old" response, mean distances for the item-level pairs were entered into a two-way repeated-measures ANOVA with memory response ("recollected", "definitely old") and emotion type (negative, neutral, positive) as factors. After correction for multiple comparisons, no regions showed a significant main effect of memory or a memory by emotion type interaction, ps > .01, one-tailed. However, two regions did show a marginal main effect of memory: the hippocampus, F(1, 17) = 7.50, p = .007, and the middle occipital gyrus, F(1, 17) = 7.39, p = .007. In both regions, item-level similarity was greater for items receiving a "recollected" response than for those receiving a "definitely old" response. The absence of significant differences in most regions tested here is consistent with prior work reporting that cortical reinstatement occurs equivalently for both kinds of memory responses in frontal and temporal cortical regions (Johnson et al. 2009). Although it should be treated with caution, the hippocampal finding is compatible with the proposed role of the hippocampus in recollection-based memory responses (Eichenbaum et al. 2007). Recollection involves an increased sense of "re-experiencing" the initial learning episode, which might be manifest as increased similarity between neural representations in the hippocampus. Future studies, especially those using high-resolution fMRI, should clarify whether and when similarity effects can be reliably observed in the hippocampus.
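The item-level comparison described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' analysis code; the synthetic data layout and function names are assumptions. For each item, similarity is taken as the Pearson correlation between its encoding and retrieval voxel patterns within an ROI, and mean similarity is then contrasted across response categories:

```python
import numpy as np
from scipy.stats import pearsonr

def item_similarity(enc_patterns, ret_patterns):
    """Pearson correlation between each item's encoding and retrieval
    voxel patterns (rows = items, columns = voxels within one ROI)."""
    return np.array([pearsonr(e, r)[0]
                     for e, r in zip(enc_patterns, ret_patterns)])

def response_contrast(sim, responses):
    """Mean similarity for 'recollected' minus 'definitely old' items."""
    sim = np.asarray(sim)
    labels = np.array(responses)
    return sim[labels == "recollected"].mean() - sim[labels == "definitely old"].mean()
```

In the actual analysis these per-item similarities (expressed as distances) were averaged per condition and entered into the repeated-measures ANOVA described above; a positive contrast in a region such as the hippocampus corresponds to greater encoding-retrieval similarity for "recollected" items.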

Supplementary Table 1: Univariate ANOVA Results

(Columns in the original table: region, hemisphere [hem], cluster extent in voxels, Talairach coordinates x/y/z, t, direction.)

Main effect of condition (memory success): clusters included the amygdala/hippocampus, parahippocampal gyrus, fusiform gyrus, superior frontal gyrus, middle frontal gyrus, medial frontal gyrus, precuneus, inferior frontal gyrus, supramarginal gyrus, midcingulate gyrus, cerebellum, posterior cingulate, middle temporal gyrus, middle occipital gyrus, and superior temporal gyrus; peaks showed both positive (+) and negative (-) relationships with memory.

Main effect of phase: clusters included the cuneus, posterior cingulate, inferior parietal lobule, posterior insula, middle frontal gyrus, medial frontal gyrus, anterior cingulate, precentral gyrus, middle temporal gyrus, precuneus, and anterior insula; most peaks showed retrieval > encoding (R > E), with a minority showing encoding > retrieval (E > R).

Main effect of emotion (emo > neu): middle temporal gyrus (L, 23 voxels; -48, -69, 10; t = 15.88) and amygdala (R, 8 voxels; 33, -1, -22; t = 6.91).

Note: Up to 3 peaks spaced at least 8 mm apart are reported for each cluster. Coordinates are in Talairach space. In the direction column, + denotes a positive relationship with memory, - denotes a negative relationship with memory. The interaction of phase and emotion was null. Hem = hemisphere, E = encoding, R = retrieval, emo = emotional, neu = neutral.

Supplementary Table 2: Distribution of memory responses by emotion type, mean (SD)

                      old items (out of 140)                  new items (out of 70)
                   negative    positive    neutral      negative    positive    neutral
"definitely new"   0.05 (.07)  0.07 (.09)  0.09 (.14)   0.43 (.17)  0.40 (.17)  0.41 (.21)
"probably new"     0.15 (.11)  0.16 (.08)  0.21 (.12)   0.44 (.18)  0.40 (.13)  0.46 (.16)
"probably old"     0.21 (.09)  0.22 (.09)  0.23 (.11)   0.10 (.06)  0.15 (.08)  0.10 (.07)
"definitely old"   0.28 (.16)  0.28 (.15)  0.28 (.17)   0.02 (.02)  0.03 (.03)  0.02 (.02)
"recollected"      0.31 (.23)  0.26 (.19)  0.20 (.20)   0.01 (.01)  0.01 (.02)  0.01 (.01)

Supplementary Table 3: Post-hoc subregion analysis for ROIs displaying significant match by memory interactions

                                Match              Memory             Match x Memory
Region          Subdivision     F        p         F       p          F       p
Mid Occipital   R post, inf     62.36    < .001*   16.35   .001*      0.04    .836
                R post, sup     19.92    < .001*   22.82   < .001*    0.01    .920
                R ant, inf      37.39    < .001*   7.08    .016       0.16    .692
                R ant, sup      49.33    < .001*   19.43   < .001*    5.12    .036
                L post, inf     106.51   < .001*   8.17    .010       3.89    .064
                L post, sup     41.18    < .001*   9.14    .007       0.13    .721
                L ant, inf      118.43   < .001*   40.04   < .001*    7.83    .012
                L ant, sup      21.92    < .001*   15.68   .001*      13.04   .002*
Mid Temporal    R post, inf     2.54     .128      3.57    .075       2.00    .175
                R post, sup     59.90    < .001*   12.90   .002*      11.78   .003*
                R ant, inf      7.89     .012      3.71    .070       1.76    .201
                R ant, sup      5.09     .037      3.73    .069       1.13    .301
                L post, inf     1.05     .319      0.04    .843       0.03    .870
                L post, sup     36.23    < .001*   11.12   .004*      11.11   .004*
                L ant, inf      0.01     .921      12.32   .002*      2.13    .162
                L ant, sup      0.46     .505      7.02    .016       3.44    .080
Supramarginal   R post, inf     2.04     .170      3.10    .095       6.89    .017
                R post, sup     0.66     .427      3.70    .070       1.07    .315
                R ant, inf      4.98     .039      2.76    .114       4.86    .041
                R ant, sup      1.19     .289      8.61    .009       5.71    .028
                L post, inf     6.07     .024      13.17   .002*      8.67    .009
                L post, sup     2.04     .170      12.71   .002*      6.70    .019
                L ant, inf      1.44     .246      4.33    .052       2.03    .171
                L ant, sup      8.12     .011      3.52    .077       0.43    .520

Note. Effects that remain significant after correction for multiple comparisons are denoted with an asterisk. R = right, L = left, ant = anterior, post = posterior, sup = superior, inf = inferior.

Supplementary References

Buchanan T. 2007. Retrieval of emotional memories. Psychological Bulletin. 133:761-779.

Daselaar SM, Prince SE, Dennis NA, Hayes SM, Kim H, Cabeza R. 2009. Posterior midline and ventral parietal activity is associated with retrieval success and encoding failure. Front Hum Neurosci. 3:13.

Eichenbaum H, Yonelinas AP, Ranganath C. 2007. The medial temporal lobe and recognition memory. Annual Review of Neuroscience. 30:123-152.

Huijbers W, Pennartz CMA, Cabeza R, Daselaar SM. 2011. The hippocampus is coupled with the default network during memory retrieval but not during memory encoding. PLoS ONE. 6:e17463.

Johnson J, McDuff S, Rugg M, Norman K. 2009. Recollection, familiarity, and cortical reinstatement: a multivoxel pattern analysis. Neuron. 63:697-708.

Murty V, Ritchey M, Adcock RA, LaBar KS. 2010. fMRI studies of successful emotional memory encoding: a quantitative meta-analysis. Neuropsychologia. 48:3459-3469.

Spaniol J, Davidson PS, Kim AS, Han H, Moscovitch M, Grady CL. 2009. Event-related fMRI studies of episodic encoding and retrieval: meta-analyses using activation likelihood estimation. Neuropsychologia. 47:1765-1779.