The Missing Pieces of the Puzzle: A Review on the Interactive Nature of A-Priori Expectancies and Attention Bias toward Threat

The role of attention bias in the etiology and maintenance of anxiety disorders has been studied extensively over decades. Attention bias reflects maladaptation in cognitive processing, as perceived threatening stimuli receive prioritized processing even when they are task-irrelevant or factually unthreatening. Recently, there has been growing interest in the role of a-priori expectancies in attention bias toward threat. The current review article presents recent studies as examples that emphasize the need for more comprehensive research on the interactive effects of various factors that affect the relationship between expectancies and attention bias toward threatening stimuli in anxiety. The current review article suggests a holistic view, which advocates for more integrative research, as a dynamic network could underlie changes in attention bias. The study of the interaction between such factors, with a focus on expectancy, can lead to more ecologically valid and clinically relevant results, and thus to more informed and fine-tuned treatments that are based on manipulation of expectancies. Such methods, in turn, can also shed light on attention bias research, in a mutual relationship between research and therapy.


Attention Bias toward Threat: Manifestations and Interactions with Other Cognitive Biases
Attention bias toward threatening stimuli is defined as the disproportionate allocation of attentional resources toward potentially threatening stimuli (for a review, see [1,2]). For instance, a person with snake phobia might fixate on a picture of a snake, or on a previous location of this picture on a computer screen. This reaction could be considered unnecessary and excessive, especially in the experimental setting, where participants are asked to respond as quickly as possible to a neutral task-relevant target and ignore task-irrelevant threatening information. In this case, staring at a picture is a waste of attentional resources and leads to a decline in performance, as one fails to keep up with experimental demands. However, if one sees a snake while taking a hike, taking note of its features (e.g., size, colors, pattern and speed) may be of value. Attention bias is believed to play a major role in the etiology and in the maintenance of anxiety disorders [3]. Therefore, studying the relation between attention bias and anxiety is of high importance, from theoretical and clinical points of view.
Attention bias is suggested to consist of three components: faster engagement with threatening stimuli compared to neutral stimuli; slower disengagement from threatening stimuli compared to neutral stimuli; and finally, following these initial components of vigilance, an avoidance component, in which participants avoid the fearful stimulus [4]. Several paradigms are used to study different components of attention bias, and they usually involve the detection of some kind of target stimulus. Expectancies, in turn, can be measured and manipulated in various ways, implicitly and explicitly. Additionally, biased expectancies to encounter threatening stimuli can be a trait characteristic, as will be discussed below.
In the current review article, therefore, we present several cognitive, affective, and personality factors, which affect the relation between expectancy and attention. We discuss recent findings in order to clearly illustrate the complexity and the dynamic network which might modulate attentional biases and their relation to expectancy. The current review article further aims at offering several suggestions for new directions in the field of attention bias and attention bias modification (ABM), and specifically on ways in which expectancies can affect and reduce attention bias. The discussed factors may act in unison or they may complement each other, depending on the disorder. Thus, mapping the varying factors and the different interactions between them across disorders may help clinicians to accurately diagnose and to effectively treat these disorders. This can be especially beneficial for the development of more effective ABM treatments that aim to reduce anxiety symptoms by reducing attention bias. In subsequent sections, we will present several examples of critical factors that could modulate attention bias using a better understanding of a-priori expectancies. Some of the presented studies do not relate to attention bias per se, but instead refer to prioritized processing of emotional stimuli, which is key in attention bias and in cognitive biases in general (for review on the dynamic modulation of emotional processing, see [15]).

The Influence of Exogenous vs. Endogenous Factors on Attention Bias to Threat
A recent opinion article emphasized the importance of integrating "goal-directed cognitive control" and "salience-driven" processes when trying to reduce attention bias using ABM [16]. Goal-directed cognitive control processes refer to "multiple processes supporting goal-directed thought and action. It includes attention control functions (goal-directed inhibitory attention control, flexible switching, controlled orienting), as well as reason-based evaluation/appraisal, and updating and maintaining information in working memory" ( [16]; p. 226). These processes can also be referred to as "top-down" executive control processes. Salience-driven processes, on the other hand, refer to the fact that "stimulus salience varies along various dimensions including aversive (threat-related) and appetitive (reward-related) motivational salience, and perceptual salience (e.g., brightness, contrast). Motivational salience is influenced by stimulus content, individual differences, and situational variables (e.g., learning history, current motivational state, context)" ( [16], p. 227). These processes can also be referred to as "bottom-up" processes. Both types of processes are interactive and complex, and include many different variables that affect attention bias. Notably, Mogg & Bradley [16] included learning history as a salience-driven process. Awh, Belopolsky and Theeuwes [17] suggested that a dichotomy between top-down and bottom-up factors is too simplistic, as some factors, such as selection history, are both top-down and bottom-up, depending on current goals and physical salience. Here, we will use the terms "endogenous" and "exogenous" factors, reflecting processes that are more internally generated or more externally generated, respectively.
In their review on cognitive mechanisms underlying anxiety and its treatment with ABM, Mogg and Bradley [18] suggested an integrative framework. According to this framework, there is an imbalance between cognitive control and stimulus-related factors in anxiety, thereby launching a "threat-focused motivation mode" (p. 89). This mode has been proposed to include non-adaptive patterns of processing and reactions in various domains, such as attention, memory, reasoning, and stimulus evaluation. In the study of attention bias, many studies focus on exogenous factors, such as pop-out and salience, which capture attention. For instance, approximately two decades ago, Öhman, Flykt, and Esteves [18] pointed out that snakes can be salient and pop out, especially among snake-fearful participants. More recent studies have also found that participants can detect curvilinear shapes faster than other shapes, especially when labeled as "snakes" [19]. A recent event-related potentials (ERP) study confirmed that snakes receive prioritized processing (i.e., larger early posterior negativity amplitudes) compared to other curvilinear animals, such as spiders, worms, and beetles [20]. Thus, it seems that curvilinear shapes quickly capture attention, especially when associated with threat, suggesting an interaction between exogenous factors (i.e., specific shapes) and endogenous factors (i.e., pre-existing fear and perceived threat).
Along similar lines, a recent study investigated the impact of emotionally arousing negative stimuli in exogenously capturing attentional resources during visual search on natural scenes (i.e., during a top-down, endogenous task). Behavioral and eye movement data suggested that emotional distractors were attended to later and for shorter durations, compared to emotional targets, suggesting effective top-down control. Moreover, functional magnetic resonance imaging (fMRI) data suggested a selective increase of insular/prefrontal connectivity in the face of emotional distraction [21]. Thus, this study demonstrates an interaction between attention and task-relevance using a highly ecological paradigm and various measurement methods (behavioral, eye movements, and fMRI). Different lines of research further showed that negative/threatening stimuli not only bias attention, but also have important consequences for post-perceptual/attentional processes, e.g., short-term memory [22] and prospective memory [23]. Note that these latter "memory" effects are based on the interplay between exogenous capture by negative stimuli and endogenous task-based demands. Together, these studies suggest an interaction between endogenous and exogenous factors, as well as an interaction between attention and memory. Hence, while factors that affect attention interact, attention itself also interacts with other processes, such as expectancies and memory (sometimes in the form of a-posteriori expectancies).

The Effect of Pre-Stimulus Factors on Attention Bias: Interaction between Expectancies, Uncertainty and Valence
One of the factors that affects attention is task relevance, which refers to the relevance of certain stimuli to a specific task. Some stimuli could be irrelevant and thus act as distractors, while still capturing attention and hindering performance (e.g., [24] on the effects of spider distractors on the performance of participants with spider phobia). Task relevance has also been found to play a role in specific types of attention bias: in a sample of healthy participants, Gronau and Izoutcheev [25] found that generally, task-irrelevant distractors can be successfully ignored. However, the same distractors can affect behavioral performance if they contain images from a to-be-detected category. Dodd, Vogt, Turkileri, and Notebaert [26] found that trait anxiety is associated with decreased performance on a visual search array when angry faces were used as task-irrelevant stimuli, but not when they were task-relevant. Thus, task relevance interacts with threat value and pre-existing pathology (see also [27] for similar findings among participants with anxiety).
Recent studies suggest that there are additive effects of task-relevance and a-priori expectancies. Specifically, participants were informed of the likelihood of encountering one of two possible target colors among four colors (i.e., three distractors) in a visual search array [28]. Participants were asked to indicate the orientation of rectangular bars in target colors. The search array was preceded by a spatially uninformative cue which indicated the upcoming target color half of the time. Behavioral and ERP results (N2pc) showed that cues for both target colors attracted attention, even though participants were informed that one color was more likely to appear (80%) than the other (20%). Thus, cues that act as preparatory attentional templates reflect only relevance, rather than statistical expectations regarding the upcoming targets. Additionally, results showed that during the search itself, expectancies regarding the probabilities of target colors did have an effect, as reaction time and the onset of the N2pc component were delayed when the infrequent target color appeared [28]. Thus, relevance and a-priori encounter expectancies differentially affect attentional capture.
Along the same lines, embedding threatening stimuli in previously learned visual contexts may facilitate their detection, as opposed to threatening stimuli in new contexts and to non-threatening stimuli in old contexts [29]. The facilitative influence of visual contexts might be mediated by changes in critical expectancies. Hence, the authors concluded that participants use contextual cues in order to detect threat. In fact, there is evidence that visual contexts can shape expectancy regarding which objects are likely to appear within a given context. Thus, in a given visual context, one would expect certain objects to appear in specific places. Accordingly, one study found that objects that were placed in correct locations (e.g., an umbrella over a hoodie) were detected faster, compared to objects that were placed in incorrect locations (e.g., a hoodie over an umbrella; [30], Experiment 1). Specifically, in Experiment 1, both objects were attended, and spatial consistency (i.e., placements that made sense) led to facilitation. In a second experiment, only one of the two objects was attended (i.e., cued) and task-relevant, while the other object was unattended (i.e., uncued) and became irrelevant due to different instructions. An interaction between spatial consistency and task-relevance was found, such that the effect of spatial consistency disappeared when an object became task-irrelevant ([30]; Exp. 2). Thus, when one object leaves the focus of attention, expectancy to find a certain object in a given visual context no longer facilitates its detection. Taken together, the aforementioned studies suggest that higher cognitive factors, such as task-relevance, and lower-level factors, such as previously learned visual contexts, interact with expectancies and attention.
The study of such factors and their interaction with expectancy could have important theoretical and clinical implications in better understanding attention bias and ABM procedures.
Sussman, Jin, and Mohanty [31] emphasize the role of prestimulus factors (e.g., goals and expectations), as well as the role of the interaction between exogenous and endogenous factors, in threat-related perception and attention in anxiety (see also [32]). For instance, a-priori anticipation of encountering aversive stimuli (i.e., expectancy bias) lies at the heart of all anxiety disorders, and this bias can also interact with attention bias (for a review, see [33]). It has been suggested that inducing a-priori expectancy to encounter a certain stimulus increases the detectability of the stimulus by increasing perceptual sensitivity to its features [34]. Thus, there is an interaction between perceptual sensitivity, expectancy, and attention. Along these lines, Sussman, Szekely, Hajcak, and Mohanty (2016; [35]) emphasize the role of anticipation as it subsequently shapes perception of threat, which is enhanced in anxiety. Such perceptions of threat may also affect attention allocation toward threat. Several studies have already integrated expectancy cues that predict the threat value of the subsequent stimulus into attention bias paradigms, thus advocating for more integrative research (dot probe: [36]; visual search: [37][38][39][40]).
Aue et al. [38] manipulated expectancy in order to measure its effects on attention bias in spider phobia. Specifically, participants were presented with a visual search array, which included 9 pictures: 8 neutral distractors (butterflies) and 1 deviant target, which was either neutral (i.e., birds) or threatening (i.e., spiders). Participants were asked to detect the deviant picture as quickly and as accurately as possible. Prior to the presentation of the search array, a verbal cue appeared, indicating the likelihood of the appearance of a specific target ("bird 90%", "spider 90%", or "bird/spider 50%"). In fact, the first two types of cues were valid on 71% of trials. Results showed that a-priori expectancies (i.e., cues) strongly affected the detection of neutral stimuli (i.e., faster reactions on congruent trials compared to incongruent trials in which bird targets appeared). Notably, though, they did not affect detection of spider targets in either group of participants (with and without spider phobia) when both types of targets appeared equally often. Thus, a-priori expectancies differentially affected the detection of each type of target (birds vs. spiders). Further correlational findings in another study, nonetheless, suggest that there is an association between expectancy bias and attention bias for threatening stimuli ([41]; for a review, see [10]), but causality in this association needs further investigation.
Follow-up studies suggest that in fear of spiders, there is a competition between valence and certainty. Specifically, expectancy was manipulated in the same way: a verbal cue indicated the probability of encountering one of two targets (birds or spiders) in a subsequent visual search array. Here, however, the frequency of the appearance of each target was manipulated, while the cue-target congruency rate was kept constant (i.e., 71%). When bird targets appeared more often than spider targets in the same paradigm, both low- and high-fear groups exhibited reduced attention bias toward spiders and reacted according to the cues ([37]; Experiments 1 and 2). This finding is also in line with evidence that ABM positive-search training, in which participants are trained to attend to positive stimuli, could be more effective than negative-search training, in which participants are trained to attend to negative stimuli. This could be because positive-search training encourages participants to actively use goal-directed inhibitory control, thus ignoring task-irrelevant threatening stimuli and focusing on more positive, task-relevant targets [42,43]. Additionally, when spiders appeared more often than birds, a reduction of attention bias was also found, but not as strongly as in Experiments 1 and 2, in which bird targets appeared more often ([37]; Experiment 3). As reduction of attention bias was found when either target (bird or spider) appeared more often, and both were task-relevant each time, it is possible that the frequency of a specific target signals certainty: when both targets appear equally often, participants feel least certain about encountering any specific stimulus and consequently do not use the cues. However, when either target appears more often, participants may feel more certain that they will encounter a specific target and thus react more in line with the cues.
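The logic of these cue-validity and target-frequency manipulations can be sketched in a few lines of code. The function name, parameter names, and default probabilities below are illustrative assumptions for exposition, not the exact design files of the cited studies; only the 71% congruency rate is taken from the text.

```python
import random

def build_trials(n_trials=200, cue_validity=0.71, p_spider_target=0.5, seed=0):
    """Sketch of a trial list for a cued visual-search paradigm (illustrative).

    Each trial pairs a verbal cue ('bird' or 'spider') with a deviant target
    (bird or spider). The cue matches the target on ~`cue_validity` of trials,
    while `p_spider_target` controls how often spider targets appear overall
    (the frequency manipulation described above).
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        # Target frequency manipulation: which deviant appears on this trial.
        target = "spider" if rng.random() < p_spider_target else "bird"
        # Cue validity manipulation: the cue is congruent on ~71% of trials.
        congruent = rng.random() < cue_validity
        cue = target if congruent else ("bird" if target == "spider" else "spider")
        trials.append({"cue": cue, "target": target, "congruent": congruent})
    return trials
```

Separating the two probabilities makes the design point explicit: target frequency (certainty about what will appear) can be varied across experiments while the predictive value of the cue itself stays constant.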
Using the affective cueing paradigm in an ERP study, Gole, Schäfer, and Schienle [44] manipulated certainty using three types of conditions: certainty-negative (i.e., the negative cue is always followed by a negative picture), certainty-neutral (i.e., the neutral cue is always followed by a neutral picture), and uncertainty (i.e., an uncertain cue followed by either a negative or a neutral picture). In addition, two groups of participants were included: low and high in intolerance of uncertainty (IU). This measure reflects participants' ability to cope with life's various uncertainties about the future [45]. Results showed that during exposure in the later time windows (late positive potentials; LPP2: 750-1000 ms), an interaction emerged between cue (certain, uncertain), category of outcome (aversive, neutral), and group (low IU, high IU), such that only among high-IU participants was processing of aversive pictures affected by the type of the preceding cue. Concretely, in high-IU participants, LPP2 amplitudes were smaller in the uncertain cue-aversive picture condition, compared to the certainty-negative condition. The authors concluded that uncertain cues received heightened allocation of resources, thus making these resources less available for emotional processing during exposure [44]. In this study, uncertainty was measured as a trait quality (IU) and predictability was manipulated as an experimental factor (see also Reference [46]).
Using a similar paradigm and shorter anticipation phases, Lin et al. [47] found that valence (positive and negative pictures) modulates the effect of IU on allocation of attentional resources when viewing emotional pictures, such that during early sensory processes (P2: 130-180 ms), experimental uncertainty reduces attention to negative pictures, and enhances it during later sensory processes (N2: 220-300 ms). Experimental uncertainty did not affect attention toward positive pictures. Thus, uncertainty only affected attention toward negative stimuli. While Gole et al. [44] found increased attentional resources during uncertain anticipation, Lin et al. [47] found that the valence of subsequent stimuli further modulates this effect. Hence, certainty/expectancy and valence/threat value seem to interact during emotional processing and allocation of attentional resources.
As threat can affect attention bias, so can reward. For instance, Kress et al. [48] found that optimistic expectancies can affect attention. Specifically, verbal cues indicated the likelihood of encountering certain stimuli, which were associated with either monetary loss or gain (i.e., reward). Results showed that the effect of optimistic expectancies on attention deployment was stronger than the effect of pessimistic expectancies. Importantly, this effect was found among a sample of healthy participants, and so the findings might differ among anxious participants, especially if the presented stimuli are disorder-relevant (e.g., spiders presented to participants with spider phobia).
In the domain of expectancy and attention bias toward positive stimuli, there is evidence for bidirectional causality. It has recently been established that a-priori optimistic expectancies affect positive attention bias [48], and that ABM toward positive stimuli can enhance optimistic expectancies ( [49]; see [50] for suggested neurobiological mechanisms underlying such interplay between the biases). Considering many shared factors between negative and positive emotions, the bidirectional causality in the context of positive stimuli suggests that it might also exist in the case of negative stimuli (for reviews on expectancy and attention bias toward positive stimuli, see [13,51]).
Reward can further interact with other factors, such as prior attentional history, which is the history of attentional allocation toward certain items, and has also been found to play a role in visual attention (for a review, see [52]). For instance, if participants had been trained to shift their gaze away from a certain feature for motivational reasons (i.e., monetary reward), they would find it difficult to direct their attention away from the same feature on subsequent tasks, even if this feature is now task-irrelevant and this behavior conflicts with the new task's goals ([53]; Experiment 1). In other words, although participants were trained to look away from certain features, they found it difficult to do so again on a different task, as they had learned that the specific feature is related to a reward. Thus, prior history and prior motivation can interact and bias current orienting and direction of visual attention. Hence, factors that affect the interaction between expectancies and attention may also interact with other, seemingly unrelated factors, creating a complex network of interactions.
To summarize, many variables could affect the interaction between expectancy and attention bias toward threat. Specifically, while some factors (e.g., task-relevance, salience) are more directly related to expectancy, other factors (e.g., attentional history) have not been directly examined in this context. A thorough manipulation of these factors can lead to a more comprehensive understanding of expectancy, attention bias, and their interaction in health and in anxiety disorders. Importantly, these factors cannot be easily divided into top-down or bottom-up, as attentional history can be affected by both processes. Similarly, salience could be considered a bottom-up factor, but it also interacts with pre-existing pathologies and other individual differences.

The Role of Individual Differences in the Expectancy-Attention Interaction
Pre-existing pathology can also affect expectancies and attention bias. For instance, whereas control participants showed an attention bias toward positive distractors and no bias toward negative distractors, anxious participants exhibited attention bias toward negative distractors although these were task-irrelevant, and not toward positive distractors [27]. Depressed participants did not exhibit an attention bias toward either negative or positive distractors [27]. Similarly, sensitivity to and response criteria for emotional faces were suggested as independent factors which influence perceptual sensitivity in individuals with social anxiety [54]. This sensitivity was specific to mildly angry faces and was not found for mildly happy faces. These findings also imply an interpretation bias, in that anxious participants interpret neutral stimuli as threatening. A recent study has also simultaneously assessed the roles of attention, memory, and evaluation processes in health anxiety [55]. It was found that compared to participants with depression and to healthy participants, participants with pre-existing pathological health anxiety exhibited stronger attention bias to symptom- and illness-related words, more biased recognition of health-threat-related words, and more negative explicit evaluations of health threat.
Moreover, attention bias varies within anxiety disorders and even within specific phobias. For instance, participants with spider phobia and participants with blood-injection-injury (BII) phobia differ in terms of attention bias patterns. Oftentimes, general hypervigilance is found in BII phobia (e.g., [56]), while at other times, avoidance is found (e.g., [57]). Similarly, BII phobia differs from spider phobia in terms of expectancies and in terms of disgust and disgust sensitivity, as disgust is suggested to be the predominant emotion in BII phobia, while fear is suggested to be the more dominant emotion in other specific phobias (for a review, see [58]). Lastly, it has been suggested that attention bias in specific phobias may differ from attention bias in generalized anxiety [59].
A-priori and a-posteriori expectancies seem to differ between small animal phobias and BII phobia. For instance, Pury and Mineka [60] randomly paired aversive or neutral outcomes (shock, neutral tone, no outcome) with BII-related and neutral pictures. These were presented to participants with low and high BII fear levels. Results showed that post-experimentally, all participants (i.e., with low and high BII fear levels) equally overestimated the co-occurrence of BII-related pictures with a shock, indicating the existence of an a-posteriori/memory bias. De Jong and Peters [61] found somewhat different results. They presented neutral and BII-related pictures with different outcomes: no consequences (neutral), drinking 5 mL of a harmless but bad-tasting solution (disgust), or receiving an electrical shock (harm-related). Participants were low and high in BII fear. An a-priori consequence bias was found, such that all participants overestimated the likelihood of co-occurrence between BII-related pictures and disgust- and harm-related outcomes. (See [62] for similar results using a thought experiment. Of note, in this study, the bias was stronger in the high BII fear group, but still present in the low fear group.) However, all participants went through online adjustment. In other words, they managed to change their expectation during the experiment, between a-priori and a-posteriori measurements, as an a-posteriori expectancy bias was not found. Thus, participants underwent a learning process in which they (likely unconsciously) realized that stimuli were randomly paired with outcomes.
As mentioned earlier, disgust and disgust sensitivity have been found to affect fear reactions in specific phobias, as well as in non-clinical individuals. For instance, among a non-clinical sample, it was found that spiders, snakes and parasites induced strong feelings of disgust and fear [63]. Across participants, a strong correlation emerged between reported fear and disgust ratings of the animals. In addition, less fearful participants rated (harmless) grass snakes as less fear-evoking than vipers, compared to snake fearful participants. Thus, the authors conclude that this effect of overgeneralization of fear to harmless animals can be used as a clinical screening tool for animal phobia [63].
Disgust may be considered a unique emotion. Specifically, using the dot-probe paradigm, a recent ERP study found that unlike fear and anger, disgust leads to avoidance for evolutionary reasons, as it diverts attention away from the relevant stimulus, rather than maintains it [64]. This difference was greater among participants with high disgust sensitivity. Thus, disgust and disgust sensitivity affect emotional reactions toward threatening stimuli, which also depend on preexisting anxiety, perhaps due to evolutionary reasons. The authors suggest a model, according to which early sensory processes enhance the perception of fear and anger while suppressing disgust, while later processes regulate emotion and behavior.
To summarize, both attention and expectancy biases differ in different anxiety disorders, even within specific phobias. However, the interaction between both biases has yet to be studied extensively in different disorders. Thus, it is important to take existing psychopathologies into account when studying expectancies and attention and when attempting to treat attention bias more effectively.

Conclusions and Future Directions
The current review presents recent studies that have examined factors which affect expectancies and attention bias. Specifically, these studies suggest that a-priori expectancies can have various effects on attention bias toward threatening stimuli in anxiety. Importantly, expectancies can be manipulated in various ways, and thus different factors can affect the interaction between expectancies and attention, such as IU, pre-existing anxiety, task-relevance, and visual context. We present these findings as examples of possible factors, out of many, which could affect attention bias in a dynamic network, with a focus on a-priori expectancies. As Parsons et al. [7] demonstrated, the interaction between cognitive biases can be examined more systematically, using a moderated network modelling approach. The study of the interaction between various factors that affect attention bias could also benefit from a similar line of study, especially if causality can also be included. Consequently, the study of interactions between biases and specifically between expectancy and attention bias can also have clinical implications.
As attention in general and attention bias in anxiety in particular are vastly studied, an integrative line of research may be of great importance. Cognitive biases do not exist in a vacuum, and no singular characteristic or bias can help in characterizing different disorders. For instance, cognitive biases are differentially expressed in anxiety and in depression [12]. Specifically, Richter et al. [12] developed a battery that simultaneously examines different cognitive biases in anxiety and in depression. Thus, both disorders were differentially characterized in terms of the various cognitive biases involved. This differentiation can assist in developing better diagnostic tools and treatments for anxiety, depression, and other psychiatric disorders. In line with the current review, Richter et al. [12] suggested that such a battery can be a support system for diagnoses and treatment possibilities and that developing such protocols for treatment can help in the diagnostic process. Such a line of research also complements other recent studies which focused on the integration of various cognitive biases in cognitive disorders (e.g., [65][66][67]). Future studies can develop similar batteries for various disorders, with an emphasis on expectancies, especially in anxiety disorders [33].
As research integrates more elements, a more holistic view can be formed regarding these patterns of biased processing of negative emotional stimuli. Similarly, as research considers additional factors that influence the processing of emotional stimuli, a fuller picture can be formed. This understanding of the deeper complexities of emotional processing can help in developing more fine-tuned treatments and cognitive training options. Such options include machine learning, which tailors training according to each individual's needs [68]. Another option involves ABM, as it aims at reducing anxiety by reducing attention bias and vice versa (for a review, see [3]). Specifically, ABM trains participants to attend to specific stimuli, thereby reducing attention bias to threatening and anxiety-related stimuli. This process usually involves a manipulation of frequency of the appearance of threatening and neutral stimuli in order to increase reactions aimed to attend or avoid threat (for review on ABM, [43]). We propose that one way of achieving that is by inducing expectancies and certainty (e.g., frequency, predictability) during training, as both factors can be "goal-directed" and "salience-driven". We suggest that the interaction of attention bias with other cognitive, motivational, affective, and personality factors and biases (e.g., expectancy bias) may help explain greater variance in emotional responses and may be of use when developing cognitive training.
A recent review has suggested that ABM can be very unreliable (e.g., [43]). As McNally [69] has pointed out, this crisis of unreliability can be an opportunity for growth. The world of ABM and cognitive training is expanding into new and innovative directions, which include the integration of perception, learning, and memory with individual differences [70] and individually tailored trainings based on machine learning [68].
To summarize, we have briefly reviewed recent literature showing that there is a mutual influence between different cognitive biases and between different cognitive, motivational, affective, and personality factors. Findings suggest that many of these interactions play a role in emotional responses and in attention bias. We propose that effective training of attention can be established using a manipulation of other factors, such as expectancy, which could be added to manipulation of attention bias in ABM. Further research and advanced analysis strategies could establish innovative trainings, which in turn may also shed light on the complexities of emotional responses and attention bias toward anxiety-relevant stimuli.