As mentioned in the introductory paragraphs, error monitoring processes are often described as a feedback loop that continually monitors ongoing behavior and enables error detection as well as behavioral adjustments to optimize task performance. The work presented here investigated (1) by which mechanisms error detection is achieved, and (2) how this information is used to optimize behavior. These questions are highly relevant for the general understanding of how the brain operates, because they relate to the problem of cognitive control, i.e., the question by which mechanisms ongoing behavior can be altered when the complex and dynamic environment makes such adjustments necessary.

The first question was addressed by investigating behavioral measures of error detection (Study [1]). The results favour the response monitoring account of error detection (Rabbitt & Vyas, 1981; Steinhauser et al., 2008), which posits that error detection is enabled by internal error correction, i.e., an activation of the correct response during continued processing of the stimulus after the error.

The second question was addressed by investigating the relation between behavioral measures of error detection and the Ne/ERN as an electrophysiological correlate of error processing (Studies [2] and [3]). The results of these studies permit several conclusions on the role of the Ne/ERN in error processing. In particular, the results favour error evaluation theories of the Ne/ERN (e.g., Hajcak et al., 2005; Holroyd & Coles, 2002; Holroyd & Coles, 2008), which hold that the Ne/ERN indicates to the cognitive system the significance of errors for behavioral adjustments.

In the following paragraphs, the studies will be summarized, and their contributions to the field of error processing research will be discussed. Furthermore, implications for future studies will be outlined. In concluding remarks, a proposal for the architecture of a feedback loop for error monitoring and behavioral adjustment will be made.

5.1. Response monitoring as a mechanism for conscious error detection

Study [1] (Steinhauser et al., 2008) investigated which mechanism within the cognitive system enables conscious error detection. In particular, two possible accounts of error detection were tested. One possibility is that this mechanism operates by detecting post-response conflict, as stated by the conflict monitoring theory of error detection (Yeung et al., 2004). A second possibility is that the cognitive system detects internal error corrections, which arise due to the activation of the correct response during continued processing of the stimulus after the error, as proposed by the response monitoring account of error detection (Rabbitt & Vyas, 1981; Steinhauser et al., 2008).

The two accounts differ in one important point. According to the response monitoring account, it is crucial for error detection that the activation of the correct response reaches a threshold during continued processing after an error. According to the conflict monitoring theory, however, this is not necessary. This important difference can be used to differentiate between the two accounts, which can be illustrated by considering general models of response selection in choice tasks. These models view response selection as the result of a continuous accumulation of evidence extracted from the stimulus, as, for instance, in diffusion models (Ratcliff & Rouder, 1998).

According to these models, from the point in time of stimulus presentation, the cognitive system accumulates evidence for each possible response alternative, which leads to the activation of the respective responses. Whenever the evidence for a particular response reaches a criterion, this response is executed. Errors occur because evidence is also accumulated for wrong responses due to noise in the system.

Both accounts of error detection assume that the accumulation of evidence continues after an executed response. In most cases, enough evidence is accumulated for the correct response during this extended processing, so that the correct response exceeds the criterion. According to the response monitoring account, this internal error correction forms the basis for error detection; errors that are not internally corrected remain undetected. According to the conflict monitoring theory, in contrast, error detection is independent of whether the evidence for the correct response accumulated during the extended processing reaches a criterion. Here, it is sufficient that the activation of the correct response leads to a post-response conflict with the previously executed erroneous response. If this conflict exceeds a certain threshold, the system detects an error.
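The contrast between the two detection rules can be sketched in a toy accumulator simulation. All parameter values below are illustrative assumptions; this is not the network model of Yeung et al. (2004), only a minimal caricature of the shared accumulation framework:

```python
import random

def simulate_trial(drift=0.08, criterion=1.0, noise=0.35,
                   post_steps=200, conflict_threshold=0.5, seed=None):
    """Toy two-choice accumulator with continued post-response processing.

    Evidence for the correct response grows with `drift` plus Gaussian noise;
    the wrong response receives noise only. All parameter values are
    illustrative assumptions, not estimates from any published model.
    """
    rng = random.Random(seed)
    correct = wrong = 0.0

    # First response: accumulate until one alternative reaches the criterion.
    while correct < criterion and wrong < criterion:
        correct += drift + rng.gauss(0.0, noise)
        wrong += rng.gauss(0.0, noise)
    if correct >= criterion:
        return {"error": False}

    # Error committed: processing of the stimulus continues after the response.
    detected_by_correction = detected_by_conflict = False
    for _ in range(post_steps):
        correct += drift + rng.gauss(0.0, noise)
        # Response monitoring: the error counts as detected only once the
        # correct response itself reaches the criterion (internal correction).
        if correct >= criterion:
            detected_by_correction = True
        # Conflict monitoring (simplified): co-activation of both responses
        # exceeding a threshold signals the error; the correct response need
        # not reach the criterion.
        if correct > 0.0 and correct * wrong > conflict_threshold:
            detected_by_conflict = True
    return {"error": True,
            "response_monitoring": detected_by_correction,
            "conflict_monitoring": detected_by_conflict}

# Collect outcomes over many simulated trials.
errors = [r for s in range(500)
          if (r := simulate_trial(seed=s))["error"]]
```

On error trials, the response monitoring rule requires the correct response to actually reach the criterion, whereas the conflict rule merely requires sufficient co-activation of the two responses; this is exactly the difference between the accounts described above.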

To test the two accounts, in a first step, the response criterion was manipulated in an Eriksen flanker task. The results showed that both error signaling and error correction latencies were clearly influenced by the response criterion: the higher the criterion, the longer the signaling and correction latencies. This supports the predictions of the response monitoring account, because the time until the internal correction reaches the criterion should be longer when the criterion is higher.
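Under the response monitoring account, this criterion effect follows directly from the accumulation framework: the internal correction is registered only once post-error evidence for the correct response reaches the criterion, and a higher criterion leaves a larger evidence gap to close. A minimal sketch (starting level and all parameter values are illustrative assumptions):

```python
import random

def correction_latency(criterion, drift=0.08, noise=0.35, rng=None):
    """Steps of continued post-error processing until evidence for the
    correct response reaches the criterion. Evidence starts from an
    assumed sub-criterion level at the moment of the error; all parameter
    values are illustrative."""
    rng = rng or random.Random(0)
    evidence = rng.uniform(0.0, 0.5 * criterion)
    steps = 0
    while evidence < criterion:
        evidence += drift + rng.gauss(0.0, noise)
        steps += 1
    return steps

def mean_latency(criterion, n=2000, seed=1):
    rng = random.Random(seed)
    return sum(correction_latency(criterion, rng=rng) for _ in range(n)) / n

low, high = mean_latency(criterion=0.8), mean_latency(criterion=1.6)
# A higher response criterion leaves a larger evidence gap to close,
# so the mean internal-correction latency increases with the criterion.
```

Averaged over many simulated errors, the latency under the higher criterion exceeds that under the lower one, mirroring the observed effect on signaling and correction latencies.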

In a second step, it was tested whether the conflict monitoring theory can account for these results, because the amount of post-response conflict could also be related to the response criterion. To this end, the neural network model by Yeung et al. (2004) was used. The results showed that the conflict monitoring theory implemented in the model could not adequately predict the criterion effect on signaling and correction latencies. The response monitoring account implemented in the same model, however, accounted for the data excellently.

In sum, the results of study [1] support the response monitoring account, which states that errors are detected by means of internal error corrections. At the same time, the results argue against the conflict monitoring theory of error detection, which holds that errors are detected via post-response conflict.

These results do not imply that response conflict monitoring should not be considered as a general principle in behavior monitoring. In other areas, such as the regulation of selective attention in order to control the influence of irrelevant stimulus information, response conflict monitoring could play an important role (Botvinick et al., 2001).

However, the results strongly argue against the view that error detection relies on the detection of post-response conflict. By proposing internal error correction, i.e., the activation of the correct response during continued processing after an erroneous response, as the basis for a mechanism of conscious error detection, the work presented here makes an important contribution to the understanding of error monitoring processes in the cognitive system.

5.2. Dissociation of the Ne/ERN and error detectability

Study [1] showed that the response monitoring account of error detection is well suited to explain conscious error detection. Study [2] (Maier et al., 2008) tested the assumption that the Ne/ERN, as an electrophysiological correlate of error processing, reflects error detectability. This view is held by error detection theories of the Ne/ERN.

Much-debated error detection theories of the Ne/ERN are the conflict monitoring theory, which states that both error detection and the Ne/ERN are based on post-response conflict (Yeung et al., 2004), and the mismatch hypothesis, according to which both the Ne/ERN and error detection are based on a mismatch between the executed and the intended response (e.g., Bernstein et al., 1995; Falkenstein et al., 2000).

Error detection theories of the Ne/ERN assume that the Ne/ERN represents the same information that is necessary for error detection. Therefore, anything influencing error detectability should influence the Ne/ERN in a similar manner. This prediction can be tested by comparing the Ne/ERN and error detectability: if errors that are easy to detect also show large Ne/ERNs, the prediction would be supported.

Therefore, study [2] compared the Ne/ERN and error detectability across two different types of errors. An Eriksen flanker task with four response alternatives was used, in which two different types of errors can occur. First, flanker errors occur when the response associated with the flanker elements of the stimulus is accidentally executed. Second, non-flanker errors occur when a response is executed that is not mapped to any of the elements appearing in the stimulus.

Flanker errors occur particularly on trials on which the flankers are co-processed along with the target to a large extent due to sub-optimal selective attention. However, sub-optimal selective attention on flanker error trials should at the same time cause a weaker activation of the correct response during continued processing after the error. Therefore, we assumed that flanker errors are more difficult to detect than non-flanker errors, which do not preferentially occur on trials with sub-optimal selective attention.

The detectability of the two error types was assessed by means of error signaling responses. Indeed, flanker errors were signaled less frequently than non-flanker errors. Surprisingly, however, flanker errors showed larger Ne/ERN amplitudes than non-flanker errors. These results were replicated in the second experiment of study [2], in which stimulus processing was impaired by a masking procedure. Furthermore, the Ne/ERN was dramatically reduced for unsignaled errors in this second experiment.

The fact that Ne/ERN amplitudes were larger for errors that were harder to detect fundamentally argues against error detection theories of the Ne/ERN. Because these theories assume that the Ne/ERN represents the same information that forms the basis for error detection, they necessarily predict that the Ne/ERN reflects error detectability. The results reported here, however, show the exact opposite.

The results are compatible with error evaluation theories of the Ne/ERN, according to which the Ne/ERN reflects the significance of errors for the optimization of ongoing behavior. The risk of a flanker error can be reduced by enhancing selective attention to the target (in the sense of Botvinick et al., 2001). Non-flanker errors, by contrast, cannot be prevented by adjusting selective attention. Therefore, flanker errors should have a higher significance for the adjustment of behavior than non-flanker errors.

This increased relevance for the optimization of behavior could be conveyed to the ACC as a stronger dopaminergic reinforcement learning signal in the sense of the reinforcement learning theory of the Ne/ERN (Holroyd & Coles, 2002; Holroyd & Coles, 2008). The finding of reduced Ne/ERN amplitudes for unsignaled errors is also compatible with this possibility, because errors that are not detected by the system should not elicit a reinforcement learning signal.

5.3. Ne/ERN and error detection: Independent neural mechanisms?

According to error evaluation theories of the Ne/ERN, the Ne/ERN reflects the significance of errors for the optimization of behavior. This allows for the possibility that the Ne/ERN and error detection are based on functionally independent neural mechanisms. Consistent with this possibility, study [2] showed a differential influence of an experimental variable, namely, the type of error, on the Ne/ERN and on error detectability.

The observed pattern of results was that errors that were more difficult to detect elicited larger Ne/ERNs. However, this pattern could also arise if the Ne/ERN and error detection were based on functionally dependent neural mechanisms that always operate in an inversely related manner. In this case, the Ne/ERN would always be large when an error is difficult to detect, and vice versa. It can only be concluded that two measures are based on functionally independent neural processes if they are completely dissociated, i.e., if they are influenced in different directions by one variable but in the same direction by another variable (cf. Dunn & Kirsner, 1988). Therefore, study [3] (Maier, Steinhauser, & Hübner, submitted) aimed to create conditions in which the Ne/ERN and error detectability are influenced in the same direction by an experimental variable.

This was achieved by manipulating response set size. It is well known that performance is usually worse with larger response sets, i.e., with a larger number of available response alternatives (Hick, 1952). This is often attributed to greater response uncertainty (Card, Moran, & Newell, 1983).
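Hick's (1952) law describes this effect quantitatively: mean response time grows with the logarithm of the number of equally likely response alternatives n. With empirically fitted constants a and b, it is commonly written as:

```latex
\mathrm{RT} = a + b \,\log_2(n + 1)
```

The logarithmic term is usually interpreted as the amount of information, in bits, that must be processed to resolve the response uncertainty, which is why larger response sets slow performance.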

Error detection is also impaired with larger response sets (Rabbitt, 1967). This is not surprising, because greater response uncertainty should reduce the activation of the correct response during continued stimulus processing after errors and thereby impair error detection. Furthermore, it has been shown that the Ne/ERN is reduced with greater stimulus uncertainty (Pailing & Segalowitz, 2004). Therefore, we expected that both the Ne/ERN and error detectability would be decreased with larger response set sizes.

To test this, study [3] compared error detectability and the Ne/ERN across three groups of participants working on either a two-choice, a four-choice, or an eight-choice version of the Eriksen flanker task. Error detectability was assessed by means of error signaling responses. In addition, strategic behavioral adjustment following errors was measured by post-error slowing, which denotes the slowing of response times of correct responses on trials following errors (e.g., Rabbitt, 1966a). The results showed that the Ne/ERN, error detectability, and strategic behavioral adjustments following errors were all reduced with a larger response set.

On the one hand, this demonstrates that error monitoring processes suffer severely from the increased response uncertainty associated with larger response set sizes. The impaired error detectability with increased response set size can be explained by contemporary theories of error detection such as the response monitoring account: with a larger response set size, response uncertainty is increased. This causes a weaker activation of the correct response during continued processing after errors, which directly leads to impaired error detectability.

The reduced Ne/ERN and the concurrently impaired behavioral adjustment following errors are compatible with many current accounts of the Ne/ERN. Error detection theories of the Ne/ERN would assume that a larger response uncertainty impairs the mechanism that is relevant to both the Ne/ERN and error detectability. However, error evaluation theories of the Ne/ERN can explain the results equally well. For instance, according to the reinforcement learning theory (Holroyd & Coles, 2002; Holroyd & Coles, 2008), an increased response uncertainty would trigger a smaller reinforcement signal to the ACC. This would cause both a smaller Ne/ERN and impaired behavioral adjustment with larger response set sizes.

On the other hand, combining the results of studies [2] and [3] provides the conditions for a complete dissociation of error detection and the Ne/ERN. In study [2], errors that were harder to detect elicited larger Ne/ERN amplitudes than errors that were easier to detect, constituting a negative relationship between error detectability and the Ne/ERN. In study [3], error detectability and the Ne/ERN were both impaired by a larger response set size, constituting a positive relationship between error detectability and the Ne/ERN. This pattern of results permits the conclusion that error detection and the Ne/ERN are based on functionally independent neural mechanisms (see Dunn & Kirsner, 1988, for the conditions of a complete dissociation).
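The dissociation logic can be made explicit with a toy sign check. The effect directions below encode the pattern just described; this is a minimal illustration of the inference, not an analysis method:

```python
# Effect direction (sign) of each manipulation on the two measures:
#   study [2], error type (flanker vs. non-flanker errors):
#       detectability down, Ne/ERN up    -> opposite directions
#   study [3], response set size (larger sets):
#       detectability down, Ne/ERN down  -> same direction
effects = {
    "error type":        {"detectability": -1, "ne_ern": +1},
    "response set size": {"detectability": -1, "ne_ern": -1},
}

# If the Ne/ERN were a monotonic function of the same underlying quantity
# as error detectability, every manipulation would move the two measures
# with a consistent sign relation. Mixed sign products across manipulations
# are the signature of a complete dissociation (Dunn & Kirsner, 1988).
sign_products = {v["detectability"] * v["ne_ern"] for v in effects.values()}
complete_dissociation = len(sign_products) > 1
print(complete_dissociation)  # prints True
```

A single inverse (or single direct) coupling between the two measures would produce only one sign product; observing both rules out any functionally dependent, monotonically coupled mechanism.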

5.4. Implications for future studies

The results from the studies presented here have some implications for future research.

They are compatible with error evaluation theories of the Ne/ERN, which assume that the Ne/ERN constitutes a signal for the optimization of behavior (e.g., Holroyd & Coles, 2002; Holroyd & Coles, 2008). If this is correct, we can expect that behavioral adjustments following errors should also be reduced in conditions with a reduced Ne/ERN.

Indeed, in study [3], strategic adjustment on trials following errors in the form of post-error slowing was impaired when Ne/ERN amplitudes were also reduced due to greater response uncertainty. A reduced Ne/ERN could therefore in fact cause a smaller amount of post-error slowing. Similar findings have also been obtained in studies using functional magnetic resonance imaging (e.g., Kerns, 2006; Kerns, Cohen, MacDonald, Cho, Stenger, & Carter, 2004). Further investigation of the relation between the Ne/ERN and behavioral adjustments following errors could be fruitful for understanding the role of the Ne/ERN and the ACC in behavior monitoring.

Another implication for future work follows from the idea that behavioral adjustments are guided by behavior monitoring processes. If errors are to be evaluated according to their significance for behavioral adjustments to optimize task performance, an important precondition is the ability of the cognitive system to distinguish between error types of different significance. Study [2] showed that the Ne/ERN indeed has this capacity.

Future studies could investigate conditions under which the Ne/ERN can be viewed as a marker for different types of errors. For instance, error types of different significance could be generated artificially by differential punishment or reinforcement. Given differential punishment or reinforcement of different error types, behavior could then be optimized in terms of gains and losses by avoiding or preferring the differentially weighted error types. If the Ne/ERN constitutes a signal for behavior optimization, its size should clearly reflect the differential significance of the error types. This would provide strong evidence for error evaluation theories of the Ne/ERN.

5.5. Concluding remarks

In sum, the results of the studies presented here permit some proposals for the functional architecture of a feedback loop, where a continuously operating response monitoring system enables the detection of errors in ongoing behavior. Other systems could then use this information for optimizing performance on the task at hand.

Such an architecture was proposed, for instance, by Holroyd and Coles (2002). In their model, a monitoring component of the loop is attributed to the basal ganglia, which send dopaminergic error signals to the ACC on the basis of values assigned to the monitored events over the course of reinforcement learning. The Ne/ERN, as the ACC response to these error signals, reflects these values.

In the present work, it was shown that errors in ongoing behavior can be detected by continuously monitoring the response selection system for internal error corrections.

Furthermore, it was shown that the size of the Ne/ERN as an electrophysiological correlate of error processing can be independent of error detectability. Finally, the results are compatible with the view that the Ne/ERN is a signal that serves to mark errors according to their relevance for behavioral adjustments.

Therefore, we propose that errors are detected by automatic, continuous response monitoring. A detected error could then be assigned a negative value in the sense of the monitoring component in the model of Holroyd and Coles (2002). A dopaminergic error signal could then be sent to the ACC to trigger subsequent behavioral adjustments.

An error detection component which is distinct from the value monitoring component postulated by Holroyd and Coles (2002) has the advantage that it can also account for dissociations of the Ne/ERN and error detectability.