Chapter 5
Principles of Counterdeception

Chapters 1 through 4 provided a general introduction to deception by describing its increasing role in the global security environment, the various models and theories that have been developed in order to describe and understand it, the various biases that contribute to its seemingly almost certain success, and the variety of technical and nontechnical methods that support the conduct of deception operations. We also proposed that there are four fundamental principles that form the foundation for the different models and methods of strategic deception. Now, we turn our attention to the topic of counterdeception and attempt to answer several questions. What is counterdeception? What guidance can be found in the literature for countering deception? Is there a corresponding set of fundamental principles of counterdeception that can guide analysts and decision-makers? What can be done to overcome the various biases that we saw contribute to deception's success? What technical and nontechnical means can be employed to counter strategic and tactical deception operations? And, perhaps the most intriguing question of all: Is Barton Whaley right? Is deception almost always successful, as the historical evidence implies? This chapter addresses the first three of these questions. After defining what we mean by counterdeception, we will examine the variety of models, concepts, and approaches found in the literature related to counterdeception in the national security context. We will then examine the common themes that emerge from this work, and derive a set of basic principles of counterdeception.
5.1 What Is Counterdeception?
The U.S. Department of Defense [1] defines counterdeception as: "Efforts to negate, neutralize, diminish the effects of, or gain advantage from a foreign deception operation. Counterdeception does not include the intelligence function of identifying foreign deception operations." This is an operationally oriented definition that emphasizes mitigating deception's effects (like surprise) and exploiting knowledge of the adversary's deception. Our focus will be primarily on the "intelligence function" part of that definition, but the concept of counterdeception goes beyond just identifying foreign deception operations. We believe that the purpose of counterdeception is to find the answers to two fundamental and highly interdependent questions. First, counterdeception must make it possible for analysts and decision-makers to penetrate through the deception to discern the adversary's real capabilities and intentions, in other words, to answer the question: What is real? Simultaneously, analysts and decision-makers must determine what the adversary is trying to make them believe in order to consider the second question: What does the
adversary want you to do? The answers to these two questions are absolutely essential to the success of one's own strategies, policies, and operations.
The intelligence aspects of counterdeception are aimed at detecting, characterizing, and penetrating foreign deception operations. It is important to keep in mind that there is no sharp demarcation line between normal intelligence activities and counterdeception intelligence activities. This is because no such line exists between the adversary's normal security activities and his calculated deception operations. Although large, sophisticated deception operations (like Plan FORTITUDE in World War II) are rare, as we saw in Chapter 2 deception itself is a phenomenon that everyone experiences in one form or another on a daily basis. This presents intelligence analysts and decision-makers with a paradox: deception is simultaneously both common and rare. As a result, analysts face a continuum of deception ranging from basic security activities aimed at the deliberate concealment of facts, to sources who engage in deceit and misrepresentation for personal reasons (e.g., a human asset who fabricates information in order to remain on an intelligence organization's payroll), to deliberate ad hoc official deceit, and finally, to deliberate, well-planned, well-coordinated deception operations. This is why counterdeception in the national security context is more than just detecting deception. Just what kind of deception are we trying to detect? How do we distinguish between deliberate deception and the types of misperceptions that Jervis describes? As Rossa points out [2]: "Faced with an array of information on a subject, the analyst who is to put the pieces of the puzzle together must first determine which pieces to use and which to discard or reshape on the basis of whether it was obtained despite foreign denial operations, as a result of foreign deception operations, or in the absence of either." This leads us to conclude that counterdeception is characterized by three dimensions of action: awareness, detection and exposure, and discovery and penetration.
Awareness primes the observer to register cues in the environment that signify either a threat or an opportunity. Anyone who has ever taken a personal security training course knows that awareness is considered the first line of defense; being aware of what is happening around you often allows you to avoid trouble before it even happens. Awareness is also analogous to the activation step in the Johnson et al. fraud detection model. The auditor is aware of certain cues that, if detected, lead to further questioning of the financial statement. A simple example of awareness in the intelligence context is when an analyst recognizes that a situation presents the adversary both the opportunity and motive to employ deception.
The detection and exposure dimension involves intelligence collection and analysis activities that are aimed at determining what the adversary is trying to make you believe and, as a result, what he wants you to do [3]. In essence, the objective is to accurately reconstruct the deceiver's deception story from the data and information available. The discovery and penetration dimension, on the other hand, focuses on revealing what is real. In this case intelligence collection and analysis assets are used to sort out the relevant from the irrelevant and the real from the false in order to determine the adversary's real capabilities and intent [4]. These two dimensions are not independent. They are highly coupled and interdependent, and both employ similar processes and methods to reveal that which is concealed, separate deliberate distortions from unintentional misperceptions, and disentangle the real from the false in order to determine what really to believe.
5.2 The Search for Ways to Counter Deception
In Chapter 2 we saw that much of the literature related to military and strategic deception concentrates on the historical description and analysis of deception operations and their associated methods. The 1970s saw the beginnings of a theoretical phase of deception analysis where several authors used basic principles from psychology, systems engineering, communications theory, and other fields to begin the development of conceptual models of the process of deception itself. In this literature, the subject of counterdeception is, if addressed at all, treated almost as an afterthought. An author might devote a few paragraphs or perhaps a section of a chapter or paper to the topic. As Harris [5] observed in 1973, "There is hardly an adequate theory of deception, much less a theory of counterdeception." Events in the late 1990s (e.g., the 1998 Indian nuclear test and especially Iraq's efforts to hide its WMD program) generated significantly more interest in counterdeception; however, the literature on the topic is still relatively sparse. This section mirrors the approach taken in Chapter 2 and summarizes the various counterdeception conceptual models, theories, and approaches that can be found in the literature. Like Chapter 2, they are presented in rough chronological order so that the reader can see how the thinking about counterdeception has changed over the years.
5.2.1 Early Pioneers [6]: "Is there, then, no way by which the target of stratagem can untangle the web of deceit?"
In 1942, R. V. Jones wrote [7]: "No imitation can be perfect without being the real thing." The implication of this observation is that imitations should differ from the real thing in one or more ways, that is, observations made of the imitation should be inconsistent with those of the real object or event, leading Jones to the conclusion that [8], "If there is inconsistency between the impressions derived from the several channels, the potential deceivee would do well to suspect a deception." Jones goes on to offer advice on what the target can do in this situation. First, he recommends [8] a critical reappraisal of the intelligence picture that should include examining afresh the evidence coming in through each channel in turn, and particularly those channels giving conflicting evidence. In addition, there are other actions that analysts can take based on two principles that Jones offers for unmasking deception [9]: (1) in any channel of intelligence through which you may be deceived, arrange to work down to a greater level of sophistication than your opponent expected you to adopt, and (2) bring all other possible channels of intelligence to bear on the problem, to see whether the evidence that they can provide is consistent with the evidence in the channel through which you suspect you are being deceived. The first principle involves going beyond the obvious conclusions offered by an observation and subjecting the data to further scrutiny in search of clues that might reveal inconsistencies. Examining the Doppler characteristics of a radio navigation signal is an example of this deepening examination of an information channel. If the source of the deceptive signal is ahead of an aircraft while the source of the authentic signal is behind it, the frequency of the real signal should be slightly lower than that of the deceptive one, thus, in principle, unmasking the deception [10]. An example of the second principle might be to double-check the
observations of the radio navigation signal with other channels of information such as inertial guidance systems or dead reckoning methods using magnetic compass and clock. Once again, the detection of inconsistencies is cause for suspecting deception.
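Jones's first principle, working down to a deeper level of sophistication within a channel, can be made concrete with a small numerical sketch of the Doppler test described above. Everything here (frequencies, speeds, the helper function) is our own illustration, not drawn from Jones:

```python
# Illustrative sketch of Jones's Doppler test (hypothetical numbers).
# An aircraft flying away from the genuine transmitter and toward the
# deceptive one receives the two signals with opposite Doppler shifts.

C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_broadcast: float, closing_speed: float) -> float:
    """Doppler-shifted frequency; closing_speed > 0 means the aircraft
    is approaching the transmitter, < 0 means it is receding."""
    return f_broadcast * (1.0 + closing_speed / C)

f0 = 30e6          # nominal beam frequency, Hz (hypothetical)
speed = 120.0      # aircraft ground speed, m/s

f_real = received_frequency(f0, -speed)   # genuine source behind: receding
f_fake = received_frequency(f0, +speed)   # deceptive source ahead: approaching

# The inconsistency Jones describes: the authentic signal arrives slightly
# below the nominal frequency, the spoofed one slightly above it.
suspect_deception = f_fake > f0 > f_real
```

The shift is tiny (a few hertz here), which is the point of the principle: only an analyst prepared to examine the channel at a deeper level of sophistication than the deceiver anticipated would look for it.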
Jones also offers two maxims that are quite relevant to counterdeception. The first [11] is Crow's Law: "Do not believe what you want to believe until you know what you ought to know." As we saw earlier, knowing what you ought to know will undoubtedly involve reappraising any evidence that is inconsistent with what you want to believe. The second is Occam's Razor: "Hypotheses are not to be multiplied without necessity." Given missing, ambiguous, and contradictory information, analysts should seek the simplest hypotheses that will account for the information on hand. Jones points out that this will not necessarily produce the correct explanation, but that it provides the best basis to start from. Only rarely, he says, has Occam's Razor failed him. This is sound advice in the face of our human ability to make too much out of too little, as Jones [12] subsequently points out with Crabtree's Bludgeon: "No set of mutually inconsistent observations can exist for which some human intellect cannot conceive a coherent explanation, however complicated."
Barton Whaley briefly addressed the question of counterdeception in his famous 1969 book. In the chapter where he describes his theory of stratagem, he also proposes a decision-making model analogous to the one he describes for stratagem itself (see Section 2.2.2). Whereas a stratagem decision-making model is used to create a set of signals that the target observes and fits to a plausible alternative, a counterdeception decision-making model should be designed [13] "to analyze the signals of stratagem rather than the one designed to synthesize their false signals." Whaley offers two examples, only one of which we will discuss here. While intelligence analysts consistently strive to expose an adversary's attempts at camouflage, Whaley observes that he could find no example of where the deceiver's attempts at camouflage were reported [13] "for their own sake." Whaley concludes that [13], having done their work to identify camouflage, the analyst uses his findings only to correct the regular situation reports, order-of-battle maps, or traffic analysis studies. He does not use these findings to analyze the patterns of camouflage or noise to see if they could imply a positive deception plan or campaign. In other words, the existence of camouflage becomes a signal of deception, and such signals can be analyzed in order to detect patterns that might suggest the alternative objective of the adversary.
Harris, who, according to Whaley, coined the term counterdeception in 1968 [14], proposes that countering deception involves three related concepts [5]:

- The detection of an adversary's deceptions;
- The adoption of countermeasures that reduce the likelihood and adverse consequences of those deceptions;
- The coordination of both of these into a counterdeception system.
Harris concentrates primarily on the first two concepts and leaves it to the reader to read between the lines in order to identify the organizational implications of creating a system to coordinate the two activities. Therefore, we will concentrate on the three techniques that Harris describes for detecting the existence of deception operations and for uncovering the facts. These are: reconstructive inference, incongruity testing, and vulnerability assessment (see Figure 5.1). The first, reconstructive inference, involves attending to the patterns of misleading and deceptive signals that are transmitted by the deceiver. These spurious signals, or sprignals, appear to be directly analogous to the signals of stratagem that Whaley suggested looking for, and therefore reconstructive inference, analyzing patterns of sprignals and their relationships, should make it possible to identify the stratagemic plans of an adversary. The analysis of sprignals also makes it possible to identify those channels that are most likely to be used to disseminate disinformation at critical times. It may also be possible to correlate masses of sprignals with different deception styles. Of course, separating sprignals from real signals and noise is no easier than separating signals from noise, and Harris suggests concentrating on separating sprignals from signals while recognizing the fact that some noise will wind up contaminating both categories. Sprignals are also likely to be sensitive to both time and context. Making things even more difficult, patterns of sprignals may provide clues that are only relevant to past encounters but not necessarily future ones. In addition, even if sprignal analysis yields insights into an adversary's initial plan, that analysis might not be relevant in a fluid situation (e.g., situations where a commander changes his plans and the deception plan winds up becoming the real plan).
The second technique is incongruity testing, which Harris defines as the matching and testing of alternative patterns for internal and interpattern consistency. He does not offer much detail regarding the methods for such testing, but simply states that [15]: "at least theoretically incongruities could be discovered given sufficient data and hypothesis testing." Reading through Harris's section, one comes to the conclusion that it involves the generation of alternative hypotheses that represent alternative perceptual patterns of the signal and sprignal data.

Figure 5.1 Harris's deception detection techniques. The figure pairs each method with its function:
- Reconstructive inference: detection of spurious signals (sprignals) that indicate the presence of deception.
- Incongruity testing: alternative pattern matching and testing for internal and interpattern consistency.
- Vulnerability assessment: prediction of the likelihood of deception as a function of modes and conditions, based on simulation, historical data, and gain to the deceiver.
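Harris does not specify a procedure for incongruity testing, but its essence, pairing patterns from different channels and flagging interpattern inconsistencies, can be sketched along the following lines (the channels, attributes, and reports are invented for illustration):

```python
# A minimal sketch of incongruity testing: each intelligence channel
# reports values for some attributes of the situation; any attribute on
# which two channels disagree is a candidate incongruity worth pursuing.
# Channel names and reports are hypothetical.

from itertools import combinations

reports = {
    "imagery": {"units_massing": "north", "bridging_equipment": "present"},
    "signals": {"units_massing": "south", "radio_traffic": "quiet"},
    "human":   {"units_massing": "north", "radio_traffic": "quiet"},
}

def find_incongruities(reports: dict) -> list:
    """Return (attribute, channel_a, channel_b) triples where two
    channels report conflicting values for the same attribute."""
    conflicts = []
    for (ch_a, obs_a), (ch_b, obs_b) in combinations(reports.items(), 2):
        for attr in obs_a.keys() & obs_b.keys():  # shared attributes only
            if obs_a[attr] != obs_b[attr]:
                conflicts.append((attr, ch_a, ch_b))
    return conflicts

incongruities = find_incongruities(reports)
# "units_massing" is reported differently by imagery and signals: exactly
# the cross-channel inconsistency that, per Jones and Harris, should
# trigger suspicion of deception and a reappraisal of both channels.
```

The hard part, as Harris's limitations make clear, is that real incongruities may never be paired (disjointed incongruities) while some flagged conflicts reflect a consistent underlying reality (false incongruities).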
Harris notes that incongruity testing faces two main limitations: disjointed incongruities and false incongruities. Disjointed incongruities [16] involve inconsistencies that have become, in the perceptions of the viewer if not in fact, separated or mismatched. The incongruities are not recognized because the different sets of inconsistent patterns are never paired. Jokes, hoaxes, and deceptions all rely on disjointed incongruities. Jokes are funny because the question sets up an incongruity and the punch line reveals the hidden resolution to the incongruity: a surprising alternate interpretation [17]. Deceptions work if the incongruities between the real and false situations are not resolved. Harris calls these clandestinely disjointed incongruities. The challenge to incongruity testers is that deception planners do their best to prevent their target from detecting the incongruities. Another form of disjointed incongruity that has counterdeception implications is the mutually disjointed incongruity. In this case, observer A perceives situation A and observer B perceives situation B. It may be the case that situations A and B are representations of the same situation, but unfortunately this situation is not the true situation, C.
The other limitation to incongruity testing is the need to deal with false incongruities. Harris defines false incongruities as [16]: "The pairing of two or more apparently inconsistent patterns that represent a consistent underlying reality." These complicate the task of incongruity testing by adding clutter to the process. They can result from different perspectives of the underlying pattern or from our ability to detect order in random patterns.
Some of these apparent but unreal incongruities are a matter of different per-spectives; some are a consequence of the random distribution of noise in perceptualsystems. In either case, they must be identified.
The third technique, vulnerability assessment, uses statistical approaches to predict future vulnerabilities to deception. Bayes' theorem, multivariate statistical analysis, game theory, and other modeling and simulation methods can all be used to explore the likelihood of encountering deception in different situations and under various conditions. Likewise, these methods can be used to assess the risks and costs of making Type I (failure to detect the deception) or Type II (false positive) errors. In addition, Harris suggests that rigorous studies of the deception styles and practices of prospective adversaries can help assess one's own potential vulnerabilities as well as provide potential indicators of deception through the reconstruction of sprignal patterns.
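As a rough illustration of the statistical flavor of vulnerability assessment (the probabilities below are invented, not drawn from Harris), Bayes' theorem can combine a base rate for deception with the diagnosticity of an observed indicator:

```python
# Sketch of a Bayesian vulnerability assessment (all probabilities are
# hypothetical). P(deception) is updated as indicators consistent with a
# deception operation are observed.

def bayes_update(prior: float, p_given_deception: float,
                 p_given_no_deception: float) -> float:
    """Posterior probability of deception after observing one indicator."""
    numerator = p_given_deception * prior
    denominator = numerator + p_given_no_deception * (1.0 - prior)
    return numerator / denominator

# Hypothetical inputs: deception is fairly rare a priori, but the observed
# indicator is three times as likely under deception as under normal
# activity (0.6 vs. 0.2).
prior = 0.10
posterior = bayes_update(prior, p_given_deception=0.6, p_given_no_deception=0.2)

# Independent indicators compound: applying the same update again models
# a second observation with the same diagnosticity.
posterior_2 = bayes_update(posterior, 0.6, 0.2)
```

With these numbers the posterior rises from 0.10 to 0.25 after one indicator and to 0.50 after two, showing how even a low base rate can be overcome by consistently diagnostic evidence, and, conversely, why weakly diagnostic indicators inflate the Type II (false positive) risk Harris warns about.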
In 1976, Robert Jervis concluded his book, Perception and Misperception in International Politics, with a chapter on minimizing misperception [18]. Although his focus was on misperception, not deception, his suggestions for minimizing misperception are equally applicable to situations where deliberate deception is involved. Jervis suggests four broad themes for compensating for perceptual errors:

- Making assumptions and predictions explicit;
- The use of devil's advocates;
- Making sure that identities and missions do not become linked with specific theories and images;
- Awareness of common misperceptions.
When considering deception, the first theme might be restated as making assumptions, preconceptions, and beliefs explicit. Jervis writes [19]:
The failure to examine the plausibility of crucial beliefs, especially those relating to ends and means, is evident when the environment changes in a way that should, but does not, lead to changes in beliefs and policies. For example, one reason why the United States was taken by surprise at Pearl Harbor was that the initial analysis of Japan's alternatives had led to the reasonable conclusion that Japan would not attack American territory. But as the situation changed, American decision makers realized that Japan might strike at the Philippines. Since such an attack meant war with the United States, Americans should have noted that one of the major reasons why the Japanese would not attack Pearl Harbor was now removed and should have looked at the dangers again.
Jervis states that making beliefs and assumptions explicit requires not only understanding the elements that make up those beliefs and assumptions but also an examination of what evidence would confirm or disconfirm them. He suggests that [20], "If they are aware of what they expect, or rather what their images and beliefs should lead them to expect, actors will be more apt to heed unsettling incidents." Such awareness also extends to thinking about what events are excluded by the actor's assumptions and beliefs, with the hope that he would be more likely to notice and react to those events if they occur, as well as heighten his sensitivity to discrepant information.
Jervis [21] uses the concept of devil's advocates to emphasize the need for encouraging the formulation and application of alternative images, noting that it is often politically and psychologically difficult for any one person to consider multiple alternatives. Jervis also has an interesting perspective on cognitive bias in this regard [22]: "Rather than seeking unbiased treatments of the data, decision-makers should seek to structure conflicting cognitive biases into the decision making process to help themselves maintain their intellectual freedom." In other words, instead of trying to eliminate cognitive biases altogether, decision-makers should take advantage of them in order to produce differing perspectives of a given situation. In that same vein, Jervis continues [22], "To make it more likely that they will consider alternative explanations of specific bits of data and think more carefully about the beliefs and images that underlie their policies, they should employ devil's, or rather devils', advocates." Jervis admits that a devil's advocate is unlikely to produce the correct image; however, incorporating devil's advocacy into the process has two major benefits. First, it exposes decision-makers to alternative explanations of events, thereby forcing them to exercise judgment as opposed to seeing one view as the only possible alternative. Second, devil's advocacy helps to expose the assumptions and beliefs discussed earlier.
Jervis offers less detail regarding the last two themes. The third theme addresses the potential dangers that arise when the mission and identity of individuals and organizations become too closely tied to specific theories and images of other actors. He cites as an example the U.S. Air Force's post-World War II resistance to guided missiles [23]: "The members of the organization had come to see its distinctive mission not as carrying out strategic bombardment, but as carrying out strategic bombardment by means of manned bombers." The deception implications should be obvious. If mission and identity are too closely tied to specific beliefs, a deceiver can manipulate those beliefs knowing that it is likely that information about other alternatives will be made to fit those beliefs or will not even be considered. Finally, Jervis [24] concludes his chapter on minimizing misperception with a general call for decision-makers to take account of the ways in which the processes of perception lead to common errors. The hope is that if decision-makers are aware of these biases and errors, they will be more likely to take measures to decrease misperception by avoiding or compensating for common perceptual errors, decrease their overconfidence in prevailing beliefs, become more sensitive to alternative perspectives, and perhaps reduce the amount of discrepant information required to make them reconsider those beliefs.
5.2.2 The Theoretical Work of the 1980s
As we saw in Chapter 2, the 1980s saw the publication of a number of journal articles and books marking the start of a more theoretical approach to the study of deception. In 1980, the CIA's Mathtech Deception Research Program published a report, Deception Maxims: Fact and Folklore, which described the 10 deception maxims summarized previously in Figure 2.7. That report also addressed the counterdeception implications of three of those maxims in a single paragraph at the end. Maxim 1 states that it is easier for the target to maintain preexisting beliefs even in the face of evidence that contradicts those beliefs, implying that it is important to examine one's own beliefs for exploitable weaknesses in order to be less susceptible to deception. Maxim 4, Jones' Lemma, suggests that the deceiver should try to control as many of the channels available to the target as possible. The counterdeception implication is that the target should not rely on only one or two channels of information but should employ redundant sensors to increase the likelihood that incongruities can be detected. Finally, Maxim 6 counsels the deceiver that there are situations where deception assets should be husbanded until they can be put to more fruitful use. The implication of this maxim, then, is for the target to consider the stakes involved in any situation when evaluating the adversary's options: higher stakes may warrant the adversary using those husbanded deception assets.
Shortly thereafter, Richards Heuer published his landmark article, "Strategic Deception and Counterdeception: A Cognitive Process Approach." Although his article dealt primarily with the cognitive biases relevant to the problem of deception, Heuer also addresses the subject of counterdeception by reviewing three commonly advocated approaches and suggesting two more approaches of his own. The first three approaches are:

- Improved intelligence collection;
- Increased alertness to deception;
- Weighting of tactical indicators.
With regard to improved intelligence collection, Heuer notes [25] that advances in technical collection systems have improved the intelligence community's overall capabilities but that such systems have contributed little toward improving estimates of intentions, strategy, or political dynamics. While improvements in intelligence collection are desirable, Heuer offers his belief [25] that such improvements are unlikely to significantly reduce one's vulnerability to deception and goes on to state, "Any systematic counterdeception program must focus primarily on problems of analysis, only secondarily on collection." Ideally, increased alertness to deception would stimulate a more thorough review of the information available, and Heuer concedes that this is possibly the case if the possibility of deception has not already been considered. In such a case, Heuer [26] notes that simply focusing on this possibility may be sufficient to identify overlooked information or prompt a change in analytical perspective. Nevertheless, he is generally pessimistic about the ability of alertness alone to detect deception and makes the case that such alertness is more likely to detect deception where it does not exist, lead analysts to be overly skeptical of all the information on hand, and, when deception is present, cause analysts to dismiss the wrong evidence. The weighting of tactical indicators approach is based on Abraham Ben-Zvi's [27] study of surprise military attacks. Ben-Zvi found that tactical indicators of an impending attack were often discounted because they did not agree with the preconceptions and strategic assumptions held by analysts and commanders. Although Heuer agrees that analysts and decision-makers should be more open to changing their minds in the face of discrepant information, giving more weight to such indicators will increase the false alarm rate, and it is often difficult or impossible to know whether in any given situation it is better to heed the indicators or hold on to the established view.
Heuer's own suggestions fall into two categories: cognitive aids to analysis and organizational measures. The first category consists of alternative hypotheses and breaking mental sets. What has come to be known as Alternative Competing Hypotheses (ACH) is a response to the fact that research shows that people do a poor job of generating a sufficiently full set of hypotheses when analyzing a situation. As Heuer notes [28], "If the correct hypothesis is not even formulated for consideration, there is clearly little chance of making an accurate judgment." This failure to generate sufficient hypotheses is aggravated by other biases such as confirmation bias. Evidence tends to be evaluated in terms of how well it supports a hypothesis, and the fact that such evidence may be consistent with other alternative hypotheses is often overlooked. For Heuer [28], "The systematic identification, examination, and testing of alternative hypotheses is one of the keys to the successful uncovering of deception." We will examine ACH in more detail in later chapters. Heuer [29] also proposes that methods for breaking mental sets are particularly relevant for counterdeception analysis. He suggests methods such as the devil's advocate, interdisciplinary brainstorming, and other techniques that facilitate the identification and systematic analysis of alternative perspectives [29]. The organizational measures that Heuer proposes focus primarily on the creation of a counterdeception staff as a form of "deception insurance." Heuer bases this suggestion on research showing that one of the most difficult cognitive tasks that a person can be called upon to perform is to reorganize information that they are already familiar with in order to view it from a totally different perspective. The more complex the information and the longer that one has held certain beliefs about what the information means, the more difficult this task becomes. Heuer suggests that a dedicated counterdeception staff is necessary to address complex questions concerning deception that cannot be handled by an individual analyst using cognitive aids.
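A minimal sketch of the ACH bookkeeping may help fix the idea; the hypotheses, evidence scorings, and the count-the-inconsistencies ranking rule below are a simplified rendering of Heuer's method, with all particulars invented:

```python
# Minimal sketch of Heuer-style competing-hypotheses analysis.
# 'C' = evidence item consistent with the hypothesis, 'I' = inconsistent,
# 'N' = neutral/not diagnostic. Hypotheses and scorings are hypothetical.

evidence_matrix = {
    "exercise only":     ["C", "C", "I", "N"],
    "imminent attack":   ["C", "C", "C", "C"],
    "deception (feint)": ["C", "N", "C", "C"],
}

def rank_hypotheses(matrix: dict) -> list:
    """Rank hypotheses by inconsistency count, fewest first: Heuer's
    point that disconfirmation, not confirmation, discriminates."""
    return sorted(matrix, key=lambda h: matrix[h].count("I"))

ranking = rank_hypotheses(evidence_matrix)
# "exercise only" has one inconsistency and falls behind the other two.
# Note that consistency alone cannot separate "imminent attack" from
# "deception (feint)": the same evidence fits both, which is exactly why
# counterdeception analysis must hunt for discriminating evidence.
```

The tie between the top two hypotheses is the instructive case: evidence that supports a hypothesis but is equally consistent with its rivals, the situation Heuer says is often overlooked, contributes nothing to the ranking.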
In 1982 two books were published that addressed deception and surprise: Strategic Military Deception, edited by Daniel and Herbig, and Military Deception and Strategic Surprise, edited by Gooch and Perlmutter. In Strategic Military Deception, Daniel and Herbig [30] describe two groups of factors that influence the likelihood of deception. The first group is related to the different situations that confront a would-be deceiver, while the second group reflects certain characteristics of the deceiver. Although Daniel and Herbig do not mention that these factors could be used for counterdeception, we suggest that they represent potentially useful cues for analysts and decision-makers to be aware of. The situational factors include:

- High-stakes situations. Such situations encourage an adversary to use every capability at his disposal to ensure success or avoid defeat.
- Lack of confidence in a situation's outcome due to military weakness. Deception is a useful way of compensating for an adversary's superior strength.
- Lowering the costs of even an optimistic situation. In addition to using deception in order to avoid human and material losses, an adversary may employ deception in order to avoid the political and economic costs of being viewed as an aggressor.
- Uncertain situations. A deceiver may use deception in order to keep his options open and to test his adversary's reactions to different actions.
The second group consists of factors related to the deceivers previous condi-tioning or personal predilection and includes:
Cultural norms. Cultural factors may affect when and how deception is used.
Political leaders who play a strong, central role in military decisions. Deception may be more common in situations where this is the case, particularly in dictatorships and authoritarian regimes.
Bureaucratic and psychological pressure. This factor is based on two traits common to many bureaucracies. The first trait is that organizations trained for particular tasks will seek to perform them. The second trait is related to the availability heuristic: people tend to think in terms of what is available to them. The first trait implies that an adversary that maintains the capability to plan, organize, and execute deception operations is more likely to use deception than one that does not. The second trait suggests that, even if deception is not incorporated formally as doctrine, an adversary that has some familiarity with it is more likely to use it than one that has none.
Personal predilection. Leaders and commanders who appreciate deception and have relied on it in the past are likely to do so again.
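Read as counterdeception cues, the Daniel-Herbig factors amount to a rough screening checklist. The sketch below is our own construction, not the authors': it simply counts how many situational and predispositional cues are present, which is the coarse judgment an analyst would make informally.

```python
# Hypothetical screening aid based on the Daniel-Herbig factors: count
# how many situational and predispositional cues for deception are present.
# The cue names are our shorthand for the factors in the text.

SITUATIONAL = ("high_stakes", "military_weakness",
               "cost_avoidance", "uncertainty")
PREDISPOSITION = ("cultural_norms", "centralized_leadership",
                  "deception_capability", "personal_predilection")

def deception_cue_count(observed):
    """Return (situational hits, predispositional hits) for observed cues."""
    s = sum(1 for c in SITUATIONAL if c in observed)
    p = sum(1 for c in PREDISPOSITION if c in observed)
    return s, p

# Example: a weaker adversary in a high-stakes crisis whose leader has
# relied on deception before.
s, p = deception_cue_count({"high_stakes", "military_weakness",
                            "personal_predilection"})
```

The counts carry no probabilities; the point is only that the more cues are present, the stronger the case for treating deception as a live hypothesis.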
Paul Moose's chapter in Strategic Military Deception [31] presents an elementary systems model that envisions a dynamic relationship between two adversaries (Green and Purple) and their environment. This produces an event stream that is
the result of each side's actions in response to the other as well as the environment. Moose's concept of counterdeception [31] involves a plan where the target (Purple) hypothesizes two measurably different near-term event streams, depending on whether a deception is present or not, and initiates activities that precipitate some action on the deceiver's (Green) part in one of these streams, which may reveal the deceiver's real intentions. The target then uses his own feedback channels to observe how the deceiver reacts to the target's reaction. Of course, the risks of waiting while the counterdeception plan unfolds versus acting on one of the hypothesized event streams must be considered. Moose also provides some general prescriptions regarding counterdeception. He states that [31], "The most effective way to prevent deception is to be continually aware of one's vulnerabilities as a target." He also notes that one should be skeptical about signals that encourage procrastination or inactivity, and that the "leaky" nature of the adversary's internal communications (i.e., unintentional signals that might reveal the adversary's true intentions) should be exploited.
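Moose's comparison of two hypothesized event streams can be sketched as a likelihood-ratio update, a standard Bayesian formulation that we supply here for illustration; Moose's chapter is not this explicit, and the event probabilities below are invented. Each observed event shifts the odds toward "deception" or "no deception" according to which hypothesized stream predicted it better.

```python
# Sketch of Moose's two-event-stream comparison as a Bayesian odds update.
# Event probabilities under each hypothesis are notional, not from Moose.

def update_odds(prior_odds, events, p_given_deception, p_given_honest):
    """Multiply the prior odds (deception : no deception) by each
    observed event's likelihood ratio."""
    odds = prior_odds
    for e in events:
        odds *= p_given_deception[e] / p_given_honest[e]
    return odds

# Purple probes Green and watches the reaction.  Under the deception
# hypothesis, Green is expected to ignore the probe to preserve his story.
p_d = {"no_reaction_to_probe": 0.7, "units_reposition": 0.2}
p_h = {"no_reaction_to_probe": 0.2, "units_reposition": 0.6}

odds = update_odds(1.0, ["no_reaction_to_probe"], p_d, p_h)
posterior = odds / (1 + odds)  # probability that deception is present
```

Even one well-chosen probe can move the odds substantially, which is why Moose weighs the risk of waiting for the plan to unfold against acting immediately.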
Also in Strategic Military Deception is a chapter by Theodore Sarbin, a narrative psychologist [32]. He proposes a theory of counterdeception that assumes that [33] "human beings think, perceive, and imagine according to a narrative structure." As we saw in Section 3.2.2.2, he suggests that the authors of strategy emplot narratives, and observes that [33], "The task of the counterdeception analyst of strategy is like the task of the literary critic or the dramatic critic: to fathom the intentions of the author, to understand, to decipher the meaning of the creative work." Given that deception typically represents a unique case where the context is a critical determinant of the actor's behavior, the target of deception cannot rely on statistical approaches or case studies (sagacity) to predict the deceiver's real intentions due to the lack of meaningful base rates. Therefore, the counterdeception analyst must rely on acumen, the empathic skill to take on the role of another. This ability is related to the person's ability to decenter (to switch from one's own egocentric perspective and see things from another's perspective), and Sarbin suggests that [34], "From literary and autobiographical sources, one can infer that the person who is successful in taking the role of another is able to construct a scenario, a story, and place himself in relation to the other features of the story, physical features such as geography and climate, social features, such as role relationships with multiple role players." Such abilities help the person gifted with acumen succeed in consistently predicting the actions of others and are the stock in trade of someone who can penetrate the masks or expose the lie of the adversary [34].
Acumen is therefore an important skill for intelligence and counterdeception analysts to possess, and Sarbin offers the hypothesis that analysts possessing the skill of acumen are more likely to identify the form of the narrative contained in the strategic plans of an adversary. He also poses two interesting questions in this regard. Are certain kinds of plots related to kinds of ethnic origins or national heritages? Can acumen be taught and learned? Sarbin asserts that literary historians are able to successfully identify the different forms of emplotment they encounter, but admits that they have the benefit of hindsight. Analysts, on the other hand, face the problem of having to construct a plot from antecedent events and try to predict the outcome, making their task tremendously more difficult. The difference is that [35], Unlike
the historian who emplots a narrative about events that have occurred in the past, the analyst of strategy must emplot concurrent events, events that are not frozen but fluid. With regard to teaching and learning acumen, Sarbin suggests that there may be ways to recognize optimal cognitive strategies for identifying the events associated with a specific plot structure [36], that is, "When is an event an event?"
In the mid-1980s three closely related books by British authors (see the Haswell, Dewar, and Latimer portion of Section 2.2.3) appeared, but only one specifically addressed the topic of counterdeception. In The Art of Deception in Warfare, Dewar devotes a chapter to counterdeception [37], in which he summarizes Whaley's concepts of deception and uses them to make a number of points. These can be categorized into two broad areas:
Macroscopic knowledge of the adversary;
Microscopic analysis aimed at discovering inconsistencies.
Dewar notes [38], "A detailed knowledge of the enemy is as important in countering deception as it is in achieving it." This knowledge must extend to [39] "a macroscopic appreciation of the enemy's fears, aims, prejudices, and habits," and analysts must also be able to "see things from the enemy's point of view, think as the enemy thinks, list the options open to him and decide what is most probable" [40]. At one point Dewar goes as far as stating [41], "Thus the main, almost the only, weapon of the deception analyst is to put himself in the mind of the deceiver." This knowledge includes recognizing that the adversary's deception plans are themselves seldom flawless, thus creating the opportunity for the microscopic search for the flaws in the pattern of the deceiver's deception plan. Here Dewar seems to be advocating a kind of analytical preparedness [42]: "Defence against deception therefore requires a sustained questioning of evidence, a search for its corroboration and a review of previous deductions as fresh evidence is produced. In particular, it is helpful to look for small and obscure clues which are missing and which would prove or disprove the authenticity of the existing evidence." For Dewar, the golden rule of counterdeception is to avoid jumping to conclusions. He warns that deceivers thrive on the pressure that analysts labor under to provide timely assessments and predictions, and urges analysts to resist the temptation to jump to conclusions whenever possible.
Dewar acknowledges the difficulty of looking at a situation from different perspectives, noting that increased alertness to the potential for deception is largely ineffective, but suggests that a devil's advocate is one way that the data can be subjected to competitive analysis. Dewar summarizes his approach to counterdeception by reminding analysts that first impressions are difficult to change and that different starting points lead to different conclusions, and concludes [43]: "That is why competitive analysis should be undertaken whenever possible. Or to put it more simply, two heads are better than one."
The end of the 1980s saw the publication of Michael Handel's War, Strategy, and Intelligence. This book includes work that appeared as journal articles or as chapters in other books (e.g., Gooch and Perlmutter), and several of these address the topic of counterdeception both directly and indirectly. Handel is strongly pessimistic with regard to the possibility of preventing or forestalling surprise attack, and this
pessimism is reflected in his general skepticism regarding counterdeception. Nevertheless, he offers six possible deception countermeasures, noting that [44], "Some can be suggested, although their effectiveness cannot be guaranteed." The first suggestion, avoid overreliance on one source of information, emphasizes that potentially valuable information collected in one channel should be independently verified by sources in other channels. As we saw in Chapter 3, German overreliance on one channel of information (their network of agents in Britain, all controlled by the Allies) was a major factor in the success of Allied deception efforts. The next four suggestions address what can perhaps be the most important channel of information available to the deceiver: the double agent. His suggestions reflect several lessons that can be drawn from Germany's experience as the target of the Allies' Double Cross operations in support of the Fortitude deception. These include:
Never rely exclusively on nonmaterial evidence. Handel quotes Clausewitz's remark that [44], "Words being cheap, are the most common means of creating false impressions." In other words, written or verbal information that an agent provides about physical entities must be checked and verified by other sources (e.g., an agent report about the location of a missile battery should be verified by imagery or signals intelligence). This suggestion also applies to information obtained through communications intercepts.
Never rely on agents who have not been seen or directly interviewed. Much of the success of the FORTITUDE SOUTH deception is credited to the double agent GARBO, and much of GARBO's success as a double agent was due to his ability to convince the Germans that he controlled a network of subagents. Unfortunately for the Abwehr and the German high command, this entire network was fictitious. All of GARBO's subagents, including his deputy, were notional. Handel [44] notes that this suggestion carries even more weight if the information that is received from possibly notional agents "dovetails nicely with one's own preferences or needs, or when it fits without contradictions into the reports of other possibly notional agents."
Check and double-check discrepancies in agent reporting. Handel suggests that there are two situations where extra caution should be exercised when relying on agent reporting. First, there is the situation in which an agent's reports initially appear to be correct but then turn out to be wrong on an important issue, and yet somehow the agent always seems to have a good explanation for each discrepancy. The second situation calls for even more caution. Here Handel even suggests a special investigation of any agent who supplies high-quality information of the greatest importance [44, 45], but only when it is too late to be of any use, even if it arrives before the action it warns against has taken place.
Controllers of agents should also be encouraged to heed more closely the opinions of lower-level intelligence analysts. Since the targets of most strategic deception operations are top-level decision-makers, commanders, and intelligence managers, Handel suggests that deception has a better chance of being detected by lower-level (not necessarily lower expertise or experience) analysts, since they are less likely to be biased by any specific strategy, wishful thinking, or political interests. Handel cites examples from World War I, World War II,
and the 1973 Yom Kippur War, noting that many of the negative or unpleasant conclusions reached by lower-level analysts were often ignored [46].
Handel's sixth suggestion makes it clear that it is necessary to know the adversary's limitations as well as his capabilities. This suggestion has its roots in mirror imaging and ethnocentric biases. Information about an adversary's capabilities and intentions must be analyzed in accordance with the adversary's political and military needs, not one's own. Projecting one's own preferences, fears, and doctrine onto the adversary only increases the likelihood that one will be deceived or surprised.
Handel provides other direct references to the problems associated with counterdeception using a puzzle metaphor. For example [47], "Under certain circumstances, the more perfectly an intelligence puzzle fits together, the greater the danger of a possible deception ploy. This is particularly true when information, the solution to an important and complex intelligence puzzle, is received in the absence of much noise or contradictory evidence, and when the resulting conclusions conform neatly to one's hopes and expectations." Other precautions for avoiding deception are related to anticipating surprise attack and include asking [48]: what are the most likely directions from which an adversary might attack, even if the available evidence contradicts these contingencies?
Handel's writings on strategic surprise and intelligence also indirectly address important counterdeception issues. For example, Handel discusses the roles that preconceptions, ethnocentrism, and misperception play in the problem of strategic surprise, and he attributes perceptual errors [49] to projecting one's own culture, ideological beliefs, military doctrine, and expectations onto the adversary (i.e., seeing him as a mirror image of oneself) or to wishful thinking. To counter these ethnocentric biases, Handel makes the general suggestion of "know thine enemy," that is, develop a thorough and in-depth knowledge of an adversary's language, culture, politics, and values, as well as devoting more time and resources to "knowing thyself."
In addition, Handel discusses two mechanisms that are related to the subject of improving the objectivity and variety of input into the intelligence process. These mechanisms are also relevant to the challenge of countering deception. The first is multiple advocacy. The idea behind this concept is that multiple, independent intelligence agencies do a better job of providing decision-makers with a wider spectrum of views than does a single, centralized intelligence organization. The pros and cons of multiple advocacy are beyond the scope of this chapter; however, the contribution that it makes to counterdeception is to counteract a number of factors that tend to make the deceiver's job easier (e.g., the tendency to jump to conclusions and groupthink). The second mechanism is the devil's advocate. The purpose of a devil's advocate is to help ensure that dissenting, possibly unpopular, opinions are heard and evaluated. Again, the pros and cons of devil's advocacy are outside the scope of this chapter, but it is interesting to imagine what the results might have been in May 1940 if the French had had an effective devil's advocate to warn them of the possibility of a German offensive through the Ardennes.
The end of the 1980s also saw the end of the Cold War, and as we noted in Chapter 2, deception research entered a hiatus period that was to last until the
revelations of Operation Desert Storm and other events like the Indian nuclear test in 1998 made it clear that the need for understanding deception and improving ways to counter it had not disappeared. Interest in deception surfaced once again and has resulted in new practical and theoretical work by both some familiar early researchers and some new faces.
5.2.3 Current Directions
On the practical side, there is the 1997 CIA release of a set of analytic tradecraft notes that are a standard counterdeception reference for analysts both inside and outside of the CIA [50, 51]. Note 10, "Tradecraft and Counterintelligence," begins with an admonition to analysts to show increased respect for the deceiver's ability to manipulate perceptions and judgments, and then describes two sets of warning signs that signal the possible existence of a deception operation. The first set goes by the acronym MOM, which stands for means, opportunity, and motive, and addresses the likelihood that a potential adversary is deliberately trying to distort the analyst's perceptions. Means addresses the adversary's experience and capabilities with regard to planning and executing sophisticated deception operations, while opportunity is related to the sources (channels) of intelligence available to the analyst. If the adversary is known to have knowledge of a source (e.g., a technical collection system), then he may have the opportunity to conceal information from that source or to deliberately distort the information the source collects. Finally, does the adversary have a motive to use deception? If all three warning signs are present, the analyst is wise to suspect that an adversary may resort to deception in order to achieve his goals.
The second set of warning signs focuses on anomalies that analysts should be on the lookout for regarding what they know, how they know it, and what they don't know. These warning signs include suspicious gaps in collection, contradictions to carefully researched patterns, and suspicious confirmations. Gaps in collection can be considered suspicious when information received through one channel is not supported by other channels, especially when such confirmation would be considered normal. If new information contradicts well-supported trends and patterns, analysts need to critically examine such new information if it signals an inexplicable change in the adversary's priorities, behaviors, and practices. Information received from one or more sources that seems to conveniently reinforce the rationale for or against one's own strategy or policy might also be considered suspicious. In such cases, the fact that multiple sources seem to corroborate one another may not necessarily mean the information is authentic.
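The two sets of warning signs from Note 10 can be combined into a simple screening sketch. The sign names below follow the note, but the flag logic and warning levels are our own simplification, not the CIA's procedure:

```python
# Simplified screen over the Note 10 warning signs: MOM (means,
# opportunity, motive) plus the three anomaly signs.  The escalation
# thresholds are illustrative, not from the tradecraft note.

MOM = ("means", "opportunity", "motive")
ANOMALIES = ("suspicious_gaps", "contradicted_patterns",
             "suspicious_confirmations")

def deception_warning(signs):
    """Return a coarse warning level from the set of observed signs."""
    mom_all = all(s in signs for s in MOM)
    anomaly_hits = sum(1 for s in ANOMALIES if s in signs)
    if mom_all and anomaly_hits:
        return "suspect deception; escalate analysis"
    if mom_all or anomaly_hits >= 2:
        return "elevated risk; apply devil's-advocate test"
    return "routine; standard tradecraft"

level = deception_warning({"means", "opportunity", "motive",
                           "suspicious_gaps"})
```

The value of a screen like this is not the labels but the forcing function: it makes the analyst record which signs were actually checked rather than relying on a general sense of unease.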
Finally, Note 10 offers analytical tradecraft tips for dealing with the risk of deception when making intelligence assessments on complex issues. In the case of regular issues (those where there is no specific reason to suspect deception), the analyst is advised to employ a two-step process as insurance against the risk of deception. The first step is to organize the information important to his conclusions and then critically examine it using the six warning signs mentioned previously. The second step calls for the analyst to play the role of devil's advocate and develop a hypothetical argument that deception is in fact taking place. In the case of suspect and sensitive issues, the note recommends undertaking an even more in-depth evaluation of
the information at hand and annotating any resulting reports with a text box or annex that conveys to the reader that the possibility of deception has been considered seriously, that appropriate analytic testing to determine the likelihood of deception has been done, and that any reasonable doubts about the resulting analysis are noted.
The scientific community's interest in deception, which had been primarily focused on lying and deceit, also began to attract attention in national security circles in the 1990s. For example, Johnson et al. [52] have investigated the processes used by accounting auditors to detect the fraudulent manipulation of information in financial statements. They use Whaley's model of deception (i.e., masking, repackaging, dazzling, mimicking, inventing, and decoying) as the basis for the tactics that a deceiver can use to manipulate the target's processes of searching, processing, and interpreting information. They then propose a process for detecting deception that consists of three components:
First, the deception target identifies inconsistencies between his observations and his expectations for those observations.
The target then determines that those inconsistencies are functional to thegoals of the deceiver.
Finally, the deception target identifies the potential actions of the deceiver that can be associated with one or more deception tactics and assesses the deceiver's ability to create the observed inconsistencies.
They then develop a competence model based on this process for detecting financial statement fraud. This model (see Figure 5.2) consists of four steps:
[Figure 5.2 depicts the fraud detection method as a four-stage pipeline: cues (financial statements) enter the activation stage (compare cues and expectations), which yields inconsistencies; hypothesis generation (apply detection tactics) produces initial hypotheses; hypothesis evaluation (assess impact, evaluate materiality) yields material hypotheses; and global assessment (aggregate) produces the diagnostic outcome.]
Figure 5.2 Fraud detection method. (From: [52]. © 2001 Cognitive Science Society. Reprinted with permission.)
activation, hypothesis generation, hypothesis evaluation, and global assessment. The activation step produces the expectations for the values of the various cues that might be found when an auditor examines a financial statement (e.g., an inventory balance value). These expectations are then compared to the actual observed values. Observed values that exceed expectations by some amount are then labeled as inconsistencies. The next step is to generate a set of hypotheses that explain the inconsistencies. In the auditing process examined by Johnson and his team, there are three possible hypotheses: accounting errors, insufficient disclosure, and, of course, deception. In this model, the deception hypothesis is generated when the inconsistencies satisfy the additional conditions of the detection process described earlier (i.e., functionality and feasibility).
In step three, the hypotheses are evaluated on the basis of their materiality. Materiality is an accounting term that is defined as [53] "the magnitude of an omission or misstatement of accounting information that, in the light of surrounding circumstances, makes it probable that the judgment of a reasonable person relying on the information would have been changed or influenced by the omission or misstatement." The error and disclosure hypotheses are evaluated primarily on the magnitude of the difference between the expected and observed values of the financial statement cue. The basis for evaluating materiality for the deception hypothesis, though, depends on the deception tactic that is suspected to have been used (e.g., if the repackaging tactic is suspected, then the items that have been deliberately miscategorized should be recategorized using a worst-case scenario assumption). The global assessment step aggregates and evaluates confirmed hypotheses to produce the final rating of the company's financial statement: unqualified (the statement is clean), unqualified+ (the auditor adds a paragraph noting a lack of consistency or some other concerns or uncertainty), or misleading. Finally, the model was implemented as a computer program that uses various data from financial statements to produce the final rating. The program successfully issued the correct global assessment rating for each of six cases it was given.
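The four steps can be sketched in code. The cue names, tolerance, and ratings below are invented for illustration, and the materiality-evaluation step is collapsed into the aggregation; the published competence model is far richer than this:

```python
# Toy version of the Johnson et al. four-step fraud-detection model:
# activation -> hypothesis generation -> (evaluation) -> global assessment.
# Thresholds and cue names are hypothetical.

def activation(expected, observed, tolerance=0.1):
    """Flag cues whose observed value differs from expectation by more
    than the tolerance (as a fraction of the expected value)."""
    return {cue: observed[cue] for cue in expected
            if abs(observed[cue] - expected[cue]) > tolerance * expected[cue]}

def generate_hypotheses(inconsistencies, functional_to_deceiver):
    """Error and disclosure are always candidate explanations; deception is
    generated only when the inconsistency also benefits the deceiver."""
    hyps = {}
    for cue in inconsistencies:
        hyps[cue] = ["error", "disclosure"]
        if cue in functional_to_deceiver:
            hyps[cue].append("deception")
    return hyps

def global_assessment(material_hypotheses):
    """Aggregate confirmed, material hypotheses into a final rating."""
    if any("deception" in hs for hs in material_hypotheses.values()):
        return "misleading"
    if material_hypotheses:
        return "unqualified+"
    return "unqualified"

expected = {"inventory_balance": 100.0}
observed = {"inventory_balance": 140.0}
inc = activation(expected, observed)
hyps = generate_hypotheses(inc, functional_to_deceiver={"inventory_balance"})
rating = global_assessment(hyps)  # the overstated balance helps the deceiver
```

The structure mirrors the figure: each stage consumes the previous stage's output, and the deception hypothesis only enters the pipeline when the functionality condition is met.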
Even while counterdeception was attracting the interest of new researchers like Johnson and his associates, some familiar names were still active in the field. It should probably not be a surprise that, over 30 years later, Barton Whaley was still active, contributing two chapters (one with magician Jeff Busby) to the book Strategic Denial and Deception: The Twenty-First Century Challenge [54]. In the chapter coauthored with Busby, Whaley focuses entirely on counterdeception. Whereas many authors in the field of strategic deception are quite pessimistic about the prospects of successful counterdeception, Whaley (the principal author) offers a surprisingly optimistic perspective on the topic [55]: "I am optimistic that deceptions can be detected regardless of the field in which they occur. In theory, deception can always be detected, and in practice often detected, sometimes even easily." He proposes a general theory of counterdeception based on a wide range of sources, including results from case studies of different types of professionals who regularly deal with deception (e.g., intelligence analysts, police detectives, forensic scientists, art experts, and magicians). Whaley found that all the professionals who were highly successful at detecting deception used a common set of methods and that those same methods were never or only infrequently used by those who did poorly. In addition, he found these methods to be largely intellectual rather than technological in nature. Technology in the form
of sensors that extend our natural senses and information technology that extends our ability to recall and manipulate information is important; however, deception detection remains a subtle, human intellectual process, as we will see when we examine the different elements of his theory.
This theory of counterdeception starts with two general components (see Figure 5.3): a taxonomy of detectables and the Plus-Minus Rule. Five of the nine categories of detectables (intention, time, place, strength, and style) have their origins as the modes of surprise in Whaley's original work on stratagem [56]. The remaining four (pattern, players, payoff, and channel) make their appearance in Bell and Whaley's Cheating and Deception [57] in 1991. Together they represent the set of things that the deceiver will either conceal or reveal, and they provide the counterdeception analyst with a checklist for the kinds of questions that must be considered when trying to determine the existence and form of a deception operation. The Plus-Minus Rule, on the other hand, is the cornerstone of their theory. This rule is based on the fact (noted by R. V. Jones in 1942) that [58], "No imitation can be perfect without being the real thing." Therefore, even though the imitation may share many of the characteristics of the original, it must lack at least one characteristic marking the original, and it will often have at least one characteristic that the original does not possess. According to Whaley [59], "If either a plus (added) or a minus (missing) characteristic is detected, the imitation stands revealed." Note that a most important corollary of this rule is that the detective need not discover all the discrepancies or incongruities; a single false characteristic, whether plus (added) or minus (missing), is quite enough to prove the fakery.
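In computational terms, the Plus-Minus Rule reduces to set comparison: any characteristic the candidate adds or lacks relative to the genuine article reveals it. A minimal sketch, with invented characteristic names:

```python
# Plus-Minus Rule as set difference: an imitation is revealed by any
# characteristic it adds (a plus) or lacks (a minus) relative to the
# original.  The characteristics below are hypothetical examples.

def plus_minus(original, candidate):
    """Return (plus, minus) characteristic sets; both empty means no
    discrepancy was detected, which is not proof of authenticity."""
    plus = candidate - original    # characteristics the original lacks
    minus = original - candidate   # characteristics the candidate lacks
    return plus, minus

real_tank = {"ir_signature", "track_marks", "shadow", "radar_return"}
decoy = {"shadow", "radar_return", "uniform_paint_sheen"}

plus, minus = plus_minus(real_tank, decoy)
fake = bool(plus or minus)  # a single plus or minus suffices
```

The corollary is visible in the last line: one discrepancy in either direction is enough, which is why Whaley's caveat about needing total certainty in the detected characteristic matters so much in practice.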
Whaley is quick to note, however, that the Plus-Minus Rule demands total certainty about the added or missing characteristic, and that while this is always possible, it is seldom likely in the real world. With this in mind, the next components of their counterdeception theory can be thought of as applied theory suitable to decision making under uncertainty. The first element, the Congruity-Incongruity Rule, flows from the Plus-Minus Rule and appears to be based on the results of Whaley's case studies of deception detection professionals. He found that these professionals clustered into two groups: congruity testers (e.g., scientists, internists, and historians) and incongruity testers (e.g., police detectives, interrogators, trial lawyers, and forensic pathologists). In the Congruity-Incongruity Rule, the emphasis is obviously all on incongruities [60]: every deception operation necessarily leaves at least two clues, incongruities about what is hidden and incongruities about what is displayed in its stead, and because neither simulation nor dissimulation can ever be done flawlessly, their detection is always possible. Discrepancies (incongruent clues) inevitably suggest alternative patterns (hypotheses) that are themselves incongruent (discrepant, anomalous, paradoxical) at some point with reality. In other words, detecting incongruities is the key to detecting deception.
The next several elements of the Busby and Whaley theory represent a portfolio of methods applicable to detecting deception:
Locard's Exchange Principle. Although normally associated with physical evidence, Whaley suggests it can also be applied to deception by adding psychological perceptions to the principle. Unfortunately, he does not offer any insights into how these perceptions are to be added.
[Figure 5.3 summarizes the Busby-Whaley theory of counterdeception as a set of elements with brief descriptions:]
General theory.
Categories of detectables: Pattern, players, intention, payoff, place, time, strength, style, and channel.
The Plus-Minus Rule: A single false characteristic, either one the real entity does not possess (a plus) or one it lacks (a minus), is sufficient to prove the entity is fake.
Decision making under uncertainty.
The Congruity-Incongruity Rule: Real entities are completely congruent with all of their characteristics; therefore, every false entity will display at least one incongruity.
Locard's Exchange Principle: A perpetrator always leaves some physical evidence at the crime scene and always takes some away.
Verification: It is always possible to find a way to verify a hypothesis.
The Law of Multiple Sensors: Multiple sensors will almost always prove more effective than a single one, even when each is less precise.
Passive and active detection: Deception may be detected by analysis (passive) supported by active intervention aimed at collecting missing key facts. This intervention takes the form of defining new collection requirements or running controlled experiments, including the use of traps and tripwires, to trick the adversary into betraying himself.
Predetection: Predicting an adversary's deception plans by analysis of his deceptive style, capabilities, and goals.
The prepared mind and intuition: The ability to not only discover the meaning of chance events but also to make effective use of that knowledge depends wholly on systematic mental preparation. Such mental preparation also makes intuition possible.
Indirect thinking and the third option: The goal of indirect thinking is to come up with an indirect answer, the third option that the adversary was not expecting.
Busby's ombudsman: The essence of the Ombudsman Method is to force one to confront straight on that nagging, almost subliminal, sense of unease about a situation or person that somehow does not seem quite right, that does not quite fit as it should; those little incongruities that signal a deception in progress [66, p. 217].
Penetration and counterespionage (espionage, counterespionage): The adversary's deception plans can be discovered by penetrating the adversary's organization with human agents or technical collection devices, and by discovering the deceiver's double agents within one's own organizations.
Figure 5.3 The Busby-Whaley theory of counterdeception.
Verification. Once the Congruity-Incongruity Rule, Locard's Exchange Principle, or some other method provides evidence of deception, Whaley suggests that it is always possible to find a means of verifying the deception hypothesis. Of course, the costs of doing so may be prohibitively high, but it could be done.
The Law of Multiple Sensors. This law is based on the insights of R. V. Jones, who noted that [60], "The ease of detecting counterfeits is much greater when different channels of examination are used simultaneously." Whaley notes that multiple sensors are almost always more effective than a single one and are also less vulnerable to countermeasures.
Passive and active detection. To Whaley [61], passive detection is synonymous with the straightforward analysis of evidence and always leads to inconclusive results unless all the key facts are available. Therefore, active detection must be used to collect the missing facts. Active detection involves levying new collection requirements on the various INTs (e.g., HUMINT, IMINT, and SIGINT) or running controlled experiments to provoke the adversary into creating new evidence that might reveal the deception.
Predetection. J. C. Masterman of World War II Double Cross fame was also the author of two detective novels. In the second of these, The Case of the Four Friends: A Diversion in Pre-Detection, the detective in the story, Ernest Brendel, is persuaded [62] to tell the tale of how he "pre-constructed" a crime, rather than reconstructing it in the detective's normal fashion. As he says, "To work out the crime before it is committed, to foresee how it will be arranged, and then to prevent it! That's a triumph indeed, and is worth more than all the convictions in the world." Whaley makes the connection that predetection is a method whereby an adversary's deception plans can be discerned and defeated by analysis of the adversary's deception style, capabilities, and goals.
Penetration and counterespionage. Espionage is a powerful form of active detection that can be used to penetrate the adversary's intelligence, military, and command organizations. A well-placed asset may be all that is needed to reveal the adversary's deception plans. Counterintelligence (CI) and counterespionage (CE), on the other hand, seek to identify and neutralize the adversary's intelligence collection efforts, especially agents who have penetrated one's own organizations. CI and CE activities can cut off important paths through which the adversary obtains information about the target's preconceptions and beliefs, as well as the feedback needed to know how his deception operations are doing. In addition, CI and CE operations can reveal the existence of double agents being used as a channel for feeding the adversary's disinformation to the target.
The prepared mind and intuition. The prepared mind refers to a famous quotation by Louis Pasteur: "Dans les champs de l'observation le hasard ne favorise que les esprits préparés." Pasteur made this comment at a lecture given at the University of Lille in December 1854. Translated into English, it means "In the fields of observation, chance favors only the prepared mind," or more succinctly, "chance favors the prepared mind." The essence of Pasteur's
162 Principles of Counterdeception
remark is that the ability to recognize the significance of chance events and make effective use of that knowledge depends wholly on systematic mental preparation. On the other hand, intuition is [63] "our capacity for direct knowledge, for immediate insight without observation or reason." It is the police detective's hunch, Erwin Rommel's Fingerspitzengefühl, or the scientist's sudden awareness of the solution to a difficult problem while taking a shower. It is not unreasonable to think that accurate intuition is also a result of the same systematic mental preparation associated with the prepared mind.
Indirect thinking and the third option. Whaley uses the term indirect thinking in honor of B. H. Liddell Hart's theory of the indirect approach to strategy [64]. The essence of this theory is to avoid direct confrontations with the enemy and instead upset his equilibrium, keeping him uncertain about the situation and your intentions, and confront him with what he does not expect and is therefore not prepared for. Such an approach often yields a third option, one that the adversary was not expecting. The German advance through the Ardennes in 1940 is an excellent example of the indirect approach and the third option. The French expected a German attack against either the Maginot Line or through Belgium. Instead, the Germans came up with a third option, the attack through the Ardennes, and the Battle of France was over in just 44 days. Whaley is suggesting that the purpose of indirect thinking is to come up with an indirect answer, that third option, and that this ability to envision options available to an adversary that would otherwise be hidden or ignored is an essential method of counterdeception.
The final component of Whaley's overall theory of counterdeception is a specific method: the Jeff Busby Ombudsman Method. This method was developed by Busby in 1978 as a means of teaching casino employees to detect cheating without teaching them how to cheat at the games themselves. Whaley does not describe any of the details of the Busby Method; however, it is apparent that it is based on looking for discrepancies, irrelevancies, and misdirection [65] as well as some indirect thinking. He does state that [66], "The essence of the Ombudsman Method is to force one to confront straight on that nagging, almost subliminal, sense of unease about a situation or person that somehow does not seem quite right, that does not quite fit as it should: those little incongruities that can signal a deception in progress." Whaley suggests that the method seems the most promising of several suggested approaches for use in training analysts about deception as well as in the analysis of both current and historical cases of deception.
In another chapter of Godson and Wirtz's book, Paul Rossa identifies several key issues germane to counterdeception [67]. The first of these affirms the second principle of deception we proposed in Chapter 2: denial. Rossa notes that [68], "Uncovering secrets also is key to exposing deceptions. Consequently, counteracting foreign denial efforts is critical to countering foreign denial and deception." Even identifying the deceiver's denial operations helps the counterdeception effort by helping to task collection resources where they are most likely to do the most good. An adversary's efforts to conceal information about a subject can also suggest the possibility that deception operations associated with the subject exist, thereby affecting how all information on that subject is interpreted.
5.2 The Search for Ways to Counter Deception 163
Other issues are related to recognizing the existence of a deception operation. Rossa points out that deciding what information to use, discard, or reshape is hard enough for the analyst, and it is even more difficult when deception is involved. He makes the point that recognizing the existence of a deception operation depends heavily on the metadata (data about data) that is available. Examples of metadata include the way in which the data was acquired and the circumstances surrounding the acquisition. The metadata, along with the content of the information, may provide some hints regarding the presence or absence of a deception operation. Unfortunately, these hints are all too often ambiguous or contradictory.
Another important part of recognizing deception is paying close attention to information about a potential or suspected deceiver. What are his motives? Would the use of deception increase the likelihood of achieving his objectives? Has the adversary demonstrated a strong predisposition to the use of deception in the past? Does he possess the knowledge and capabilities to mount an effective deception operation?
Finally, Rossa addresses the issue of reducing one's own susceptibility to deception. One of the most important factors affecting the deception target's susceptibility is the extent of the adversary's knowledge about the target's strategies and methods of collecting intelligence, his analytic methodologies, and how the resulting intelligence is used to form judgments and make decisions. Such knowledge can come from a variety of sources, including the adversary's espionage operations or the unauthorized disclosure of secret information [69]. Reducing that susceptibility depends on counterintelligence and counterespionage operations as well as the development of information gathering and processing methods that are unknown to potential deceivers or that are difficult to manipulate. Nevertheless, such efforts do not entirely eliminate the risk of deception. Deception can still succeed even when the deceiver's information about the target is incomplete, and secret intelligence collection and analysis methods may still be affected by deception operations. Better analytic methods and tools can also contribute to reducing susceptibility to deception. Rossa suggests that [70]: "The intelligence community would profit by development of conceptual frameworks, indicators, and analytic techniques that hold promise for recognizing and countering foreign D&D as it occurs." He calls for qualitative and quantitative analysis of historical cases of deception and for D&D analysis to continue to evolve in order to keep pace with the issues and technologies associated with the post-Cold War world.
Scott Gerwehr and Russell Glenn represent a new generation of national security analysts whose focus is on strategic deception. In their report Unweaving the Web, Gerwehr and Glenn also address counterdeception and hypothesize that [71], "the most effective approaches to penetrating deception entail (1) combining more than one category of counterdeception and (2) applying the right category of counterdeception." They then identify five categories of counterdeception, three of which focus on defeating deception by emphasizing the collection and processing of data. The first three categories are:
The type or amount of data collected (e.g., using radar or hyperspectral sensors to defeat camouflage paints and netting).
The methods for collecting data (i.e., the methods by which the sensors are employed). For example, changes to collection plans or search plans may disrupt an adversary's attempts to conceal his movements.
The analysis of the data collected (for example, can alternative scenarios be developed using the same data?).
The fourth category focuses on unmasking deception through the use of one's own deceptions. For example, a feint or demonstration might force concealed units to maneuver or engage. The final category consists of strategies for rendering the adversary's deceptions moot. This is often the U.S. military's approach to counterdeception. For example, if an adversary has deployed numerous decoys among its real units (e.g., tanks or surface-to-air missiles), the U.S. ability to employ overwhelming firepower makes it possible to target all potential targets without bothering to tell them apart.
Effective counterdeception therefore depends not only on applying the right category of counterdeception methods but also on applying methods from more than one category. Gerwehr and Glenn suggest that much more research needs to be done to resolve the issues raised by questions such as:
What counterdeception methods should be matched to particular types of deception?
Which of those methods are the most effective against individual deception techniques or are effective against the broadest range of deception techniques?
What are the situational factors that affect their use?
Which methods require the most time or manpower to use effectively?
Which methods complement or interfere with each other?
Do any of the methods work against one type of deception technique but in turn increase the vulnerability to another?
Even more recent work has been done by Stech and Elsässer, who extend Johnson et al.'s model to develop a counterdeception business process [72]. The process links previous work done by Whaley, Jones, and Heuer to the Johnson model (see Figure 5.4), and Stech and Elsässer then use this process in an effort to improve the effectiveness of Heuer's analysis of competing hypotheses (ACH) method as a counterdeception tool. The first step of the process addresses the detection of anomalies using techniques based on Whaley's congruity-incongruity rule. One challenge for analysts, though, is that the detection of anomalies (incongruities) is not necessarily evidence of deliberate deception. Anomalies may result from sensor malfunctions, unintentional distortion or corruption of data or information during transmission, or analytical error. In fact, deception is often successful because the deception target explains away such anomalies, failing to correctly attribute them to deception. That is where the next step in the process comes in. There must be some way of linking anomalies to deception, and Stech and Elsässer propose that R. V. Jones's concept of deception "masking" provides such a means: analyzing the anomalies through multiple information channels. The third and fourth steps use ACH to assess the likelihood that the observed anomalies are
associated with a probable deceptive course of action (COA) and to evaluate the level of support for each identified hypothesis.
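The shape of this process can be caricatured in code. The sketch below is our own toy rendering, not Stech and Elsässer's implementation; the field names, channels, and report data are all invented for illustration. Incongruities are first detected, then linked to possible D&D only when they persist across multiple independent channels (so that sensor glitches and transmission errors are less plausible explanations), and finally tallied as support for candidate deceptive COAs.

```python
from collections import Counter

def counterdeception_process(reports, hypotheses):
    # Step 1: detect anomalies (incongruities flagged by upstream analysis).
    anomalies = [r for r in reports if r["incongruent"]]
    # Step 2: link anomalies to D&D - require corroboration in two or more
    # independent channels before treating an anomaly as possible deception.
    linked = [a for a in anomalies if len(a["channels"]) >= 2]
    # Steps 3-4: assess support for each deceptive course of action.
    support = Counter()
    for anomaly in linked:
        for h in hypotheses:
            if h in anomaly["supports"]:
                support[h] += 1
    return support

reports = [
    {"incongruent": True, "channels": ["IMINT", "SIGINT"], "supports": ["feint in the north"]},
    {"incongruent": True, "channels": ["HUMINT"], "supports": ["feint in the north"]},
    {"incongruent": False, "channels": ["IMINT"], "supports": []},
]
tally = counterdeception_process(reports, ["feint in the north", "genuine attack"])
# Only the first report survives both filters.
```

The multi-channel filter in step 2 is the programmatic analogue of Jones's unmasking idea: a single-channel anomaly is as likely to be noise as deception.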
Stech and Elsässer have also developed what they call alternative competing hypotheses for counterdeception (ACH-CD). Their most significant adaptations to Heuer's original eight-step outline for the analysis of competing hypotheses are [73]:
Adding the "other" or "unknown" hypothesis to step 1 (i.e., "Identify the possible hypotheses to be considered"). This modification supports further Bayesian analysis of the alternative hypotheses.
Making sure that step 2, "Make a list of significant evidence and arguments for and against each hypothesis," considers not only the likelihood of observing the evidence if a hypothesis is true, p(E|Hi), but also the likelihood of observing that same evidence if the hypothesis is not true, p(E|¬Hi).
Specifically considering deception-related COAs in steps 4 ("Refine the matrix") and 5 ("Draw tentative conclusions about the relative likelihood of each hypothesis").
Adding the concept of conducting operational experiments to step 8 ("Identify milestones for future observation that may indicate events are taking a different course than expected") in order to provide additional intelligence that would reveal evidence of deliberate deception.
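These adaptations lend themselves to a small numerical sketch. Everything below is hypothetical (the hypotheses, priors, and likelihoods are invented for illustration), but it shows why carrying a catch-all "other" hypothesis and recording p(E|H) for every hypothesis makes straightforward Bayesian updating possible:

```python
def bayes_update(priors, likelihoods):
    """Posterior p(H|E) from priors p(H) and likelihoods p(E|H) via
    Bayes' theorem, normalized over all hypotheses (including 'other')."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(joint.values())
    return {h: joint[h] / total for h in joint}

# Invented example: evidence E is unusually conspicuous military activity.
# A deceiver WANTS such activity to be seen, so p(E|deceptive) is high,
# while genuine operations usually try to conceal it.
priors = {"genuine buildup": 0.5, "deceptive buildup": 0.3, "other/unknown": 0.2}
likelihoods = {"genuine buildup": 0.2, "deceptive buildup": 0.8, "other/unknown": 0.3}

posterior = bayes_update(priors, likelihoods)
# The conspicuousness of the evidence shifts belief toward deception:
# posterior["deceptive buildup"] = 0.24 / 0.40 = 0.60.
```

Without the p(E|¬H) bookkeeping of the second adaptation, the high value of p(E|deceptive) could not be weighed against the alternatives at all.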
[Figure 5.4 The Stech-Elsässer counterdeception business process. (Source: [72].) The process moves from finding the dots (detecting anomalies; Whaley and Busby: congruity theory and ombudsman method), to characterizing the dots (linking anomalies to D&D; R. V. Jones: theory of spoof unmasking), to connecting the dots (linking evidence to D&D hypotheses; Heuer: analysis of competing hypotheses), to seeing the picture (assessing support for D&D hypotheses; Johnson et al.: cognitive model of fraud and deception detection), spanning hypothesis generation for local and global deception and hypothesis support and sensitivity.]
The first two adaptations support Stech and Elsässer's work on developing Bayesian belief networks to model the alternative COAs, perform sensitivity analysis in order to analyze the diagnosticity of the evidence (part of step 3 in ACH), and suggest possible key indicators to look for that would reveal the adversary's intentions.
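As a hedged illustration of the diagnosticity being analyzed here (the example and numbers below are ours, not the authors'): a piece of evidence is diagnostic only to the extent that its likelihood differs across hypotheses, which the likelihood ratio captures directly.

```python
def likelihood_ratio(p_e_given_h, p_e_given_not_h):
    """How strongly evidence E favors hypothesis H over its alternatives.
    A ratio near 1.0 means E cannot discriminate and deserves little weight."""
    return p_e_given_h / p_e_given_not_h

# Decoy tanks and real tanks look identical on optical imagery...
lr_optical = likelihood_ratio(0.9, 0.9)   # 1.0: non-diagnostic
# ...but only real tanks radiate engine heat on thermal imagery.
lr_thermal = likelihood_ratio(0.9, 0.1)   # 9.0: highly diagnostic
```

Sensitivity analysis over such ratios is what singles out the key indicators worth tasking collection against: the observations whose presence or absence would move the posteriors the most.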
5.3 Searching for Common Themes
In Chapter 2 we saw that there was considerable agreement on a range of topics in the deception research community, which we then organized into six general themes (Figure 2.15). We then examined these themes in the context of a generalized deception cycle (Figure 2.13) and Waltz's three-level hierarchy of information. In this chapter, we propose to take a similar approach and organize the various counterdeception concepts, approaches, models, and methods presented in the previous sections into general themes using a framework based on the counterdeception definitions presented in Section 5.1. These counterdeception themes are then compared to the deception themes of Chapter 2 and used to synthesize a set of fundamental principles of counterdeception.
5.3.1 A Holistic Approach to Counterdeception
In Section 5.1, we saw that counterdeception consists of both intelligence and operational functions. These can be broken down as follows:
Intelligence functions. Awareness of deception cues, detection and exposure of deception operations, and discovery and penetration of the adversary's real capabilities and intentions;
Operational functions. Negate or mitigate deception's effects and exploit the adversary's own deception plan.
These five functional dimensions form a simple, yet useful framework for thinking about counterdeception; however, keep in mind that these dimensions are not mutually exclusive. They are, in fact, highly interdependent and form more of a continuum of functions than a set of independent activities. This suggests that counterdeception requires a more holistic approach than is suggested by the traditional intelligence cycle. As we will see, the themes that emerge from the counterdeception literature reinforce this idea.
Our examination of the research suggests that there are nine themes representing processes and methods that would ideally work together synergistically to identify and defeat the adversary's attempts at deception. Figure 5.5 shows these themes arranged within the framework of the five intelligence and operations functions. Like the functional dimensions of our counterdeception framework, these themes are themselves interdependent and reflect a holistic approach to counterdeception.
The first theme, human reasoning capabilities, is probably the most important since it binds the other themes together. All of the authors we have reviewed, whether in Chapter 2 or this chapter, have either explicitly or implicitly recognized
that the success or failure of deception occurs in the minds of the analysts, commanders, and decision-makers who are its targets. Heuer is often quoted in this regard [74]: "Deception is, above all, a cognitive phenomenon; it occurs in our minds." Likewise, it is clear that many of these same researchers believe counterdeception is primarily a cognitive phenomenon as well. As Whaley concludes after examining 47 categories of real-life professionals who deal with deception [75], "Successful detection procedures were found to be largely or entirely intellectual rather than technological in nature." All of the counterdeception concepts dealing with human reasoning emphasize the role that broad, subtle powers of awareness, discernment, and discovery play in distinguishing between what is real and what is deceptively constructed by the adversary. This is why this theme covers all five dimensions of our framework in Figure 5.5.
This emphasis on concepts such as acumen and intuition has interesting implications for how one goes about implementing these ideas in real organizations. For example, Johnson et al. found that none of the 24 auditors in their study successfully identified the presence of fraud in all four of the cases they were given and, in fact, 20 auditors failed to detect fraud in at least three out of the four cases. In addition, two auditors failed to detect any fraud in any of the four cases, and seven auditors failed to give an unqualified opinion on the clean cases they were presented. Obviously, not only is detecting deception difficult, but auditors also differ significantly
[Figure 5.5 Counterdeception themes in the deception literature. The figure arranges nine themes across the five intelligence and operations functions (awareness; detection and exposure; discovery and penetration; negate or mitigate effects; exploit): human reasoning capabilities (acumen and predetection, intuition and indirect thinking, increased alertness); threat assessment (cultural, organizational, and personal factors; deception styles, practices, and experience; capabilities and limitations); situation assessment (likelihood of deception in different situations: high stakes, asymmetric power differences, human and material costs, uncertainty); collection methods (control as many channels as possible, avoid relying on a single source, use new collection methods unknown to the adversary); analytic methods (incongruity testing, alternative competing hypotheses, reconstructive inference, weighting of tactical indicators, narrative analysis, metadata, Bayes' theorem, game theory, modeling and simulation, historical case studies); self-assessment (examine one's own assumptions, preconceptions, expectations, beliefs, and biases for exploitable weaknesses; examine which of one's own strategies and methods of intelligence collection, analysis, and decision-making have been compromised); organizational measures (devil's advocates and multiple advocacy, competitive analysis, counterdeception staffs, decoupling identities and missions from theories and images); counterespionage and counterintelligence operations; and counterdeception operations.]
in their fraud detection capabilities. In addition, if there is considerable variation inthe counterdeception performance of