BATTLEFIELD DECEPTION FUNDAMENTALS
History has shown that there is a real payoff to be gained by using battlefield deception. Wise military planners throughout history have used deception; it is a low-cost, effective way to cause the enemy to waste his efforts. Imaginative use of deception, coupled with aggressive training, improves combat effectiveness at all levels. Throughout our military history, though, commanders have viewed deception only as a war-fighting need.
Today, commanders use little deception in planning, directing, and conducting combat operations. As a result, many deception-related skills that served our Army well in the past have been forgotten and, where remembered, have not been made part of our war-fighting capabilities Armywide. This is caused by the factors and myths discussed later in this chapter.
The following myths help explain why deception is not more widely used and understood:
Battlefield deception consists of those operations conducted at echelons theater (Army component) and below which purposely mislead enemy decision makers by--
The goals of battlefield deception, when discussed within the context of mission-oriented requirements, depend on the factors of mission, enemy, terrain, troops, and time available (METT-T). The following goal categories, therefore, are general enough to be applicable to most situations, regardless of echelon or conflict intensity level:
Achievement of the above goals relies on deception maxims or principles that are supported by historical deception-related evidence. Other principles come from social science, decision analysis, and game theory. Still others are anecdotal in nature; although they meet the test of common sense, they are generally untested in the formal sense. Nevertheless, they have served as useful theoretical guidelines on which this doctrine has been built. The 10 maxims are--
It is generally easier to induce an enemy to maintain a pre-existing belief than to present notional evidence to change that belief. Thus, it may be more useful to examine how an enemy's existing beliefs can be turned to advantage than to attempt to change his beliefs.
Perhaps the most striking application of this principle in military deception is found in the selection of the invasion site and cover plan for the D-Day invasion at Normandy. It is well established that Hitler and almost all of his senior military advisors believed that the most likely place for the Allied invasion of Europe would be the Pas de Calais region. Moreover, the Allies were aware of this belief through ULTRA intercepts, which confirmed that Hitler expected the invasion to come at Pas de Calais.
This preconception formed the basis of an elaborate deception plan keyed to reinforce this belief. "If deception targets tend to perceive what they expect, then these expectations furnish greater leverage to a deception plan--a form of mental jujitsu."1 This principle appears to be well appreciated by deception planners and is consistent with numerous studies on the psychology of perception.
There is ample historical evidence to confirm the truth of Magruder's Principle. Figure 1-1 contains entries from a historical data base. These entries (including both strategic and tactical cases) have been placed into the following categories:
There are two limitations to human information processing that are exploitable in the design of deception schemes:
"The law of small numbers" is the name given to describe one weakness in intuitive inference--best guesses. Figure 1-2 shows three events as examples:
Figure 1-1. Deception, preconception, and surprise
Another limitation of human information processing relevant to deception planning is the frequent inability of targets to detect small changes in indicators, even if the cumulative change over time is large. This is the basis for the use of conditioning as a deception technique.
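The conditioning effect described above can be illustrated with a toy model. Assume (purely for illustration; the function names and threshold values here are hypothetical, not doctrinal) a monitor that flags only step-to-step changes above a fixed threshold; it will miss a slow drift even when the cumulative change is large:

```python
# Toy model of conditioning: a monitor that compares each new reading
# only against the previous one misses a gradual drift, even though
# the cumulative change over time is large.
def flags_change(previous, current, threshold=5.0):
    """Flag only if the step-to-step change exceeds the threshold."""
    return abs(current - previous) > threshold

def run_monitor(readings, threshold=5.0):
    """Return the list of steps at which the monitor raises an alarm."""
    alarms = []
    for i in range(1, len(readings)):
        if flags_change(readings[i - 1], readings[i], threshold):
            alarms.append(i)
    return alarms

# Activity level creeps up by 3 units per period: each step stays
# below the 5-unit threshold, so no alarm is ever raised...
readings = [100 + 3 * t for t in range(20)]
print(run_monitor(readings))          # -> [] (no alarms)
# ...yet the cumulative change is substantial.
print(readings[-1] - readings[0])     # -> 57
```

The Scharnhorst breakout discussed below followed the same logic: each day's jamming looked like the previous day's "atmospherics," so the step-to-step change was never alarming.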
Figure 1-2. Law of small numbers: some historical examples 3
Conditioning or desensitizing has an important place in the design of deception schemes. There are numerous instances of its successful application. One now-classic application of this principle was made in the breakout of the German ships Scharnhorst, Gneisenau, and Prinz Eugen from Brest on February 12, 1942. The breakout was facilitated by jamming British radars. Ordinarily this would have been a significant tip-off that something was amiss, but British radar operators dismissed it as being caused by atmospheric disturbance. This error was the result of a carefully orchestrated German ruse directed by General Wolfgang Martini, the head of the Luftwaffe Signals Service. The Germans jammed the British radar sites at the same time every day to build the operators' belief that the atmosphere was interrupting the receipt of signals. The British became so accustomed to the apparent atmospheric problems that the ships were able to escape.
The Germans did not have a monopoly on this concept. It was frequently employed by the RAF for feints or diversionary operations. One example occurred prior to the British attack on Peenemunde on August 17, 1943. Over a period of time, the British had routinely sent Mosquitoes along the same route to bomb Berlin. This ensured that all personnel in cities along the route were constantly forced to flee to bomb shelters and that German air assets were repeatedly engaged over Berlin. On the night Peenemunde was attacked, the Germans were deceived into believing that eight Mosquitoes were the vanguard of another attack on Berlin. The ruse was highly successful: at the cost of one aircraft lost to German fighters, the eight Mosquito bombers used in the diversion lured 203 enemy fighters to Berlin. Of 597 British bombers dispatched to Peenemunde, only 40 were lost and 32 damaged. All but 26 managed to attack the target. If the ruse had not been successful, it is quite possible, as one German postwar account claimed, that an additional 160 bombers would have been shot down.
A final remark about the weaknesses of human information processing: a reading of the literature suggests that targets tend to dismiss unlikely events as impossible events. This tendency favors bold and imaginative strategies such as Hannibal crossing the Alps or the landing at Inchon.
Figure 1-3 provides a synopsis of several events which show how repeated false alarms (cry-wolf) have historically contributed to surprise. There is no doubt that cry-wolf is an established element in indications and warning intelligence work. As Figure 1-3 shows, this method of desensitizing an enemy before an attack has been very effective.
In a paper entitled "Deception Maxims: Fact and Folklore," 4 prepared by the Office of Research and Development, Central Intelligence Agency, June 1981, the cry-wolf syndrome alone and false alarms combined with other deception techniques were analyzed to determine whether they contributed to creating surprise.
Figure 1-3. Historical desensitization by false alerts
The data showed that when cry-wolf techniques were combined with other deception methods, surprise was achieved 92 percent of the time. 4
However, when deception techniques were used that did not include false alerts, surprise resulted in only 67 percent of the cases studied. The analyst concluded from this statistical analysis that combining the effects of false alerts with other deception techniques seemed to increase the chances of achieving surprise. In fact, in 23 cases, when wolf was cried and deception was attempted, surprise was achieved 100 percent of the time.
Deception becomes more difficult as the number of channels of information available to the target increases. However, within limits, the greater the number of controlled channels the greater the likelihood the deception will be believed.
Where possible, the objective of the deception planner should be to reduce the uncertainty in the mind of the target, to force him to seize upon a notional world view as being correct--not making him less certain of the truth, but more certain of a particular falsehood. However, increasing the range of alternatives and the evidence supporting any of many incorrect alternatives--also known as increasing the noise--may have particular use when the target already has several elements of truth in his possession.
It is convenient to classify deception into two types: A (for ambiguity deception) and M (for misdirection deception). A-deception increases doubt in the target's mind and lowers the probability of a correct perception by taking from or adding to alternatives. M-deception reduces uncertainty in the target's mind by having him become convinced of a particular falsehood. Either form of deception can be accomplished, incidentally, by telling only the truth.
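One way to make the A/M distinction concrete (an illustrative formalization only, not part of the doctrine) is to treat the target's estimate as a probability distribution over possible friendly courses of action and measure his uncertainty with Shannon entropy. A-deception raises the entropy; M-deception drives it toward zero, but on the wrong alternative:

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of the target's belief distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Target initially weighs two possible attack sites equally.
baseline = [0.5, 0.5]

# A-deception: a third notional site is made plausible, spreading
# belief across more alternatives and increasing doubt.
ambiguity = [1 / 3, 1 / 3, 1 / 3]

# M-deception: the target becomes nearly certain of the WRONG site.
misdirection = [0.05, 0.95]

print(entropy(baseline))      # 1.0 bit
print(entropy(ambiguity))     # ~1.585 bits: more uncertain
print(entropy(misdirection))  # ~0.286 bits: very certain, and wrong
```

The numeric values are arbitrary; the point is the direction of change: A-deception pushes the target's uncertainty up, while M-deception pushes it down around a falsehood.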
A-deception can function by--
A classic analysis of the Pearl Harbor surprise borrowed the concepts of signal and noise from communications theory. "To understand the fact of surprise, it is necessary to examine the characteristics of the noise as well as the signals that after the event are clearly seen to herald the attack." 5 On the other hand, noise can be created by the deception architect to overpower or swamp the signal. "The idea is to give your target a kaleidoscope to play with, and then let him use it as a looking glass." 6
A simple example of a defense game shows this idea more clearly. Suppose an attacker has a choice between two locations to attack. The defender can choose to defend either location. Given this scenario, the attacker has an even chance of choosing an undefended location to attack. But what if the attacker could convince the defender that there were three possible locations for the attack? If he could, the success probability climbs to 2 in 3, and so forth. The probability approaches unity as a mathematical limit as the number of threatened sites grows arbitrarily large. It is necessary that the options introduced by the attacker be both individually and collectively plausible to the target.
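The arithmetic of the game above can be sketched as follows, assuming the simplest possible model: the defender covers exactly one site, and the attacker strikes one of the sites the defender believes is threatened (the function name is illustrative):

```python
from fractions import Fraction

def attack_success_probability(plausible_sites):
    """Probability the attacker strikes an undefended site, given that
    the defender commits his single defense to one of the sites he
    believes are threatened."""
    return Fraction(plausible_sites - 1, plausible_sites)

# Two real options: an even chance of finding the site undefended.
print(attack_success_probability(2))   # 1/2
# Convince the defender a third site is threatened: odds improve.
print(attack_success_probability(3))   # 2/3
# The probability approaches 1 as plausible threats multiply.
print(attack_success_probability(10))  # 9/10
```

The (n-1)/n form shows why the gains diminish: each additional notional threat buys less than the one before, while (as the Sicily example below illustrates) each one strains the target's credulity further.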
As a practical matter, the number of threats cannot grow arbitrarily large. This fact was appreciated by deception planners who worked on the invasion of Sicily: "It was decided, very wisely, that to mount so many threats in the Mediterranean would stretch the Germans' credulity too far. Moreover, the fact that Sicily was almost the only objective not threatened might lead them to guess the truth. To prevent this, the simulated threats to north and west France, Pantelleria, and Lampedusa were abandoned." 7
The foregoing discussion is purposely oversimplified, but it clearly shows the principle of A-deception.
In contrast to A-deception, M-deception (or misdirection) reduces uncertainty. The strategy of misdirection is clear: to make the enemy very certain, very determined, and completely wrong. In the attack/defense game used earlier, M-deception would require the attacker to convince the defender to defend one site, while attacking the other.
Deception schemes used in practice are usually combinations of A and M types, with one or the other being dominant. Such was the case at Normandy.
The multiple attack location threats in the initial stages are evidence of A-deception. In the end phases, however, Normandy was predominantly an M-deception. Historically, deception professionals seem to have preferred M-deception. For after all, who can resist the ultimate triumph of "the sting?"
There are circumstances where deception assets should be kept in reserve despite the costs of maintenance and risk of waste, awaiting a more fruitful use.
Window, later renamed Chaff by the Americans, was easily the most cost effective electronic countermeasures (ECM) deception device introduced in World War II. However, the British were at first reluctant to use Chaff for two reasons. First, they were afraid that the Germans also had this capability and second, the British had not been able to develop an effective countermeasure. However, after much debate, the British decided to employ Chaff and did so with much success.
It is also interesting to note that concern that an asset will become valueless once used, or that upon compromise an effective countermeasure can and will be developed, is often exaggerated. In spite of the concern over the first use of chaff, it is still considered effective in today's sophisticated electronic warfare (EW) environment. Similarly, in the use of double agents, a refusal to believe that the agent is anything other than genuine has been observed to persist in the face of strong evidence of hostile control.
"Other examples of holding deception assets in reserve until the right moment include--
The Syrian decision to withhold use of its new SAM defense despite heavy losses until the opportune time in the 1973 Arab-Israeli war.
Deception activities should be sequenced so as to maximize the portrayal of the deception story for as long as possible. In other words, red-handed activities--indicators of true friendly intent--should be deferred to the last possible instant.
A scheme to ensure accurate feedback increases the chance of success in deception. This principle is virtually self-evident.
Perhaps the most dramatic example of the role of feedback in wartime deception was the intelligence provided by ULTRA, the top-secret espionage and cryptographic breakthrough that enabled the British to read the German codes. In the view of many, ULTRA information was a key element in the success of the Allied invasion of Normandy. As Lewin pointed out in ULTRA Goes to War: The First Account of World War II's Greatest Secret Based on Official Documents:
Ironically, the Allies knew through ULTRA that German troops remained in Norway, and concluded on the basis of this feedback that the deception was successful. "On Sherlock Holmes' famous observation about the importance of the dog that did not bark in the night, the significant fact for the deceivers in London was that no such major movement of troops from Norway was disclosed on ULTRA up to and beyond the time of D-Day. Here was clinching evidence that the deception plans were working."12 Yet it was a completely wrong assessment. Hitler did not move his forces because Norway was his "zone of destiny," not because he believed the British deception plan.
Deception efforts may produce subtle and unwanted side effects. Planners should be sensitive to such possibilities and, where prudent, take steps to minimize these counterproductive aspects.
Deception security is one of the causes of such side effects. One of the cardinal principles of deception folklore is that deception security is of highest importance. It is generally acknowledged that the number of knowledgeable people should be minimized, even to the point of misleading your own forces.
A good example of short circuiting an unwanted side effect occurred during World War II. Propagandists needed to convince the Germans that an Allied attack was imminent. They needed to accomplish this without encouraging resistance groups to go into action in support of an attack that would never materialize and without exposing them to German reprisals.
Another example of the Monkey's Paw effect concerns the unanticipated consequences of an otherwise successful German use of decoy V-2 sites. As Jones stated in "Irony as a Phenomenon in Natural Science and Human Affairs," Chemistry and Industry (1968):
Great care must be exercised in the design of schemes to leak notional plans to the enemy. Apparent windfalls are subject to close scrutiny and often disbelieved. On the other hand, genuine leaks often occur under circumstances thought improbable.
Two incidents serve to illustrate this principle. The first occurred early in World War II, when a German aircraft heading for Cologne became lost and made a forced landing near Malines in Belgium. The three passengers, two Wehrmacht officers and a Luftwaffe major, were soon arrested by Belgian authorities, taken to the police station, and briefly left alone. They attempted to burn the documents they were carrying: top secret papers containing attack plans for Holland and Belgium. However, the documents failed to burn and fell into the hands of Belgian authorities. The authorities believed that the documents were part of a deception plan, because the Germans could not be careless enough to allow actual war plans to fall into Allied hands.
A second example occurred in the North African campaigns. Alam el Halfa, a ridge roughly 15 miles behind the Alamein line, was a natural stronghold. It was an excellent defensive position for the British at that stage in the war. It could, however, be outflanked by advancing Germans who might be able to attack on to Alexandria. The British maps of the area were excellent, being based on captured Italian maps corrected by aerial photographs. One type of British map was thought particularly valuable by both British and German armies--the so-called "going map." This map showed color-coded regions denoting how difficult the terrain was, and what speeds could be maintained by various vehicles.
The British decided to print a false going map showing that a flanking movement would present rough going, whereas the route direct to the Alam el Halfa region was easily passable. The map was secretly printed and placed in an armored car to be captured by the Germans. The plan worked, and the Germans came directly to Alam el Halfa (over rough going, incidentally).
These examples show both kinds of misclassification error. In the Belgian case, a real windfall was dismissed as false. In North Africa, a false map was accepted as real.
A common characteristic of successful deceptions is that they were designed to co-opt skepticism by requiring some participation by the target: either a physical effort in obtaining the evidence or an analytic effort in interpreting it. The danger of this is that it is possible to be too subtle, which carries with it the risk that the deception story will not be perceived at all.
There is a delicate balance to be struck between obviousness and subtlety, with the attendant twin risks that the message will be either misunderstood or dismissed as a plant. To the deception professional, this is the essence of the art.
There are generally two categories of deception failures:
The first example was the German deception plan code-named ALBION. It was an elaborate deception to cover the mobilization and movement of forces to the East for the attack on Russia. The plan contained two major operational components, SHARK and HARPOON.
SHARK was intended to convey the impression that a large combined force would invade the southeast coast of England at four locations between Folkestone and Worthing. The combined force, to include eight infantry divisions, was to be preceded by an airborne unit to 'secure beachheads and, if possible, to take a number of airfields.' The Luftwaffe was to achieve air superiority, protect the invasion fleet, drop the airborne units, support the ground forces, and airlift additional ground troops. Naval units were also supposed to participate in clearing invasion routes through the British minefields, transport the invasion force, and provide covering fire during the landing.
Although originally intended to begin in March and April 1941, the operation suffered from slow direction and planning, probably because of the press of real operations, which almost invariably took precedence over deception. Preliminary actual steps included highly visible training exercises, swimming instructions for nonswimmers, and paradrops and beach assaults using blank cartridges but real landing craft. This latter activity exposed a major deficiency in the deception story: since only 5 landing barges and 10 fishing smacks were available to transport the assault force, the deception activities were not believable.
A cover operation for SHARK, designated HARPOON, was notionally intended to draw British forces away from the 'intended assault' area. This added credibility to the 'attack.' Two operations were planned:
It is unclear whether the Russians saw through the deception, or simply decided their forces were adequate to overcome the large force the Germans were trying to portray. In either case, the deception was not successful. It probably failed for the following reasons:
The third example occurred during World War II, when Soviet radio deception attempts against the Germans along the Eastern Front were common, but generally unsuccessful. Careful German analyses of other available intelligence (air reconnaissance and agent reporting) revealed the true deceptive nature of the attempts. They were, as in the ELEPHANTIASIS operation, single-channel efforts with no additional means or measures used to support the deception and enhance plausibility.
Probably more significant was the frequency of the attempts. A deception occurred about once every two weeks. It is probable that the Soviet command structure and intelligence apparatus were desensitized to the point of ignoring the ploys. While such repetitive actions are sometimes used to lull an adversary into a false sense of security prior to a genuine attack, the careless and poorly structured nature of these efforts probably revealed them as deceptions.
The fourth example is probably the largest scale deception failure on record. It was the World War II Allied operation code-named COCKADE. Conceived in early 1943, its major objective was to conceal the weaknesses of Allied forces in Britain. COCKADE was intended to discourage the transfer of enemy forces to the Russian front. It had three subelements: STARKEY, TINDALL, and WADHAM.
STARKEY, the major component, was composed of a number of separate but presumably mutually supporting operations, including actual training exercises, air and naval operations, and combined operations (commando) teams.
Planning began in April 1943 with a target launch date of September 8. However, the process of cutting back on the scale of the plan began early. This was demanded by Allied leadership because there were fewer resources available than earlier in the war.18
"FORFAR BEER made three attempts. The first turned back after sighting a German trawler. The second was aborted due to bad weather and the third terminated when the troops could not scale the cliffs of the French coast.
WADHAM was intended to portray the story of a large-scale combined air and sea attack on the Brittany peninsula. The objective, again, was to freeze German forces in that area. In this case, American and British forces were involved in an assault planned for September 30, 1943. A prime objective was to capture Brest and implicitly neutralize its U-boat pens and those at Lorient and St. Nazaire.
Active measures included actual bombing of the submarine pens and a less-than-convincing commando raid, code-named POUND.
"The target was the Isle of Ushant. All this was intended to support the story that an intelligence sortie was attempting to determine the strength of defenses in the area."24
COCKADE and its subelements suffered from some fairly major deficiencies in the resources available for execution. The Germans' disdainful reaction may also be explained in terms other than poorly constructed deception. Two writers have indicated that a major German intelligence success branded COCKADE as a hoax: a July 29 transatlantic telephone call between Roosevelt and Churchill revealed that COCKADE was a trick. Although the call was presumed secure by means of the A-3 scrambler, the Germans had in fact broken that system by the fall of 1941 and routinely monitored a broad spectrum of mid- and high-level voice communications.
The major cause of failure, however, was the total implausibility of an invasion of the continent at that stage of the war. The total picture of Allied strength and preparations that the Germans gained was from sources so numerous that they could not all be totally manipulated or controlled. Evidence showed clearly that such an attack was unrealistic in 1943.
The fifth example is a tactical deception which occurred later in World War II in support of OVERLORD, the invasion of France. It, too, can be classified as a technical failure. It failed because of inadequate planning, coordination, preparation, and time, combined with some degree of bad luck. It was code-named ACCUMULATOR.
The failure could also have been caused by the absence of the other aspects of an actual invasion fleet. Missing were the radar signatures of a large group of ships which would undoubtedly have been accompanied by air support and ECM. Deception story portrayals by one means have less credibility than stories portrayed over a number of means.
Also, by June 13 the magnitude of the Normandy force was clear to the German military leadership. Hitler apparently still believed an attack would come in the Pas de Calais area. This, combined with the general disorganization in northern France, probably prevented any serious thought of a major shift of forces in the west.
The sixth example was code-named IRONSIDE. In early 1944, with the Allied decision made to invade Normandy, the primary objective was to minimize opposition to the attacking force.
Figure 1-4 shows the previous deception failures in easy-to-use tabular format. The intent is not to dwell on failure but, rather, to portray the immense scope of deception planning, the fragile nature of deception operations, and the absolute necessity for total integration of the deception effort into the decision-making process.
Our ability to fight in accordance with the basic tenets of AirLand Battle Doctrine--agility, synchronization, initiative, and depth--is enhanced by using battlefield deception.
The effective use of deception allows us to take the initiative by doing the unexpected and inducing the target to react to our operations. Deception allows us to--
Battlefield deception operations, by their very nature, imply taking calculated, prudent risks in order to gain the tactical and operational advantage over the enemy. Planned deceptions allow us to sequence the presentation of the battlefield to the enemy in the manner in which we wish him to view it. In the defensive, battlefield deception allows us to portray inaccurate dispositions and capabilities that hide our true weaknesses. This can effectively negate the enemy's choice of the time and place of battle.
Figure 1-4. Reported Deception Failures
In both the offense and defense, battlefield deception enhances the conditions which allow the friendly commander to effectively mass his forces at the decisive time and location on the battlefield. Successfully managed, deception operations give us the element of surprise over the enemy. In the defense, this includes making the enemy attack where he perceives our weaknesses to be or gearing his intelligence activities toward notional rearward activities. We inject notional combat information and intelligence into his decision-making process. This influences the outcome of his decisions and requires him to reconfirm information or dedicate additional intelligence resources toward our deceptive activity.
In the offense, battlefield deception assists our offensive spirit by giving our commanders freedom to develop a greater number of alternative courses of action. Deception operations induce the enemy to view the battlefield the way we want him to. This causes him to take actions favorable to and exploitable by friendly operations. Because of induced misperceptions of the battlefield, the enemy in the defense is not given time to identify the composition of our forces and mass his forces or supporting fires against the attack. Successfully planned and executed battlefield deceptions give our commanders the ability to act faster than the enemy can make decisions. Battlefield deception keeps the enemy reacting to false friendly dispositions, intentions, or capabilities.
As with other imperatives for success on the AirLand Battlefield, deceptions must be an integral part of the planning process. In order to optimize the desired effect upon the enemy, they must be synchronized with the true combat mission. These effects induce inappropriate focusing or diffusing of enemy combat power. They may cause the enemy to misperceive friendly capabilities and intentions in a manner which results in enemy actions that can be exploited. The former effect can create friendly advantages in terms of time, distance, location, force ratios, or mission mismatches. The latter creates friendly advantage primarily in terms of ensuring that inadequate time exists for enemy reaction to true operations, regardless of whether or when they are discovered. Functional activities (such as EW, fire support, intelligence, and engineering), which have embedded deceptive intent within the operational plan, should synchronize their supporting plan tasks to achieve both operational and deception objectives. The operational plan is identified in the deception annex.
Battlefield deception, as with other operations, must be flexible and continuously synchronized with the changing friendly and enemy situations. Synchronizing deception activities, with ground truths or with the desired enemy perception, provides our commander the maximum economy of force of total combat resources.
Battlefield deception is an important foundation to the C3CM strategy for AirLand Battle. Our potential adversary's ability to perceive and manage the battlefield with clarity and certainty accents the importance of planning and integrating a C3CM strategy into our combat operations. Battlefield deception is employed in concert with the three other components of C3CM:
Battlefield deception complements the other three components of C3CM in both counter-C3 and C3-protect roles. In countering enemy C3 capabilities, battlefield deception can be used to inject false information into the enemy's decision-making process. This false information will distort his ability to respond to the true current situation. This is accomplished by many means, including portraying false friendly intentions, capabilities, and dispositions, which can cause the enemy to--
Battlefield deception can also assist in a C3-protection role. For example, deception operations can nullify or degrade the enemy's target acquisition and offensive capabilities by causing him to diffuse his firepower or to commit maneuver assets at inappropriate times and locations. Deception also assists the operational security posture of the operation by masking indicators of true intent. (See AR 525-20.)
There are several important cornerstones for the development of successful battlefield deception operations that all commanders must thoroughly understand and apply. (See Figure 1-5.) These considerations fall into three broad areas: intelligence support, integration and synchronization, and OPSEC.
Figure 1-5. Cornerstones of Battlefield Deception
The threat to successful AirLand Battle operations from enemy intelligence and combat operations accents the importance of using our intelligence estimates in developing operational and tactical plans. Battlefield deception operations rely extensively on the same level of timely and accurate intelligence as do combat operations. To ensure that friendly operations are viewed by the enemy as plausible, and subsequently authentic, we need to know--
Once we have determined where the enemy is susceptible to battlefield deception and what the objective of our deception will be, we must begin to integrate and synchronize deception operations and events into our true combat operation.
This underlines the importance of planning and executing deceptions as part of the planning and execution of our true operations. There should be no such thing as a deception planned separately from the true operation.
History has shown that the deceptions that stand the greatest chance of being accepted as our true capabilities, intentions, or dispositions are deceptions that are--
OPSEC is equally important for deception since it is an integral aspect of overall combat operations. OPSEC and deception are mutually supporting activities. OPSEC supports deception by eliminating or reducing the indicators which give away our true intentions or display our deceptive intent. Deception can produce signatures behind which our true operations may hide. In general, given that the primary aim of deception is to influence the enemy commander, OPSEC establishes the base of secrecy that is necessary for battlefield deceptions to be successful. OPSEC gives us the capability to look at ourselves in order to identify our vulnerabilities and the profiles that we present to the enemy. If battlefield deceptions are to be used to gain surprise over the enemy, our units' true intentions, dispositions, and capabilities must be concealed, manipulated, and distorted as well as falsified. OPSEC is essential to all successful deception.
OPSEC is not an administrative security program. OPSEC is used to influence enemy decisions by concealing specific, operationally significant information from his intelligence collection assets and decision processes. OPSEC is a concealment aspect for all deceptions, affecting both the plan and how it is executed. (See AR 530-1 for additional information.)
Deception, employed properly, can help create surprise, thereby significantly enhancing the commander's opportunity for success.
Battlefield deception can be used during prehostilities, periods of hostilities, and open warfare. The military commander faces the challenge of achieving surprise over the enemy while maintaining security. It is not essential that the enemy be taken totally unaware, but only that he becomes aware too late to react effectively.
The key to successful deception is security. It is possible to hide the real and portray the false, but without good indicator security, the real operation and the supporting deception operation are at risk.
We must assume that any potential adversary is well versed in US Army doctrine--the way we conduct our operations. He will expect our units to behave in certain ways, and if we stray too far, his intelligence analysts will question our conduct. Deceptions must be consistent with doctrinal norms and how units apply those norms in combat.
If the enemy's perception of our doctrine and the doctrine itself are different, we want to play on his perception of the doctrine. The successful deception planner is the one who approaches the problem by putting himself in the enemy's shoes and developing a story believable from this vantage point.
Patterns are procedural indicators that give a unit an operational profile--how units execute doctrine. Enemy analysts use these patterns to identify the unit and predict its intentions. Once the enemy notes a pattern in the unit's activities, he expects to continue seeing that pattern. Changes in the pattern lead the enemy to question friendly activity, so it is important to use established friendly patterns in the deception.
Since often we are unaware of the patterns we have established, it is difficult to ensure that the required profile detail is present. OPSEC surveys are specifically designed to provide such information. We can achieve the desired operational plausibility by ensuring that deception planners develop deceptions as if they were genuine operations.
A commander who really plans to feint left and conduct the main attack on the right might initially direct his units to plan for a simultaneous attack. During the attack preparations, subordinate unit staffs would execute their normal patterns for this action. When appropriate, the commander could change his order to the appropriate unit and direct the conduct of a feint only. An imaginative planner might find other ways to display established patterns to the enemy. It is important that the enemy sees what he expects to see.
A second consideration is the possibility of deliberately creating patterns in our deception plans. Repeated employment of a particular deception technique or measure will certainly establish a tell-tale pattern. This could signal a deception that in itself is exploitable through subsequent deceptions. Variety and creativity are vital to continued success. Battlefield deception planners must ensure that neither they nor their plans become too predictable.
The following factors of deception are taken from previous operations. They should be carefully considered in planning deception activities. They are as valuable today as they were when the Greeks placed the wooden horse before the walls of Troy.
-- Effect necessary coordination.
-- Promulgate tasks to involved units.
-- Present the deception story to the enemy decision-maker through his intelligence system.
-- Permit the enemy decision-maker to react in the desired way--to pursue a desired course of action.
Training in battlefield deception offers added benefits to commanders. The brainstorming associated with developing a workable deception plan causes a greater appreciation for enemy tactics, strengths, weaknesses, and capabilities. This process also encourages more thoughtful and imaginative approaches to friendly doctrine and habits. Deception training contributes to our understanding of--
The projection of deception measures (false indicators) and the countersurveillance actions taken to conceal movements and dispositions must be analyzed to determine the success of the training exercise.
Wars are fought with skills learned through schooling, exercises, operational experience, and self-study. Because of various necessary artificialities, peacetime schooling and exercises tend to lose sight of some of the harsh lessons of war. The essential need for secrecy and information control in war is among the lessons often forgotten.
Deception will work on the battlefield only if it has been practiced in training. The Vietnam War illustrates--
Battlefield deceptions are planned in a manner similar to the planning of standard combat operations. Each component of deception is applicable at operational and tactical levels, but varies in scope. The components of battlefield deception are objectives, target, story, plan, and events.
The deception objective is the ultimate purpose of the deception operation. It is presented as a mission statement. The objective specifies what action or lack of action the enemy must be made to take at a specific place or time on the battlefield as a direct result of the friendly deception operation. Deception objectives relate directly to inappropriate actions and responses that we want the enemy to take. These actions can then be exploited by friendly operations.
The target of battlefield deception operations is the enemy decision-maker. He has the authority to make the decision that will execute the deception objective desired by the friendly commander.
Battlefield deception targeting can occur in two ways:
The deception story is the friendly intention, capability, or disposition which the enemy is to be made to believe.
The deception plan outlines which specific operations, displays, or secrets must be used to convey the deception story to the target. It takes the form of a standard operation plan (OPLAN). It is included in the deception annex. Some deception tasks contained in the deception annex should be moved to paragraph three of the OPLAN or operation order (OPORD) or other supporting functional annexes.
Deception events are friendly indicators and actions that present specific parts of the total deception story to the enemy's intelligence sensors. Some deception events, given the enemy and friendly situation, can be described as nonaction or delayed-action in nature. An example would be delaying the movement forward of logistic bases or artillery support until shortly before a deliberate attack.
Figure 1-6 shows the difference in scope of the deception components at various levels of deception employment.
Deception operations are constrained, but not forbidden, by international agreements. Ruses of war and the employment of measures necessary for obtaining information about the enemy and the country are considered permissible. The following excerpts are taken from FM 27-10.
Ruses of war are legitimate so long as they do not involve treachery or perfidy on the part of the belligerent resorting to them. They are, however, forbidden if they contravene any generally accepted rule.
Figure 1-6. Deception component purpose by echelon
Treacherous or perfidious conduct in war is forbidden because it destroys the basis for a restoration of peace short of the complete annihilation of one belligerent by the other.
It is especially forbidden to make improper use of a flag of truce, the national flag, the military insignia and uniform of the enemy, or the distinctive badges of the Geneva Convention.
Flags of truce must not be used surreptitiously to obtain military information or merely to obtain time to effect a retreat or secure reinforcements, or to feign a surrender in order to surprise an enemy. In practice, it has been authorized to make use of national flags, insignia, and uniforms as a ruse. The foregoing rule (Hague Regulation (HR), Article 23, paragraph F of Treaty Series 539 (sic)) does not prohibit such employment but does prohibit their improper use. It is certainly forbidden to employ them during combat, but their use at other times is not forbidden.
The use of the emblem of the Red Cross and other equivalent insignia must be limited to indication or protection of medical units and establishments and the personnel and material protected by GWS and other similar conventions. The following are examples of the improper use of the emblem:
1 Robert Jervis, "Hypotheses on Misperception," World Politics (APR 68), p. 455.
2 A. Tversky and D. Kahneman, "The Belief in the Law of Small Numbers," Psychological Bulletin 76 (1971), pp. 105-110. (Paraphrased.)
3 A. Tversky and D. Kahneman, "The Belief in the Law of Small Numbers," Psychological Bulletin 76 (1971), pp. 105-110.
4 "Deception Maxims: Fact and Folklore," Central Intelligence Agency.
5 Roberta Wohlstetter, "Pearl Harbor: Warning and Decision," a synopsis of her ideas.
6 Eric Ambler, "Send No More Roses," (London: Weidenfeld & Nicolson Limited, 1977) p. 62.
7 C. Cruickshank, "Deception in World War II," (New York: Oxford University Press, 1979) p. 52.
8 Robert Axelrod, "The Rational Timing of Surprise," World Politics (JAN 79), pp. 228-246.
9 Robert Axelrod, "The Rational Timing of Surprise," World Politics (JAN 79), p. 244.
10 Robert Jervis, "Hypotheses on Misperception," World Politics 20, no. 3 (APR 68), Hypothesis no. 14.
11 Ronald Lewin, "Ultra Goes to War: The First Account of World War II's Greatest Secret Based on Official Documents," (1978), p. 299.
12 Ibid., p. 310.
13 Charles Cruickshank, "Deception in World War II," (1979) p. 56.
14 Ibid., p. 56.
15 R. V. Jones, "Irony as a Phenomenon in Natural Science and Human Affairs," Chemistry and Industry (1968), p. 473.
16 David Mure, "Master of Deception" (1980), pp. 81-82.
17 Dr. Alan F. Wilt, "'SHARK' and 'HARPOON': German Cover Operations against Great Britain in 1941," Military Affairs, vol 38, no. 1, (FEB 74), pp. 1-2 (Discussion).
18 C. Cruickshank, "Deception in World War II," (1979), pp. 61-84.
19 Ibid.
20 C. Cruickshank, "Deception in World War II," (1979), pp. 61-84.
22 C. Cruickshank, "Deception in World War II," (1979), pp. 61-84.
25 C. Cruickshank, "Deception in World War II," (1979), pp. 200-201.
26 C. Cruickshank, "Deception in World War II," (1979), pp. 200-201.
27 Ibid., p. 159.
28 C. Cruickshank, "Deception in World War II," (1979), p. 159.
29 C.J.C. Molony et al., "The Mediterranean and Middle East," vol V, The Campaign in Sicily 1943 and the Campaign in Italy 3 September 1943 to 31 March 1944, pp. 724-754.