
ELABORATING TECHNOLOGY AS PRODUCT OR PROCESS

What technologies are likely to have the greatest impact on strategic stability in the decades ahead?

Multidisciplinary teaming in diverse research fields is resulting in an explosion of technologies. Cross-fertilization among materials science, sensors, diagnostics, robotics, nanotechnology, synthetic biology, genetics, information technology, neuropsychology and the cognitive sciences, micro-electronics, quantum effects, photonics, energetic materials, propulsion, space vehicles, agile manufacturing, automated laboratories, and other fields, including big physics, has opened doors that even the greatest minds recently did not expect. Accelerated engineering techniques and advanced industrial practices are rushing through some of those doors. Not only have basic and applied sciences grown closer together in many fields, but theoretical and experimental sciences have expanded to include what may be a third arm of science—high performance simulations13 and a fourth arm, data-intensive scientific discovery.14 These four arms now overlap and are synergistic, further accelerating successful S&T and reducing costs.

A rapid and diverse compounding of technologies can make forecasting problematic, but historically not all paths are explored and even fewer persist over time. We may pursue basic science for its own sake, but sustained investment in applied S&T requires a demonstration of utility. For strategic players, that utility is predominantly calculated relative to the technology and strategy of others. Stimulus-response and measure-countermeasure are not the only dynamics at work; there is also a gravitational pull or common attractor that brings different technologies into a relationship with each other.

Consider the basic nuclear delivery systems. Replacing reciprocating engines with jets, the manned bomber flew ever higher and faster to overcome defenses. To escape new high altitude air defense missiles, bombers returned to low altitude using terrain-following radars, electronic countermeasures, and chaff to escape the technological response, the look-down/shoot-down interceptor. With stealth technology, bombers returned to high altitude, but have kept the option to go low again as concerns about bistatic radars and networked sensors complicate their future.

Large, inaccurate, liquid-fueled, surface-launched missiles, initially of medium range, were replaced with solid rocket ICBMs, quickly launched out of hardened underground silos. These were supplemented by the development of ballistic missiles, carried on nuclear-powered submarines, that were launchable underwater.

To reduce costs per warhead, improve military effectiveness, overwhelm defenses, and limit damage around the intended target, all means of delivery took advantage of increases in accuracy and reduction in the size of warheads. Bomber loads were increased with standoff ballistic and air launched cruise missiles (ALCMs). Stealth was even applied to cruise missiles. The single large warheads on ballistic missiles were replaced with multiple independently-targetable re-entry vehicles (MIRVs) accompanied by “penetration aids (PENAIDS)” such as dummy decoy warheads. Small, fast, maneuverable re-entry vehicles replaced large, slow, blunt body ballistic re-entry vehicles that could be intercepted by advanced air defense systems.

To find targets, provide early warning of attack, communicate with forces and even with the enemy, new generations of sensors, communications, and data processing pushed the electronics revolution to provide accuracy, reliability, and survivability in the nuclear environment. Though this intense bilateral competition in both offensive and defensive military technology at the height of the Cold War seems alien to our world today, in reality intense global technology development continues to fuel many dual-use applications that have significant implications for the future of strategic stability.

Over the next 20 to 40 years, what technologies could be the counterpart of the World War II and Cold War developments? Will weapons of concern be more or less powerful? Nuclear or non-nuclear? Will they be explosive or even kinetic? Will their delivery be faster or slower, more or less discriminate? Will they involve physical or functional “kill mechanisms?” Will their delivery systems be manned or unmanned? Will situational awareness be more complete or much dimmer? These issues will be extremely important for considerations of strategic stability. Examining specific technology paths in light of such questions makes possible alternatives seem more concrete, but caution is warranted. Given that much more technology will emerge in the years ahead and many paths will be dead-ends, the strategy of “learning to fish” rather than “receiving a fish” is likely more valuable. More than picking winners, we must try to understand the game.

We often misjudge how steep the classic learning or performance curves for a given technology will be. Early enthusiasts may overestimate progress only to underestimate it later. This pattern of exaggeration followed by underestimation is as common among experts as it is among the “talking heads” and is an important amplifier of technological surprise.

Other amplifiers include the growing portfolios of technologies near application that are then packaged by others differently than we might anticipate. These “latent” technologies are often open to many players. Thus, the short lead times to implementation and diverse packaging almost guarantee that multiple players will surprise each other.

In a sense, we are looking at how science fiction today, often speculating from basic science facts, becomes applied technology in the future. Science fiction books and films frequently go beyond the possible, but science fiction sometimes points toward what becomes real. Consider the atomic bomb images of Robert Cromie (1895) and H. G. Wells (1914).15 Or science fiction may become approximately true by analogy or function. Consider Sir Arthur C. Clarke’s wormhole camera that could look back in time.16 Consider also that expensive, exclusive, limited capabilities available only to a few large governments today eventually may become cheap, ubiquitous, multifunction capabilities for millions of individuals in the future.

For example, modern, mobile smart phones with digital cameras, computers, sensors, global positioning system (GPS), packet switching, and Internet access look back not just to science fiction but to technologies funded not that many years ago through DARPA for the DoD.

A look at categories one might find in any taxonomy of technology maturity might be useful. Even “impossible science” bounds problems and provides insights. From the perspective of even the most theoretical science, the wormhole camera postulated by Sir Arthur C. Clarke is emphatically fiction.17 No one can be certain his wormholes really exist, and the idea of a consumer camera that could exploit such a cosmological speculation to look into the past seems out of this millennium. Nevertheless, the use of staring sensors far more advanced than the security cameras found at automatic teller machines (ATMs) and in parking lots to document and revisit past events and patterns is now commonplace. New networked, highly sensitive, multispectral, mobile, and often miniature sensors and surveillance systems will acquire immense data that must be processed by high performance computers whose capabilities are currently growing faster than Moore’s law.18 This has important implications in the decades ahead for delivery platforms such as aircraft, submarines, and mobile missiles that rely on location uncertainty or stealth for their survival and/or effectiveness.
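To make that pace concrete, the rough calculation below (a Python sketch; the 24- and 18-month doubling periods are common shorthand readings of Moore’s law, used here only as illustrative assumptions) shows how processing capability compounds over the 20- to 40-year horizon considered in this section.

```python
# Illustrative compound-growth calculation under an assumed fixed doubling
# period for processing capability. The doubling periods are assumptions
# for illustration, not figures taken from this text.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Capability multiple after `years`, given a fixed doubling period."""
    return 2.0 ** (years / doubling_period_years)

for years in (10, 20, 40):
    print(f"{years:>2} years: "
          f"x{growth_factor(years, 2.0):,.0f} at 24-month doubling, "
          f"x{growth_factor(years, 1.5):,.0f} at 18-month doubling")
```

Even modest differences in the assumed doubling period compound into orders-of-magnitude differences over such a horizon, which is part of what makes long-range forecasts of sensing and processing capability so uncertain.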

Categories overlap and technologies move between categories. Science fiction’s canonical “death rays” were once, at best, theoretical science. They are now breakthrough science as, for example, high-energy lasers for industrial purposes approach power levels necessary for effective weapons. High-energy lasers passed through a phase in which they were extrapolated S&T as militaries speculated about the future after seeing so many low-powered lasers on the battlefield.

Light Emitting Diodes (LEDs) are an enabling industry rapidly becoming a ubiquitous technology in the quest to reduce demand for electricity, much of which is provided by carbon fuels linked to other strategic issues such as overseas energy dependence and climate change. At the same time, LEDs are increasingly components of military systems and may provide another example of a commercial enabling industry with military applications. Other enabling industries include rapid prototyping, agile and additive manufacturing, and, in the chemical industry, flow-process micro-reactors, all potentially dual-use technology that is becoming globally accessible.

Strategic stability can also be influenced by technologies far short of the cutting edge. Nuclear reactors became status symbols for a number of emerging nation-states and remained so even after new technologies stalled and the economics of nuclear power turned dim. In some cases, the nuclear technologists recruited for what turned out to be disappointing domestic nuclear power programs emigrated to the West or turned to other fields. Some, however, became involved in nuclear weapons work of proliferation concern.

Biotech is a new status technology, all the more worrisome because, as with the chemical industry, controversial activities often migrate out of rule-of-law democracies to avoid regulation or “NIMBY” (not in my backyard) objections. Like the boy who cried “Wolf!” the bio-security community warns again and again that biological weapons are becoming WMD that could be available to small groups or individuals and certainly to most nations of concern. A few attempts to use biological weapons have taken place, but the WMD biological “wolf” has not yet struck. It could. What if it does? Similarly, a blurring between cyber crime and cyber warfare is taking place as information technology hubs grow in troubled regions. Does our interdependent networking give us greater redundancy and robustness or more common modes of failure?

The geostrategic impact of status technologies is uneven. Nuclear energy provided political top cover for covert weapons programs in India, North Korea, and Iran, but the tragedy of Chernobyl, Ukraine, may have been a catalyst that accelerated the end of the Soviet Union. Failed chemical and biological terrorism by the cult Aum Shinrikyo brought about its suppression and the successful prosecution of its leadership and has mobilized governments, industry, and science organizations to revisit rules of responsible science.

Students of strategic stability often focus on monopoly technologies such as stealthy aircraft like the F-117A fighter bomber and the B-2 strategic bomber or the hypersonic boost-glide vehicle, looking to see when they will become oligopolistic technologies, available to a number of the great and rising powers. In time, these may become new baseline technologies in the same way that unmanned aerial vehicles (UAVs) are spreading even to nonstate actors. When a technology spreads, however, it may not be of the same value to different players. Whatever their military value, difficult-to-detect explosives are an asymmetric technology of particular value to terrorists. Nonlethal weapons are often criticized as asymmetric advantages for intervention or suppression. Advances and constraints on technology do not affect all players equally, and this too can create instabilities.

Most technology that may ultimately influence strategic stability contributes incrementally and as components of systems, not as dramatic “silver bullets.” Accretion technologies where use builds up over time, such as the vacuum tube, the transistor, and the solid state micro-chip, have radically transformed weapons and war, yet they are seldom seen to alter stability calculations, except perhaps to the degree that they may become massively vulnerable to cyber attack or to nuclear weapons effects such as electromagnetic pulse (EMP).19

Some of these embedded technologies, however, can have highly leveraging effects, including:

• “Butterfly effects,” wherein small changes in initial conditions result in radically different outcomes (see the sketch following this list),20

• “Horseshoe Nail effects,” wherein a small loss under the wrong conditions yields a large undesirable outcome,21 and

• “Transmutation effects,” wherein accumulation of small improvements in quality may morph into a major new level of performance.22
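To illustrate the first of these effects, the short sketch below uses the logistic map, a standard textbook toy model of sensitive dependence on initial conditions; it is purely illustrative and not drawn from this monograph. Two trajectories that begin almost identically diverge completely within a few dozen steps.

```python
# Illustrative sketch of a "butterfly effect" using the logistic map,
# x_{n+1} = r * x_n * (1 - x_n), iterated in its chaotic regime (r = 3.9).
# This is a generic demonstration of sensitivity to initial conditions,
# not anything specific to the monograph.

def logistic_map(x0, r=3.9, steps=40):
    """Return the trajectory [x_0, x_1, ..., x_steps]."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

baseline = logistic_map(0.200000)   # reference initial condition
perturbed = logistic_map(0.200001)  # perturbed by one part in a million

for n in (0, 10, 20, 30, 40):
    gap = abs(baseline[n] - perturbed[n])
    print(f"step {n:2d}: {baseline[n]:.6f} vs {perturbed[n]:.6f}  (gap {gap:.6f})")
```

The point is not the toy model but the sensitivity it displays: in tightly coupled systems, differences far below the threshold of measurement can come to dominate outcomes.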

The “Y2K millennium bug” provides some insight into the implications of such highly leveraging effects.23 Y2K glitches proved far less serious than some had predicted, but reports spotlighted problems in embedded microchips in older military systems and in government-procured equipment that was not from the larger civilian marketplace. The Y2K experience accelerated interest within the Pentagon in the advantages of using “commercial off-the-shelf” (COTS) procurement to obtain economy-of-scale price advantages and also the quality advantages of dynamic competition among ever more mature technologies. High volume sales can also permit more quality evaluation in a greater variety of environments. Low volume procurements and deployments can complicate both quality control and risk assessment.
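The mechanics behind many Y2K glitches were mundane two-digit-year arithmetic, as the minimal sketch below suggests; the function name and values are hypothetical, invented here only for illustration.

```python
# Hypothetical illustration of the two-digit-year arithmetic behind many
# Y2K glitches; the function name and values are invented for this sketch.

def years_since_install(install_yy: int, current_yy: int) -> int:
    """Naive equipment-age calculation using two-digit years."""
    return current_yy - install_yy

# In 1999 ("99"), a component installed in 1995 ("95") reads as 4 years old.
print(years_since_install(95, 99))   # 4
# In 2000 ("00"), the same arithmetic yields -95, which downstream
# maintenance or scheduling logic may misinterpret or reject.
print(years_since_install(95, 0))    # -95
```

Harmless in isolation, such a result could cascade through maintenance scheduling or logistics software, which is one reason embedded chips in older, low-volume military systems drew particular scrutiny.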

On the other hand, the very civilian electronics being used by the military to modernize more cheaply and quickly through COTS may have vulnerabilities in a hostile military environment. In the nuclear context, civilian electronics, even those ruggedized for rough consumer use, are seldom hardened against EMP and thus may be vulnerable to high altitude nuclear detonations whose blast, heat, and other radiation effects otherwise may not reach close to the earth. Also, cyber hacking that can be expensive to financial institutions in peacetime could be devastating in command and control systems in time of war. In short, technology developments, ranging from weapons themselves to components of nonweapons, can advance strategic stability or militate against it, depending on scenarios.