Combat Zones that ‘See’ Everything
In his recent thriller Prey, Michael Crichton describes an application of nano-technology running amok. In his story, literally millions of molecule-size machines learn to function as a “swarm” that seems to take on an identity greater than the sum of its parts, with dire consequences for any nearby living thing.
In real life, the underlying science is known as nano-technology. Research has not yet reached the level Crichton describes; we are still experimenting with basic nano-technology principles, and where it will lead is anyone’s guess.
In his science fiction epic, A Deepness in the Sky, Vernor Vinge wrote of a future where microscopic machines with unique specialized functions literally performed most of the tasks currently ascribed to manufacturing, processing, and distribution, and – more to the point – data collection.
Data collection surrounds us in the modern world.
Look around you in any public place: the corporate lobby, the bank, the library, the school classroom – even many street corners. Discreetly placed video cameras scan our every move and record our every act.
Civil libertarians often decry what they perceive as an unnecessary invasion of privacy, but in today’s security-conscious world, most people take comfort in the presence of these ubiquitous eyes.
For the average citizen, this kind of surveillance goes unnoticed, except when it impacts us directly, as when a driver is cited for running a red light captured on an ever-present video monitor. For the soldier fighting a street-to-street campaign, however, such pervasive, automated surveillance could mean the difference between life and death.
This concept has not been lost on Pentagon planners. The Defense Advanced Research Projects Agency (DARPA) recently issued Broad Agency Announcement (BAA) 03-15 as a Proposer Information Pamphlet titled Combat Zones That See – CTS for short.
CTS is far-reaching in concept and scope. The Objective paragraph of BAA 03-15 states:
“[CTS] explores concepts, develops algorithms, and delivers systems for utilizing large numbers (1000s) of cameras to provide the close-in sensing demanded for military operations in urban terrain. Automatic video understanding will reduce the manpower needed to view and manage this monumental collection of data and reduce the bandwidth required to exfiltrate the data to manageable levels. The ability to track vehicles across extended distances is the key to providing actionable intelligence for military operations in urban terrain. [CTS] will advance the state of the art for multiple-camera video tracking to the point where expected track lengths reach city-sized distances. Trajectories and appearance information, resulting from these tracks, are the key elements to performing higher-level inference and motion pattern analysis on video-derived information. [CTS] will assemble the video understanding, motion pattern analysis, and sensing strategies into coherent systems suited to Urban Combat and Force Protection.”
In today’s world, this means deploying small video cameras unobtrusively in selected locations and feeding their data into a smart system that intelligently evaluates the overwhelming amount of raw information to produce a useful stream of intelligence. But the intent of BAA 03-15 goes far beyond this.
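The “automatic video understanding” that BAA 03-15 calls for is, at its simplest, a filter that decides which frames are worth transmitting at all. As a minimal, hypothetical sketch of that idea (the frame format, pixel threshold, and function names below are my own illustration, not anything specified in the BAA), a system could forward only frames in which enough pixels have changed since the previous frame:

```python
def changed_fraction(prev, curr):
    """Fraction of pixels that differ between two equally sized grayscale frames."""
    total = len(prev) * len(prev[0])
    changed = sum(
        1
        for row_a, row_b in zip(prev, curr)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > 10  # per-pixel noise threshold (illustrative assumption)
    )
    return changed / total

def frames_worth_sending(frames, threshold=0.05):
    """Return indices of frames whose change versus the previous frame
    exceeds the threshold -- i.e., the only frames worth exfiltrating."""
    keep = []
    for i in range(1, len(frames)):
        if changed_fraction(frames[i - 1], frames[i]) > threshold:
            keep.append(i)
    return keep
```

A real system would use far more sophisticated motion and object tracking, but even this crude frame-differencing illustrates how thousands of mostly static camera feeds could be reduced to “manageable levels” of bandwidth.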
What about miniature unmanned aircraft similar to the radio-controlled model airplanes flown by hobbyists? Or similar craft guided internally by intelligent analysis of incoming data? How about a squadron of these small aircraft coordinated through a distributed network carried aboard the units themselves?
Combine these airborne resources with similarly controlled miniature ground units: little self-contained vehicles that can lurk in appropriate spots, spying on the activities around them.
The key to all this is the first-order intelligent machine processing that condenses the raw data into something usable and flags especially significant activity.
DARPA anticipates awarding $4 million per year for the first four years – all to a single contractor who can demonstrate the ability to meet the program goals detailed in BAA 03-15. Interestingly, DARPA included an additional unstructured goal: the selected contractor must develop the original concept beyond the anticipated structure outlined in BAA 03-15. In effect, the selectee must go where no one has gone before – and be very convincing about it.
A confidential source has revealed to me that an entrepreneurial firm consisting of several bright guys, some with military experience, is homing in on an outside-the-box solution reminiscent of Crichton’s Prey and the nano-machines of modern science fiction. What these guys are developing will revolutionize not only CTS, but the underlying fabric of ordinary life as we know it.
The group is on the verge of demonstrating a “swarm” of near-microscopic nano-devices that function in concert with each other to produce either a visual image or an audio picture of whatever is in the vicinity of the “swarm.” The more nano-devices present, the greater the fidelity and resolution of the image they can resolve and transmit. These little particles can be made “sticky,” so that they attach themselves to almost anything: a wall, a tree, a vehicle, even a person’s clothing. They may even be made selectively sticky, so that they stick to something only when told to do so.
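The claim that fidelity scales with swarm size can be illustrated with a toy coverage model (my own simplification, not anything from my source): treat the scene as a grid of cells, let each nano-device land on one random cell, and measure what fraction of the scene is observed:

```python
import random

def swarm_coverage(width, height, n_devices, seed=0):
    """Fraction of a width x height scene 'seen' when each nano-device
    lands on a uniformly random cell -- a deliberately crude model."""
    rng = random.Random(seed)
    seen = {(rng.randrange(width), rng.randrange(height))
            for _ in range(n_devices)}
    return len(seen) / (width * height)
```

With the same seed, a larger swarm’s landing sites are a superset of a smaller one’s, so coverage can only grow as devices are added. Real resolution would also depend on placement, sensor quality, and how the devices combine their individual readings.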
Envision this battlefield scenario in the near future:
Large fans are stationed outside the city limits of an urban target that our guys need to take. Upon an appropriate signal, what looks like a dust cloud emanates from each fan. The cloud is blown into town, where it quickly dissipates. After a few minutes of processing by laptop-size processors, a squadron of small, disposable aircraft ascends over the city. The little planes dive into selected areas, determined by the initial analysis of data transmitted by the fan-propelled swarm, and disperse their nano-payloads. After this, the processors get even busier.
Within minutes, the mobile tactical center has a detailed visual and audio picture of every street and building in the entire city. Every hostile has been identified and located. From this point on, nobody in the city moves without the full and complete knowledge of the mobile tactical center. As blind spots are discovered, they can quickly be covered by dispersing additional nano-devices.
Unmanned air and ground vehicles can now be vectored directly to selected targets to take them out, one by one. Those enemy combatants clever enough to evade the unmanned units can then be captured or killed by human elements guided directly to their locations, with full and complete knowledge of their individual fortifications and defenses.
In fact, since the combat soldier of the near future will almost certainly be outfitted with complete wearable computerized armor, including the real-time ability to see virtual images projected in front of him, on his visor, or in some other highly interactive manner, each warrior will have a detailed tactical picture of his entire surroundings, with threat points appropriately identified and keyed to his ability to eliminate them.
Does it sound like science fiction? Sure – for right now. But according to my source, all this is coming, and sooner than you think. He says, regretfully, that right now these nano-devices transmit only monochromatic images.
So what? Only black-and-white real-time images, plus sound, of every enemy combatant in town? This is a shortcoming?
I don’t think so!
When the dust settles on the competitive bidding for BAA 03-15, and after the first prototypes are delivered several years from now, our guys are in for a mind-boggling treat at the expense of the bad guys.