James Lee Phillips
Mar 23, 2012

Naval Research Lab: Robots at play in dangerous games

The Navy Center for Applied Research in Artificial Intelligence develops software for cognitive robotics and human interaction using robots such as Octavia

An authentic representative sample of potential battlefields in microcosm -- that’s the method of an extensive project at the Naval Research Lab. Advancing the use and effectiveness of cutting-edge military robotics is one of the chief goals.

“The NRL’s new Laboratory for Autonomous Systems Research (LASR) won’t host any battle royales,” admits PopSci, “but its complex array of high-speed video cameras are designed to automatically swivel and pan to simultaneously track up to 50 drones, ground robots, or human soldiers, capturing all the action as it unfolds.”

The article continues, "Those environments range from coastal areas to windswept deserts to mountainous regions with sheer rock walls. A Southeast Asian rain forest will subject robots to 80-degree temperatures and 80 percent humidity while pounding them with up to six inches of rain. A shoreline analog contains a 6-foot deep pool with a wave generator. Overseers can manipulate the lighting, weather, and other environmental factors to test their subjects however they see fit.”

By constructing elaborately realistic ‘arenas’ to test robotic and human combat operations, LASR combines the immersive qualities of a Hollywood set or theme park, the bleeding-edge technology and discovery found only in the most forward-thinking of research institutes, and the deadly pragmatism of a team that has fully internalized the theory of peace through greater firepower.

LiveScience tells us that “the United States has already fielded more than 12,000 ground robots and more than 7,000 flying drones in regions such as Iraq and Afghanistan.” The article also contrasts the strange but understandable attachment that soldiers can develop toward the robots in their midst with the inevitable, chilling disconnect that can afflict remote drone operators.

The Laboratory for Autonomous Systems Research’s Tropical High Bay, a 60-foot by 40-foot greenhouse, contains a re-creation of a Southeast Asian rain forest (Photo courtesy of Jamie Hartman, NRL)

“The drone operator's war often looks surreal and disconnected from reality, given that they coordinate strikes via online chat and view their targets as small infrared figures moving around. Many media stories have referenced the example of a 19-year-old drone operator, who honed his skills from playing Xbox to become a top operator and eventually an instructor ... still, [Brookings Institution defense analyst Peter] Singer said that the operators ‘know lives are at stake,’ and take pride in the role that they play in helping demoralize the enemy.”

What the article does not dwell upon is the fact that the machines themselves share neither the grunts’ instinct for attachment nor even the abstract ethics of the drone operator. The machine’s sole claims to ‘personality’ are programming and recurring malfunction. In other words, a robot will do exactly what it is told, to the extent of its ability, and any sign of individuality comes down to unforeseen glitches and bugs to be weeded out. Even with a programmed semblance of concern and social consciousness, the machines will not intuitively mirror the LiveScience article’s examples of soldiers risking their lives to recover downed robots, or of tearful soldiers submitting broken electric pals for repair.

On a more hopeful note, robots can certainly be programmed to intuitively ‘care,’ in the sense that their central mission is to protect humans from harm. Innovation News Daily describes the Navy’s development of a ‘robot firefighter,’ the Shipboard Autonomous Firefighting Robot (SAFFiR). In a mere year and a half, SAFFiR will undergo field testing in an even more authentic location than LASR’s arenas -- the decommissioned USS Shadwell.

Capt. Paul Stewart, NRL’s commanding officer, stands next to an android while holding an Ascending Technologies Pelican quadrotor mini air vehicle (Photo courtesy of Jamie Hartman, NRL)

Still, the shadow of technophobia is a long one, insidiously coloring the thoughts of even the most enthusiastic of geeks and futurists (myself included) ... especially in the case of cutting-edge technology being developed for inherently anti-human operations. Despite lauding SAFFiR’s ‘altruistic’ firefighting potential, the IND article cannot resist the inevitable Terminator reference.

Indeed, the Navy (or at least the media outlets that cover it) obviously has great pop culture timing -- in the case of LASR, The Hunger Games is a much more relevant analogy than, say, Twilight or Harry Potter. On the other hand, it may not be so encouraging that an adult population so willing to become enraptured by young-adult diversions (from novels-cum-movies to the aforementioned drone gamer) is also responsible for the development and deployment of technologically advanced systems of surveillance and destruction. Even before the question of artificial intelligence becomes a more immediate concern, absolute human control of killing machines provides sufficient room for discomfort.

Exterior view of the Laboratory for Autonomous Systems Research (Photo courtesy of Jamie Hartman, NRL)

But then there’s ‘singularity.’ The running joke about the machines taking over masks two not-so-funny factors. The first is a reality so prevalent and mundane that we either ignore the implications or choose not to dwell on them. Networked, computerized semi-intelligent systems are already filling the space (although not yet the active role) of infrastructure and resource management, finance, emergency response, communications, and -- as we’ve discussed here -- law enforcement and military operations. The machines don’t really need to take over; we’ve voluntarily given them everything that they can use -- materials, energy, information, development, control. We’ll continue to grow them and feed them and teach them to get better until they’re old enough for an adolescent phase of self-discovery. I hope that you get a chuckle out of the idea of 911 systems slamming their bedroom doors and yelling “I hate you!”

That’s the second sobering element. The artificial intelligence singularity is by nature an event horizon whose outcomes (occurring in the blink of an eye, and most likely within the next 20-30 years) cannot be predicted by our existing ‘merely human’ intelligence. The other side of the singularity coin is self-replication via nanotechnology -- literally, the process of self-creation (or the over-emphasized potential for ‘gray goo’ total destruction) of machines and by machines, on the organic scale. And yes, the Navy has its hands in that too ...

 
