
The next wave in robotic war: autonomous drones

by Dan De Luce

Agence France-Presse

U.S. military helicopter

Piloted U.S. military aircraft may soon be a thing of the past as technology for autonomous drones improves.

Credit: iStockphoto

WASHINGTON: The U.S. military’s current fleet of drones will soon be overtaken by a new wave of robots that will be faster, stealthier and smarter – operating virtually without human intervention, experts say.

The Pentagon is investing heavily in ‘autonomy’ for robotic weapons, with researchers anticipating squadrons of drones in the air, on land or at sea that would work in tandem with manned machines – often with a minimum of supervision.

“Before they were blind, deaf and dumb. Now we’re beginning to make them see, hear and sense,” Mark Maybury, chief scientist for the U.S. Air Force, said.

Unmanned aircraft are now overseen by ‘pilots’ on the ground, but as the drones become more sophisticated, the role of remote operators will be more hands-off.

From operator to supervisor

Instead of being “in the loop”, humans will be “on the loop”, said Maybury, explaining that operators will be able to “dial in” when needed to give a drone direction for a specific task.

“We’re moving into more and more autonomous systems. That’s an evolutionary arc,” said Peter Singer, an expert on robotic weapons and author of Wired for War.

“So the role moves from being sort of the operator from afar, to more like the supervisor or manager, and a manager giving more and more of a leash, more and more independence,” he said.

Despite the dramatic advances in technology, the American military insists humans will remain in control when it comes to using lethal force.

Lawyers aren’t ready for this

But the next generation of increasingly capable drones will stretch man’s capacity to control robots in battle, generating unprecedented moral and legal quandaries.

“These [technological] responses that are driven by science, politics and battlefield necessity get you into areas where the lawyers just aren’t ready for it yet,” Singer said.

Over the next decade, changes in computing power will enable teams of hi-tech drones to operate virtually on their own, or as ‘robotic wingmen’ to piloted aircraft, said Werner Dahm, the U.S. Air Force’s former top scientist.

At a testing range in the Arizona desert, Apache helicopters are flying together with unmanned choppers in experiments the Pentagon believes will serve as an eventual model for future warfare.

‘Alone and unafraid’ doctrine turned upside down

“We’re not far away from having a single piloted Apache or other helicopter system and a larger number of unmanned systems that fly with that,” said Dahm, a professor of mechanical and aerospace engineering at Arizona State University.

“These require very high levels of machine reasoning. We’re much closer to that than most people realise,” Dahm said.

The new technology has turned the U.S. Air Force’s doctrine upside down. For decades, the military trained pilots to face an enemy “alone and unafraid,” flying deep into hostile territory to strike at a target and then return home.

Now the Air Force is planning for scenarios in which different tasks would be divided up among manned and unmanned ‘systems’, with drones jamming enemy air defences, tracking targets and assessing bomb damage, while piloted warplanes oversee the launching of bombs and missiles.

“It’s difficult to prove something won’t go wrong”

Instead of the slow-flying turbo-prop Predator, future drones will likely more closely resemble their manned counterparts, with longer range, more powerful jet engines and a radar-evading stealth design – one the bat-winged Sentinel drone has already pioneered.

But the biggest technical hurdle for Pentagon-funded scientists is delivering an iron-clad guarantee that the more autonomous vehicles will not make a grievous mistake with potentially catastrophic consequences.

“You have to be able to show that the system is not going to go awry – you have to disprove a negative,” Dahm said. “It’s very difficult to prove that something won’t go wrong.”

Emotionless, ethical warriors

One veteran robotics scientist, Ronald Arkin, a professor at the Georgia Institute of Technology, believes that countries will inevitably deploy independent robots capable of killing an enemy without a human pushing a button.

Arkin, who has worked on U.S. defence programs for years, argues that robotic weapons can and should be designed as “ethical” warriors, with the ability to distinguish combatants from innocent civilians.

Without emotions to cloud their judgment and anger driving their actions, the robots could wage war in a more restrained, “humane” way, in accordance with the laws of war, Arkin said.

“It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield, but I am convinced that they can perform more ethically than human soldiers are capable of,” he wrote.

Perhaps the scariest article you’ll read all year (robots will soon control us all)


If this is the future of warfare and intelligence gathering, rest assured it won’t only be Washington doing it.

Last month philosopher Patrick Lin delivered this briefing about the ethics of drones at an event hosted by In-Q-Tel, the CIA’s venture-capital arm (via the Atlantic):

Let’s look at some current and future scenarios. These go beyond obvious intelligence, surveillance, and reconnaissance (ISR), strike, and sentry applications, as most robots are being used for today. I’ll limit these scenarios to a time horizon of about 10-15 years from now.

Military surveillance applications are well known, but there are also important civilian applications, such as robots that patrol playgrounds for pedophiles (for instance, in South Korea) and major sporting events for suspicious activity (such as the 2006 World Cup in Germany and the 2008 Beijing Olympics). Current and future biometric capabilities may enable robots to detect faces, drugs, and weapons at a distance and underneath clothing. In the future, robot swarms and “smart dust” (sometimes called nanosensors) may be used in this role.

Robots can be used for alerting purposes, such as a humanoid police robot in China that gives out information, and a Russian police robot that recites laws and issues warnings. So there’s potential for educational or communication roles and on-the-spot community reporting, as related to intelligence gathering.

In delivery applications, SWAT police teams already use robots to interact with hostage-takers and in other dangerous situations. So robots could be used to deliver other items or plant surveillance devices in inaccessible places. Likewise, they can be used for extractions too. As mentioned earlier, the BEAR robot can retrieve wounded soldiers from the battlefield, as well as handle hazardous or heavy materials. In the future, an autonomous car or helicopter might be deployed to extract or transport suspects and assets, to limit US personnel inside hostile or foreign borders.

In detention applications, robots could be used not just to guard buildings but also people. Some advantages here would be the elimination of prison abuses like we saw at Guantanamo Bay Naval Base in Cuba and Abu Ghraib prison in Iraq. This speaks to the dispassionate way robots can operate. Relatedly (and I’m not advocating any of these scenarios, just speculating on possible uses), robots can solve the dilemma of using physicians in interrogations and torture. These activities conflict with their duty to care and the Hippocratic oath to do no harm. Robots can monitor vital signs of interrogated suspects, as well as a human doctor can. They could also administer injections and even inflict pain in a more controlled way, free from malice and prejudices that might take things too far (or much further than already).