U.S. Army foresees robots becoming squad members

Autonomous, bomb-sensing vehicles and personal robotic assistants could transform teams of soldiers

July 25, 2013 06:00 AM ET

Computerworld - The U.S. Army wants to move from using robots as tools to creating a human-robot cooperative that will make machines trusted members of the military.

"The issue is can I have a squad augmented with robots cover more ground, be more effective, do more things on a 72-hour operation than they can today?" asked Lt. Col. Stuart Hatfield, Branch Chief of Soldier Systems and Unmanned Ground Systems with the Army. "We see a transition from a dumb robot being a tool to it becoming a member of the team. Do I have a robot that carries my stuff, or do I have a robot that is a member of the squad?"

The Army is using robots, such as semi-autonomous vehicles, mainly as dumb tools. However, the military has a different vision of the future of robotics and how the machines will fit in a soldier's life at home base, as well as on the battlefield.

Segway 440X robot

In 20 to 40 years, humanoid robots, using human tools, could precede soldiers into dangerous areas, performing tasks such as turning a wrench to open valves, opening doors and climbing ladders. Some day, the Army might send autonomous robots into battle to physically engage with the enemy.

While that scenario is likely decades away, the Army is working on semi-autonomous vehicles that can lead convoys and scan for IEDs (improvised explosive devices), robotic exoskeletons that can help soldiers move faster and longer, and wheeled robots that can carry soldiers' heavy packs, freeing troops to be more agile and less fatigued.

"It's about maintaining overmatch," Hatfield said. "The saying is, 'We never want to go into a fair fight.' You want everything to your advantage. If you're wearing 40 or 50 pounds of body armor, and you have 100 pounds on your back and you're chasing a guy in flip flops up a hill, you're at a disadvantage already. We want to lighten the load for the soldier."

The Army has tried to trim some of that weight, making lighter body armor, helmets and night-vision sensors, but it has hit a wall with that effort.

"We're at a place where there's not a lot more to save, and soldiers are still carrying 100-to-120-pound backpacks," Hatfield said. "Soldiers can carry a certain amount so they're forced to make trades. Do I carry extra ammo or water? If we can no longer lighten the load that a soldier has to carry, we have to look at off-loading that load for him."

If the soldier doesn't have to worry about how much he can carry, he can take extra water and ammunition, along with an extra pair of night-vision sensors.

The Army's plan is for robots to make that possible.

Last year, for instance, the Army deployed four Lockheed Martin Squad Mission Support Systems to support a squad in Afghanistan as a test. Each semi-autonomous robot, derived from a six-wheel all-terrain vehicle with a flat bed and no seats, was designed to carry 1,200 pounds of soldiers' gear.

The test had mixed results.

Hatfield reported that "soldiers being soldiers," the troops loaded up the robotic vehicles with as much as 4,000 pounds of gear and then complained the vehicles weren't fast enough.

While the vehicles were designed to be able to drive themselves, the soldiers never let them run autonomously.

"As they moved around the battlefield, they used it to carry sand bags and pickets and wire and water," Hatfield said. "The soldier had to use a controller to drive it forward... They did not trust it, so they never used the autonomous aspect. They didn't want it running over someone."

A large vehicular robot follows a squad, potentially carrying ammunition, water and other supplies. (Photo: 5D Robotics)

Greg Hudas, the Army's chief engineer for ground vehicle robotics, told Computerworld that soldiers' trust is critical to having robots work with the squads.

"If the soldier doesn't trust it as a teammate, the soldier won't use the technology and we're back to square one," Hudas said. "There has to be an element of trust. Those squads are very delicate structures. The machines have to fit in perfectly."

The Army's vision is to make robotics part of the unit, but that is going to take trust and a whole new level of human-robot cooperation.

"We want to make it seamless. We want to make a robot an actual squad member," Hudas said. "And whether it's a human or a machine, we want to make it transparent. Each member in a squad has a set of duties on a mission. If we replace a squad member with a robot, we want people to feel comfortable with the robot acting as a teammate. That involves some trust and performance issues. That robot has to be able to keep up with them."

The Army is working on robots that would serve in critical, potentially life-saving roles.

One project is an autonomous vehicle that would lead military convoys and scan the road for IEDs. If the lead vehicle finds an explosive, another robot would dig it out, protecting the soldiers farther back in the convoy from deadly explosions, Hudas said.
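
As a rough, hypothetical illustration of the workflow Hudas describes, in which a lead vehicle flags suspect objects and a second robot is tasked to each one, the sketch below shows one way that hand-off could be structured. The field names and the confidence threshold are assumptions made for illustration, not a real Army interface.

```python
# Hypothetical sketch of the convoy workflow described above: a lead vehicle
# flags suspected IEDs along the route and a disposal robot is tasked to visit
# each one. Field names and the 0.6 confidence threshold are illustrative
# assumptions, not a real Army interface.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    position_m: float   # distance along the route, in meters
    confidence: float   # how sure the sensor is that the object is an IED


def plan_disposal_tasks(detections: List[Detection],
                        min_confidence: float = 0.6) -> List[float]:
    """Return route positions the disposal robot should visit, nearest first."""
    suspects = [d.position_m for d in detections if d.confidence >= min_confidence]
    return sorted(suspects)


if __name__ == "__main__":
    scans = [Detection(120.0, 0.9), Detection(340.0, 0.4), Detection(80.0, 0.7)]
    print(plan_disposal_tasks(scans))  # -> [80.0, 120.0]
```

In a scheme like this, the convoy would simply hold short of the nearest unresolved position until the disposal robot reports it clear.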

The Army also is working on semi-autonomous vehicles that would allow a driver to step outside the vehicle or perform other tasks while the vehicle takes over driving, Hudas said.

"We need to test for reliability and failure and see how the humans interact with it," he added. "It's not only about the human interaction with the machine but the machine needs to interact with the human."

For instance, the vehicle might be driving itself when it encounters an obstacle that it's not sure how to get around. If the robot warns the driver that it needs help but the soldier isn't able to take over, the vehicle needs to know when it has to handle the situation itself.

"We're looking at the vehicle being able to decide when to assume responsibility," said Hudas. "We're looking into the problem of the machine understanding the consciousness of humans. Are they drowsy or are they so intent on another task that if they take control of the vehicle, will it be dangerous? The interaction needs to be tightly coupled between the human and the machine."

Hudas said the Army is probably five to 10 years away from having a robotic vehicle that could make its own decisions.

To get some of the "smarts" into the robots, the Army is working with 5D Robotics Inc., a robotics software company, which in turn is working with DRS Technologies and Segway Inc. 5D said it is trying to integrate human behaviors into robots such as robotic assistants that carry soldiers' packs and small wheeled robots, about the size of a large shoebox, that can carry cameras into dangerous areas.
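
In very simplified form, a "carry the packs and follow the squad" behavior of the kind 5D describes can be pictured as a control loop that holds the robot at a fixed gap behind a tracked soldier. The sketch below is a generic proportional controller written for illustration; it is not 5D Robotics' actual software, and the gap, gain and speed limit are made-up numbers.

```python
# Simplified "follow the leader" behavior: keep the robot a set distance
# behind a tracked soldier. Generic illustration only; the 5 m gap, the gain
# and the 3 m/s speed cap are made-up values, not 5D Robotics' software.

def follow_speed(distance_to_leader_m: float,
                 desired_gap_m: float = 5.0,
                 gain: float = 0.5,
                 max_speed_mps: float = 3.0) -> float:
    """Proportional controller: speed up when falling behind, stop when too close."""
    error = distance_to_leader_m - desired_gap_m
    return max(0.0, min(max_speed_mps, gain * error))


if __name__ == "__main__":
    for gap in (12.0, 6.0, 5.0, 3.0):
        print(f"{gap} m behind -> drive at {follow_speed(gap):.2f} m/s")
```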

Jackie Fenn, an analyst with Gartner Inc., said the hardest part about building military robots will be making them able to move easily and quickly over tough, often dangerous, terrain. That, she added, will be harder to do than making them smart enough to act autonomously.

"I do like that notion of the robotic assistant," she said. "What work you can offload to robots is a very promising angle... But trust is critical. You really get that by having it work. When humans see that there are things they wouldn't be able to do without a robot, that's when the real change in thinking happens. If you can send a robot in to check out a building and keep a soldier back and safe, then that really adds value."

This article, U.S. Army foresees robots becoming squad members, was originally published at Computerworld.com.

Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin or subscribe to her RSS feed. Her email address is sgaudin@computerworld.com.

See more by Sharon Gaudin on Computerworld.com.

Read more about Emerging Technologies in Computerworld's Emerging Technologies Topic Center.

REALITY CHECK: Likely, super-intelligent computers will soon be telling humans what to do, killing them and assuming more and more autonomy, and eventually humans will become expendable and outdated in the future robot world. EDITOR - like in the movie Terminator.


Replies

  • Professor: Robots to Patrol Cities by 2040

    Paul Joseph Watson
    Infowars.com
    July 25, 2013

    Robots will be patrolling cities by 2040, according to Professor Noel Sharkey, who predicts their tasks will include asking for ID, tasering and arresting suspects, and crowd control.

    Image: DARPA

    In an article entitled "2084: Big robot is watching you," Sharkey, a robotics professor at the University of Sheffield, forecasts a world in which the jobs of surveillance, security and law enforcement have largely been handed over to artificial intelligence.

    Within the next 30 years, Sharkey asserts, "Humanoid walking robots would be more in use for crowd control at games, strikes and riots. Robots will patrol city centres and trouble spots where fights are likely to break out."

    “Robots will have reasonable speech perception and be able to ask questions and respond to answers. What is your ID number? What are you doing here? Move along. They may work in teams of tracked robots with non-lethal weapons (e.g. Tasers or nets) and be on call for diffusing difficult situations and arresting people,” adds Sharkey.

    In addition to performing more mundane tasks like checking tickets and throwing people out of events, robots will "be able to spray a crowd with RFID tag darts or some futuristic equivalent so that people can be tracked after the crowd has been dispersed," writes Sharkey.

    By 2070, the professor predicts that robots will take on a human appearance and will be able to deploy swarm intelligence technology that will "make escape from capture impossible." Robot police cars will also roam streets, scanning license plates and deducting speeding fines from bank accounts automatically.

    Sharkey has become a prominent voice in warning about how DARPA's robots, which are ostensibly being developed for "humanitarian" and "emergency response" purposes, are in fact being designed to kill.

    “Of course if it’s used for combat, it would be killing civilians as well as it’s not going to be able to discriminate between civilians and soldiers,” Sharkey told the BBC.

    His warning has been echoed by Human Rights Watch, as well as former intelligence officer Lt. Col. Douglas Pryer, who wrote an essay warning of the threat posed by remorseless “killer robots” that will be used to stalk and slaughter human targets in the near future.

    Last year, experts at the prestigious University of Cambridge announced a project to conduct research into the “extinction-level risks” posed to humanity by artificially intelligent robots.

    As we reported in April, Pentagon scientists have already constructed a machine that functions like a human brain and would enable robots to think independently and act autonomously.

    Watch a recent TEDx lecture given by Sharkey in which he warns about how robots will inevitably be used as killing machines in the near future and will also "destabilize world security and trigger unintentional wars".

  • Technology would make robots “truly autonomous”

    Paul Joseph Watson
    Infowars.com
    April 11, 2013

    A Pentagon-funded team of scientists has constructed a machine that functions like a human brain and would enable robots to think independently and act autonomously.

    Image: YouTube

    Researchers for DARPA (Defense Advanced Research Projects Agency) have created a device that “looks and ‘thinks’ like a human brain,” James K. Gimzewski, professor of chemistry at the University of California, Los Angeles, told National Defense Magazine.

    The program is called “physical intelligence” and is capable, “without being programmed like a traditional robot, of performing actions similar to humans,” making it the first incarnation of a robot that can perform “truly autonomously” without human input.

    “What sets this new device apart from any others is that it has nano-scale interconnected wires that perform billions of connections like a human brain, and is capable of remembering information,” writes Sandra I. Erwin. “Each connection is a synthetic synapse. A synapse is what allows a neuron to pass an electric or chemical signal to another cell. Because its structure is so complex, most artificial intelligence projects so far have been unable to replicate it.”
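
    The basic idea behind a "synthetic synapse", a connection that strengthens when the units it links are active together and thereby retains information, can be illustrated with a textbook Hebbian weight update. The toy sketch below is a generic illustration of that principle, not a model of DARPA's device; the learning rate and decay values are arbitrary.

```python
# Toy Hebbian update: a synaptic weight strengthens when the two units it
# connects are active at the same time, which is how a web of synapses can
# "remember" correlations. A generic textbook illustration, not DARPA's
# device; the learning rate and decay are arbitrary values.

def hebbian_update(weight: float, pre: float, post: float,
                   learning_rate: float = 0.1, decay: float = 0.01) -> float:
    """Strengthen the weight in proportion to correlated activity, with slow decay."""
    return weight + learning_rate * pre * post - decay * weight


if __name__ == "__main__":
    w = 0.0
    for _ in range(20):        # repeatedly co-activate both units
        w = hebbian_update(w, pre=1.0, post=1.0)
    print(round(w, 3))         # the weight has grown: the connection "remembers"
```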

    The technology would allow drones to be created that do not need human operators, machines that would be able to learn and navigate through terrain completely of their own accord.

    According to Erwin, it is not yet confirmed whether the Pentagon will look to apply the technology to weapons systems.

    However, given the fact that the vast majority of DARPA’s work in robotics is geared towards creating an army of battlefield soldiers, it’s not a huge leap to make.

    Numerous experts have warned that robots currently being developed in the name of humanitarian assistance will ultimately be used to kill enemy soldiers and accused terrorists.

    Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, has repeatedly warned that the robots currently being developed under the auspices of DARPA will eventually be used to kill.

    “Of course if it’s used for combat, it would be killing civilians as well as it’s not going to be able to discriminate between civilians and soldiers,” said Sharkey.

    Last month, award-winning military writer and former intelligence officer Lt. Col. Douglas Pryer also wrote an essay warning of the threat posed by remorseless “killer robots” that will be used to stalk and slaughter human targets in the near future.

    In a 50-page report published last year, Human Rights Watch also warned that artificially intelligent robots let loose on the battlefield would inevitably commit war crimes.

    Last year, experts at the prestigious University of Cambridge announced a project to conduct research into the “extinction-level risks” posed to humanity by artificially intelligent robots.

    Flying drones that communicate with each other are also being developed for “hunting terrorists” and other “homeland security” purposes, as well as UAVs that could one day snatch humans off the street.

    The fact that DARPA’s latest robotic creation looks human is also not going to do anything to dampen fears about the “rise of the machines”.

    *********************

    Paul Joseph Watson is the editor and writer for Infowars.com and Prison Planet.com. He is the author of Order Out Of Chaos. Watson is also a host for Infowars Nightly News.
