A famous viral video about the DARPA Robotics Challenge shows all sorts of humanoid robots clumsily falling down. Bipedal movement is inherently unstable, which is a problem not only for a robot trying to complete its task, but also because a fall can damage a very expensive piece of machinery.
Roboticists across the globe are tackling this problem in myriad ways. While some look to add a series of corrective steps after a robot is knocked off balance, much like a person stumbling after tripping, Kris Hauser wants robots to be able to use the environment around them to catch themselves.
“If a person gets pushed toward a wall or a rail, they’ll be able to use that surface to keep themselves upright with their hands. We want robots to be able to do the same thing,” said Kris Hauser, associate professor of electrical and computer engineering and of mechanical engineering and materials science at Duke. “We believe that we’re the only research group working on having a robot dynamically choose where to place its hands to prevent falling.”
While such decisions and actions are second nature to us, programming them into a robot’s reflexes is deceptively difficult. To streamline the process and save computation time, Hauser programs the software to focus only on the robot’s hip and shoulder joints.
As long as the robot isn’t twisting as it falls, this reduces the problem to just three angles that the stabilization algorithm has to account for: foot to hip, hip to shoulder, and shoulder to hand. The robot must identify nearby surfaces within reach and then quickly calculate the best combination of angles to catch itself. The final solution minimizes impact when the robot’s hands make contact, and also minimizes the chance of its hands or feet slipping. The algorithm takes its best guess and then progressively optimizes it using a method called direct shooting.
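To make the idea concrete, here is a minimal sketch of that kind of computation. It is not the Duke group's code: it assumes a planar three-link model (foot to hip, hip to shoulder, shoulder to hand), a wall at an arbitrary distance, made-up link lengths and cost weights, and a very simplified view of the dynamics in which only the hip and shoulder are actuated. The structure, though, mirrors the description above: start from an initial guess, forward-simulate the fall (direct shooting), and optimize the joint motion so the hand meets the wall with low impact and little tendency to slip.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed link lengths (m): foot-to-hip, hip-to-shoulder, shoulder-to-hand.
L = np.array([0.8, 0.5, 0.6])
WALL_X = 1.0                      # vertical wall this far in front of the foot (assumed)
DT, STEPS, KNOTS = 0.02, 50, 5    # integration step, horizon, control knots
G = 9.81

def hand_state(q, dq):
    """Hand position and velocity for absolute link angles q (rad from vertical)."""
    x = np.sum(L * np.sin(q))
    y = np.sum(L * np.cos(q))
    # Velocity from the Jacobian of the hand position w.r.t. the three angles.
    vx = (L * np.cos(q)) @ dq
    vy = (-L * np.sin(q)) @ dq
    return np.array([x, y]), np.array([vx, vy])

def simulate(u):
    """Direct shooting: integrate forward under piecewise-constant hip/shoulder accelerations."""
    u = u.reshape(KNOTS, 2)
    q = np.array([0.3, 0.3, 0.3])     # the robot is already tipping toward the wall
    dq = np.array([1.0, 1.0, 1.0])    # initial angular velocities (rad/s)
    for k in range(STEPS):
        knot = min(k * KNOTS // STEPS, KNOTS - 1)
        # The leg falls like an inverted pendulum; the upper two links are
        # treated as directly accelerated by the hip and shoulder (a simplification).
        ddq = np.array([(G / L[0]) * np.sin(q[0]), u[knot, 0], u[knot, 1]])
        dq = dq + DT * ddq
        q = q + DT * dq
        pos, vel = hand_state(q, dq)
        if pos[0] >= WALL_X:          # the hand reaches the wall: contact
            return pos, vel, True
    return pos, vel, False

def cost(u):
    pos, vel, hit = simulate(u)
    if not hit:                       # never touched the wall: heavy penalty
        return 100.0 + 10.0 * (WALL_X - pos[0])
    impact = vel[0] ** 2              # velocity into the wall -> harder impact
    slip = vel[1] ** 2                # velocity along the wall -> slipping risk
    return impact + 2.0 * slip

# Start from a zero-acceleration "best guess" and progressively refine it.
u0 = np.zeros(KNOTS * 2)
res = minimize(cost, u0, method="Powell")   # derivative-free, tolerant of the contact switch
print("hand contact cost:", res.fun)
```

In a real controller this kind of optimization would have to finish in a fraction of a second, which is exactly why restricting attention to a handful of joint angles matters: fewer decision variables mean the shooting problem can be solved fast enough to act before the robot hits the ground.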
In its current state, the robot must have information about its environment fed to it; it cannot navigate on its own. But in the near future, Hauser plans to upgrade to a larger robot with its own camera sensors that let it see its surroundings.
“Hopefully by the end of the year we should be doing experiments with the robot actually working in a live obstacle course,” Hauser said. “Then we’ll be trying to have the robot both dynamically map what’s around it and reason about how to protect itself from falling in arbitrary environments.”