Industry-Leading Sensor to Show the Way Amid Simulated Disasters
Morgan Hill, Calif. (Dec. 10, 2014) – RoboSimian, a headless, ape-like robot, is now in training for the 2015 DARPA Robotics Challenge – and looks to be one “primate” with remarkable vision.
Credit goes in part to Velodyne LiDAR, whose HDL-32E LiDAR (Light Detection and Ranging) sensor will ride atop the unit. The robot – created by NASA’s Jet Propulsion Laboratory for the 2015 DARPA Robotics Challenge (DRC), a competition consisting of several disaster-related tasks for robots to perform – uses Velodyne’s spinning LiDAR sensor as a key element of its perception system. The sensor, which can rotate a full 360° up to 20 times per second, enables the robot to look between 10° up and 30° down. In the trials round last December, the JPL team won a spot to compete in the finals, which will be held in Pomona, Calif., in June 2015.
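To make that field of view concrete, here is a minimal Python sketch – purely illustrative, with an assumed point format and function name rather than anything from JPL's or Velodyne's software – that keeps only the returns falling inside the sensor's stated +10°/-30° vertical window:

```python
import math

# Illustrative sketch (not competition code): keep only LiDAR returns that fall
# inside the HDL-32E's stated vertical field of view of +10 deg to -30 deg.
# Points are assumed to be (x, y, z) tuples in meters with the sensor at the origin.
VERT_FOV_UP_DEG = 10.0
VERT_FOV_DOWN_DEG = -30.0

def in_vertical_fov(point):
    x, y, z = point
    horizontal_range = math.hypot(x, y)   # distance in the horizontal plane
    if horizontal_range == 0.0:
        return False
    elevation_deg = math.degrees(math.atan2(z, horizontal_range))
    return VERT_FOV_DOWN_DEG <= elevation_deg <= VERT_FOV_UP_DEG

# A point 5 m ahead and 1 m below the sensor sits at about -11 deg, so it is visible.
print(in_vertical_fov((5.0, 0.0, -1.0)))  # True
```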
In the finals, the robot will be faced with such tasks as driving a vehicle and getting in and out of it, negotiating debris blocking a doorway, cutting a hole in a wall, opening a valve and crossing a field with cinderblocks or other debris. Organizers have also promised a surprise task.
RoboSimian moves around on four limbs, making it best suited to travel over complex terrain, including true climbing. Each robot in the Challenge has an "inventory" of objects with which it can interact. Engineers have to program the robots to recognize these objects and perform pre-set actions on them, such as turning a valve or climbing over blocks. See: https://www-robotics.jpl.nasa.gov/roboticVideos/vid1016-152-video.mp4, https://www-robotics.jpl.nasa.gov/roboticVideos/vid1016-154-video.mp4.
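As a rough illustration of how such an inventory might be represented in software – the object names, actions and structure below are hypothetical, not taken from JPL's or DARPA's code – consider:

```python
# Hypothetical sketch of the "inventory" idea: a lookup from a recognized object
# to the pre-set action the robot has been programmed to perform on it.
INVENTORY = {
    "valve": "rotate_valve",
    "cinderblock": "climb_over",
    "door": "open_door",
    "cutting_tool": "cut_wall_opening",
}

def action_for(detected_object):
    """Return the pre-set action for a recognized object, if any."""
    return INVENTORY.get(detected_object, "no_action_defined")

print(action_for("valve"))      # rotate_valve
print(action_for("fire_hose"))  # no_action_defined
```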
"The NASA/JPL robot was developed expressly to go where humans could not, so the element of sight – in this case, LiDAR-generated vision – is absolutely critical,” said Wolfgang Juchmann, Ph.D., Velodyne Director of Sales & Marketing. “We’re recognized worldwide for developing real-time LiDAR sensors for all kinds of autonomous applications, including 3D mapping and surveillance as well as robotics. With a continuous 360-degree sweep of its environment, our lightweight sensors capture data at a rate of almost a million points per second, within a range of 100 meters – ideal for taking on obstacle courses, wherever they may be.”
JPL researchers, working with partners at the California Institute of Technology and the University of California, Santa Barbara, are currently focused on getting RoboSimian to walk more quickly.
About the DARPA Robotics Challenge
According to the Department of Defense, some disasters prove too great in scale or scope for a timely and effective human response because of the grave risks they pose to the health and well-being of rescue and aid workers. The DARPA Robotics Challenge (http://www.theroboticschallenge.org) seeks to address the problem by promoting innovation in human-supervised robotic technology for disaster-response operations. The primary technical goal of the DRC is to develop human-supervised ground robots capable of executing complex tasks in dangerous, degraded, human-engineered environments. Competitors in the DRC are developing robots that can utilize standard tools and equipment commonly available in human environments, ranging from hand tools to vehicles. To achieve its goal, the DRC is advancing the state of the art of supervised autonomy, mounted and dismounted mobility, and platform dexterity, strength, and endurance. Improvements in supervised autonomy, in particular, aim to enable better control of robots by non-expert supervisors and to allow effective operation despite degraded communications (low bandwidth, high latency, intermittent connection).

The California Institute of Technology manages JPL for NASA.
About Velodyne LiDAR
Founded in 1983 and based in California’s Silicon Valley, Velodyne Acoustics, Inc. is a diversified technology company known worldwide for its high-performance audio equipment and real-time LiDAR sensors. The company’s LiDAR division evolved after founder and inventor David Hall competed in the 2004-05 DARPA Grand Challenge using stereovision technology. Based on his experience during that challenge, Hall recognized the limitations of stereovision and developed the HDL-64E high-resolution LiDAR sensor. More recently, Velodyne has released its smaller, lightweight HDL-32E sensor, available for many applications including UAVs. Since 2007, Velodyne’s LiDAR division has emerged as a leading developer, manufacturer and supplier of real-time LiDAR sensor technology used in a variety of commercial applications, including autonomous vehicles, vehicle safety systems, 3D mobile mapping, 3D aerial mapping and security. For more information, visit www.velodynelidar.com. For the latest information on new products and to receive Velodyne’s newsletter, register here.