The "Robat" uses a biological bat-like acoustic approach to navigate, emitting sounds and extracting information from the returning echoes to map its novel surroundings.
By EYTAN HALON
Researchers at Tel Aviv University say they have developed the world's first fully autonomous bat-like robot that uses echolocation to navigate new environments.

The "Robat" uses a biological bat-like acoustic approach to navigate, emitting sounds and extracting information from the returning echoes to map its novel surroundings, thereby enabling it to negotiate obstacles and avoid dead ends.

The researchers equipped the robot with an ultrasonic speaker, serving as its mouth, which produces chirps at a frequency and rate typically used by bats, and with two ultrasonic microphones serving as its ears.

The Robat differs from previous studies in that it moves autonomously through its test environments, whereas other bat-like robots have been driven by users. The robot also maps the structure of the surrounding environment rather than mapping its own position in relation to that environment.

The robot's classification ability also enables it to evaluate whether an obstacle, such as a plant rather than a wall, can be bypassed. A similar classification process is critical for a real bat, assisting in both navigation and foraging for food.

The robot was built and developed by graduate student Itamar Eliakim, advised by Prof. Yossi Yovel and Dr. Gabor Kosa. The team's findings were published in the journal PLOS Computational Biology on Thursday.

“Our Robat is the first fully autonomous, bat-like biorobot that moves through a novel environment while mapping it solely based on echo information," said Eliakim, who successfully tested the robot in two outdoor environments at the Tel Aviv University Botanical Garden.

"This information delineates the borders of objects and the free paths between them. We’ve been able to demonstrate the great potential of using sound in future robotic applications.”

The robot, its developers say, could have great potential for future robotic applications, given the growing use of autonomous robots, which require new sensory approaches to plan routes and avoid objects in unknown environments.

"Today, robots primarily navigate using machine vision, with cameras and lasers. We have proved that it is possible to do interesting things with sonar too," explained Prof. Yovel.
"Vision is an excellent sense, but it has its defects. For example, if a robot is navigating in the dark, dust or smoke - such as underneath rubble or in a fire... This advancement is likely to have considerable ramifications for developing multisensory robots, just like humans have multiple senses."