Collision avoidance is one of the most basic requirements for self-driving cars and robots, but some researchers argue that machines should be allowed to bend their safety routines slightly in order to operate smoothly.
Magnus Egerstedt, a robotics expert at Georgia Tech, put it this way: "When you put a lot of robots together, they try to avoid bumping into each other, and as a result, no one can move."
Once the safety behavior takes over, the robots all freeze in place, unable to move on to work elsewhere, as if each were sealed inside an invisible bubble.
Building on other studies of "robot swarm behavior," Egerstedt's team developed an algorithm that allows a small group of robots to move quickly through a confined space without crashing into one another.
Essentially, each robot reasons over a set of safe states and barrier certificates, with minimal disruption to its motion as the primary goal. The video below shows the system in action; notice how the four robots stay synchronized.
The principle is the same for eight robots working together, the researchers said. Even if one rogue robot refuses to follow the rules, the others adapt to its "wildness" while continuing to maintain their distance and pursue their own goals.
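The "minimal disruption" idea described above can be illustrated with a simple sketch. The paper's actual controller is not reproduced here; this is only an assumed toy version of the general barrier-certificate approach, in which each robot follows its nominal velocity unchanged whenever it is safe, and otherwise applies the smallest correction that keeps a safety condition satisfied (all function and parameter names below are hypothetical):

```python
import math

def safe_velocity(pos, other_pos, v_nom, d_safe=1.0, gamma=1.0):
    """Filter a nominal 2D velocity through a barrier-style constraint
    so the robot never closes inside d_safe of a (stationary) neighbour.

    Barrier function: h(x) = ||pos - other_pos||^2 - d_safe^2  (>= 0 when safe).
    Safety condition:  dh/dt + gamma * h >= 0,  i.e.  2*dx . v >= -gamma * h.
    The filtered velocity is the closest vector to v_nom that satisfies
    this half-plane constraint (a closed-form one-constraint QP projection),
    so the nominal plan is disturbed as little as possible.
    """
    dx = [pos[0] - other_pos[0], pos[1] - other_pos[1]]
    h = dx[0] ** 2 + dx[1] ** 2 - d_safe ** 2
    a = [2 * dx[0], 2 * dx[1]]          # constraint normal: a . v >= b
    b = -gamma * h
    slack = a[0] * v_nom[0] + a[1] * v_nom[1] - b
    if slack >= 0:
        return list(v_nom)              # nominal motion already safe: no change
    norm2 = a[0] ** 2 + a[1] ** 2
    # minimally disruptive correction: project v_nom onto the constraint boundary
    return [v_nom[0] - slack * a[0] / norm2,
            v_nom[1] - slack * a[1] / norm2]

# A robot at (2, 0) driving straight at a neighbour at the origin is
# slowed just enough to respect the barrier, not stopped outright:
print(safe_velocity((2.0, 0.0), (0.0, 0.0), (-1.0, 0.0)))  # [-0.75, 0.0]
```

In a swarm, each robot would apply such a filter against every nearby neighbour; because corrections are only as large as safety demands, the group keeps flowing instead of freezing inside the "invisible bubbles" described above.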
The safety record of self-driving cars has so far been excellent, albeit with a relatively limited sample size. Safety systems like this one could help keep accidents to a minimum as more self-driving cars reach public roads.
The team will present the research paper at the IEEE Conference on Decision and Control.
Source: Georgia Tech