MIT team creates NanoMap drone modelling system that accounts for uncertainty
Researchers created a system that accounts for uncertainty, giving drones a much higher level of reliability when flying in close quarters and avoiding obstacles.
When it comes to navigating challenging terrain at high speeds, drones are somewhat limited in their capabilities. Current programming options rely on intricate maps to tell drones where they are in relation to obstacles. However, in a real-world setting, this isn’t practical, especially in light of the unpredictable nature of life, where obstacles are not always fixed and can move without notice. And, with even a slight variation in location, a crash can occur.
A team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is looking to tackle this issue with their latest development, NanoMap. The system was designed to allow drones to fly consistently at 20 mph through dense environments.
The team’s key insight was to treat the drone’s position in the world over time as uncertain — the system can then model and account for that uncertainty. NanoMap uses a depth-sensing system to stitch together a series of measurements of the drone’s immediate surroundings, enabling the drone to plan motions within its current field of view and to anticipate how to move through the hidden fields of view it has already seen.
The system does this by essentially saving all of the images in its “brain” and recalling these individual snapshots in order to plan out motion paths.
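The idea of remembering past views and searching back through them can be pictured with a minimal sketch. This is an illustrative toy in 2D, not the authors’ actual implementation — `FrameHistory`, `add_frame`, and `nearest_obstacle` are hypothetical names, and real depth frames would also carry orientation:

```python
from collections import deque

class FrameHistory:
    """Toy NanoMap-style memory: a short history of depth 'snapshots',
    each tagged with how far the drone has moved since it was taken."""

    def __init__(self, max_frames=100):
        self.frames = deque(maxlen=max_frames)  # newest first

    def add_frame(self, obstacles, motion_since):
        # obstacles: list of (x, y) points seen in that frame's view;
        # motion_since: (dx, dy) the drone has translated since then.
        self.frames.appendleft((obstacles, motion_since))

    def nearest_obstacle(self, query):
        # Transform the query point into each stored frame's coordinates
        # and return the closest obstacle found across the whole history.
        best = None
        for obstacles, (dx, dy) in self.frames:
            qx, qy = query[0] + dx, query[1] + dy
            for ox, oy in obstacles:
                d = ((ox - qx) ** 2 + (oy - qy) ** 2) ** 0.5
                if best is None or d < best:
                    best = d
        return best
```

A motion planner would query points along a candidate trajectory against this history; even points outside the current field of view can be checked against an older frame that once saw them.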
“Overly confident maps won’t help you if you want drones that can operate at higher speeds in human environments,” says graduate student Pete Florence, lead author on a new related paper. “An approach that is better aware of uncertainty gets us a much higher level of reliability in terms of being able to fly in close quarters and avoid obstacles.”
To demonstrate the challenges of uncertainty, the team tested a number of different scenarios. For example, if NanoMap wasn’t modeling uncertainty and the drone drifted just five percent away from where it was expected to be, the drone would crash more than once every four flights. When it accounted for uncertainty, the crash rate dropped to two percent.
The team designed the system to map in more general terms, rather than relying on hundreds of different measurements. NanoMap operates under the assumption that, to avoid an obstacle, you simply need enough information to know that the obstacle is somewhere in a general area.
“The key difference to previous work is that the researchers created a map consisting of a set of images with their position uncertainty rather than just a set of images and their positions and orientation,” says Sebastian Scherer, a systems scientist at Carnegie Mellon University’s Robotics Institute. “Keeping track of the uncertainty has the advantage of allowing the use of previous images even if the robot doesn’t know exactly where it is and allows for improved planning.”
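One simple way to act on that tracked uncertainty is to demand extra clearance from obstacles seen in older frames, since position error accumulates the longer ago a frame was captured. The sketch below is an assumption-laden illustration (Gaussian, independent per-step drift), not the algorithm from the paper; `is_safe` and `accumulated_sigma` are hypothetical names:

```python
import math

def accumulated_sigma(per_step_sigma, steps):
    # Independent per-step drift adds in variance, so the position
    # standard deviation grows with the square root of the step count.
    return per_step_sigma * math.sqrt(steps)

def is_safe(distance_to_obstacle, drone_radius, sigma, k=3.0):
    # Require clearance beyond the drone's radius by k standard
    # deviations of the pose estimate (k=3 covers ~99.7% of a
    # one-dimensional Gaussian).
    return distance_to_obstacle > drone_radius + k * sigma
```

Under this scheme, an obstacle 2 m away is acceptable for a 0.3 m drone with 0.1 m of pose uncertainty, while the same obstacle at 0.5 m is rejected — exactly the kind of conservative margin that an overly confident map would not apply.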
Florence describes NanoMap as the first system that enables drone flight with 3D data where the drone considers that it doesn’t perfectly know its position and orientation as it moves through the world.
NanoMap is particularly effective for smaller drones moving through smaller spaces, and works well in tandem with a second system that is focused on more long-horizon planning.
“The researchers demonstrated impressive results avoiding obstacles and this work enables robots to quickly check for collisions,” says Scherer. “Fast flight among obstacles is a key capability that will allow better filming of action sequences, more efficient information gathering and other advances in the future.”
The paper was co-written by Florence and MIT professor Russ Tedrake alongside research software engineers John Carter and Jake Ware. It was recently accepted to the IEEE International Conference on Robotics and Automation (ICRA), which takes place in May in Brisbane, Australia.