
Researchers’ tiny vision system streams HD video from beetle’s back
By DE Staff
UW engineers’ low-power, low-weight camera wirelessly captures images from a beetle’s point of view.

A tiny camera backpack, developed by University of Washington researchers, riding aboard a Pinacate beetle. (Photo credit: Mark Stone/University of Washington)
“We have created a low-power, low-weight wireless camera system that can capture a first-person view of what’s happening from an actual live insect or create vision for small robots,” said senior author Shyam Gollakota, a UW associate professor in the Paul G. Allen School of Computer Science & Engineering. “Vision is so important for communication and for navigation, but it’s extremely challenging to do it at such a small scale. As a result, prior to our work, wireless vision has not been possible for small robots or insects.”
To reduce power consumption, the researchers used an ultra-low-power black-and-white camera mounted on a mechanical arm that pivots when a high voltage is applied. The camera and arm are controlled from a smartphone via Bluetooth at distances of up to 120 meters.
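To give a sense of how such a smartphone-to-backpack control link could work, here is a minimal sketch in Python using the bleak BLE library. This is not the authors’ code: the device address, characteristic UUID, and command bytes are hypothetical placeholders, since the article does not describe the actual protocol.

```python
# Hypothetical sketch: steering a BLE-connected camera arm and requesting a frame.
# The address, UUID, and opcodes below are placeholders, not the UW system's protocol.
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                         # placeholder device address
CONTROL_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"   # placeholder characteristic

async def pan_and_capture(angle_deg: int) -> None:
    """Send a pan command to the camera arm, then request a single frame."""
    async with BleakClient(DEVICE_ADDRESS) as client:
        # Hypothetical 2-byte command: opcode 0x01 = pan arm, second byte = angle.
        await client.write_gatt_char(CONTROL_CHAR_UUID, bytes([0x01, angle_deg]))
        await asyncio.sleep(0.5)                             # allow the arm to settle
        # Hypothetical opcode 0x02 = capture/stream one frame back over BLE.
        await client.write_gatt_char(CONTROL_CHAR_UUID, bytes([0x02]))

asyncio.run(pan_and_capture(30))
```

The sketch only illustrates the general pattern of sending small control commands over a low-power Bluetooth link; the real system streams images back over the same radio.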
The researchers also used their camera system to design the world’s smallest terrestrial, power-autonomous robot with wireless vision. This insect-sized robot uses vibrations to move and consumes about as much power as a low-power Bluetooth radio needs to operate.
The research team hopes to develop future versions that require less power and are battery-free, potentially running on solar power. Potential applications range from biology to exploring novel environments, the researchers said.
“This is the first time that we’ve had a first-person view from the back of a beetle while it’s walking around. There are so many questions you could explore, such as how does the beetle respond to different stimuli that it sees in the environment?” said co-lead author Vikram Iyer, a UW doctoral student in electrical and computer engineering. “But also, insects can traverse rocky environments, which is really challenging for robots to do at this scale. So this system can also help us out by letting us see or collect samples from hard-to-navigate spaces.”
The results were published July 15 in Science Robotics.
www.washington.edu