ID: 53977142 Video

Headline: RAW VIDEO: Insects Show Next Generation Of Tiny Robots How To Get Around

Caption:

**VIDEO AVAILABLE: CONTACT INFO@COVERMG.COM TO RECEIVE**
Scientists believe insects could hold the key to a world where futuristic mini-robots can complete important tasks.
Imagine tiny robots, each weighing no more than a few hundred grams, buzzing around a greenhouse to tackle pests and provide nutrients. Or darting through narrow passages to help with search-and-rescue missions.
These mini marvels could revolutionise numerous fields but are yet to be widely adopted. Why? Because of navigational issues. Making tiny robots navigate autonomously is a significant challenge, as traditional navigation systems designed for larger robots rely on hefty, power-hungry sensors such as LiDAR or build detailed 3D maps. Both are impractical for small robots, while external systems such as GPS aren't reliable in places where a signal is hard to find.
That’s why scientists from Delft University of Technology are now looking to insects, whose navigation methods offer a blueprint for overcoming these obstacles. Insects navigate using a combination of “odometry” (tracking their own motion) and “view memory” (relying on low-resolution snapshots of their surroundings). Picture Hansel and Gretel’s breadcrumbs, but with a high-tech twist. Instead of breadcrumbs, these robots use visual snapshots to find their way back home.
Tom van Dijk, first author of the Delft study, says: “As with a stone, for a snapshot to work, the robot has to be close enough to the snapshot location. If the visual surroundings get too different from those at the snapshot location, the robot may move in the wrong direction and never get back. Hence, one has to use enough snapshots – or in the case of Hansel, drop a sufficient number of stones. On the other hand, dropping stones too close to each other would deplete Hansel’s stones too quickly. In the case of a robot, using too many snapshots leads to large memory consumption. Previous works in this field typically had the snapshots very close together, so that the robot could first visually home to one snapshot and then to the next.”
His co-author, Guido de Croon, adds: “The main insight underlying our strategy is that you can space snapshots much further apart, if the robot travels between snapshots based on odometry. Homing will work as long as the robot ends up close enough to the snapshot location, i.e., as long as the robot’s odometry drift falls within the snapshot’s catchment area. This also allows the robot to travel much further, as the robot flies much slower when homing to a snapshot than when flying from one snapshot to the next based on odometry.”
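To make the strategy described in these quotes concrete, here is a minimal, hypothetical 2-D sketch in Python: on the way out the robot keeps only sparsely spaced snapshot waypoints, and on the way back it uses drifting odometry to get near each one, switching to visual homing once it is inside that snapshot's catchment area. The function names, noise levels and the homing step are illustrative assumptions, not the Delft team's implementation.

```python
# Illustrative sketch only - assumed parameters, not the published method.
import math
import random

CATCHMENT_RADIUS = 3.0   # how close odometry must get before homing can "lock on" (assumed)
SNAPSHOT_SPACING = 10.0  # metres between stored snapshots on the outbound leg (assumed)

def record_outbound_route(waypoints, spacing=SNAPSHOT_SPACING):
    """Keep only sparsely spaced snapshot locations from the outbound path."""
    snapshots = [waypoints[0]]
    for p in waypoints[1:]:
        if math.dist(p, snapshots[-1]) >= spacing:
            snapshots.append(p)
    return snapshots

def fly_home(snapshots, start, odometry_drift=0.4):
    """Return home by visiting snapshots in reverse: odometry leg first, then visual homing."""
    position = start
    for target in reversed(snapshots):
        # 1) Odometry leg: head for where the robot *thinks* the snapshot is; drift accumulates.
        position = (target[0] + random.gauss(0, odometry_drift),
                    target[1] + random.gauss(0, odometry_drift))
        # 2) Visual homing: only works if the robot ended up inside the catchment area.
        if math.dist(position, target) <= CATCHMENT_RADIUS:
            position = target  # homing pulls the robot onto the snapshot location
        else:
            raise RuntimeError("Odometry drift exceeded the catchment area - route lost")
    return position

outbound = [(x, 0.2 * x) for x in range(0, 101)]   # a roughly 100 m outbound path
route = record_outbound_route(outbound)            # sparse snapshot memory
print(f"Stored {len(route)} snapshots for the route")
print("Back at:", fly_home(route, start=outbound[-1]))
```

The design point the researchers make is visible here: the wider the spacing between snapshots, the less memory is needed, but only as long as odometry drift stays within each snapshot's catchment area.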
The proposed insect-inspired navigation strategy allowed a 56-gram “CrazyFlie” drone, equipped with an omnidirectional camera, to cover distances of up to 100 meters with only 0.65 kilobytes of memory. All visual processing happened on a tiny computer called a microcontroller, which can be found in many cheap electronic devices.
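A quick back-of-envelope check puts that memory figure in perspective (assuming 1 kB = 1,000 bytes; the per-metre split is an illustration, not a number from the study):

```python
# Rough arithmetic on the quoted figures: 0.65 kB of route memory for a 100 m flight.
route_length_m = 100
memory_bytes = 0.65 * 1000
print(f"~{memory_bytes / route_length_m:.1f} bytes of route memory per metre flown")
# Around 6.5 bytes per metre - far less than storing full images or a 3-D map of the route.
```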
De Croon believes that the method his team employed could be more effective for small robots as it does not waste memory on creating elaborate navigational systems when simple ones will do.
“The functionality of the proposed strategy is more limited than that provided by state-of-the-art navigation methods. It does not generate a map and only allows the robot to come back to the starting point,” he explains. “Still, for many applications this may be more than enough. For instance, for stock tracking in warehouses or crop monitoring in greenhouses, drones could fly out, gather data and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server. But they would not need them for navigation itself.”

Keywords: robots,tech,insects,photo,feature,video

PersonInImage: