
Dancing bees inspire alternative communication system for robots



We’ve heard about robots that communicate with one another via wireless networks in order to collaborate on tasks. Sometimes, however, such networks aren’t an option. A new bee-inspired technique gets the bots to “dance” instead.

Since honeybees have no spoken language, they often convey information to one another by wiggling their bodies.

Known as a “waggle dance,” this pattern of movements can be used by one forager bee to tell other bees where a food source is located. The direction of the movements corresponds to the food’s direction relative to the hive and the sun, whereas the duration of the dance indicates the food’s distance from the hive.
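
In rough terms, the dance encodes a polar coordinate: an angle relative to the sun and a duration proportional to distance. The following minimal Python sketch illustrates that encoding; the scale constant and function names are illustrative, not measured values from bees or the study.

METRES_PER_DANCE_SECOND = 100.0  # illustrative scale factor, not a measured bee constant

def encode_waggle(food_bearing_deg, sun_bearing_deg, food_distance_m):
    """Encode a food location as a dance: angle relative to the sun, plus duration."""
    dance_angle_deg = (food_bearing_deg - sun_bearing_deg) % 360   # direction component
    dance_duration_s = food_distance_m / METRES_PER_DANCE_SECOND   # distance component
    return dance_angle_deg, dance_duration_s

def decode_waggle(dance_angle_deg, dance_duration_s, sun_bearing_deg):
    """Invert the encoding: recover the food's bearing and distance."""
    food_bearing_deg = (dance_angle_deg + sun_bearing_deg) % 360
    food_distance_m = dance_duration_s * METRES_PER_DANCE_SECOND
    return food_bearing_deg, food_distance_m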

Inspired by this behaviour, an international team of researchers set out to see if a similar system could be used by robots and humans in locations such as disaster sites, where wireless networks aren’t available.

In the proof-of-concept system the scientists created, a person starts by making arm gestures to a camera-equipped TurtleBot “messenger robot.” Using skeletal-tracking algorithms, the bot interprets these coded gestures, which convey the location of a package within the room. The wheeled messenger bot then drives over to a “package handling robot” and traces a pattern on the floor in front of it.
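
The article doesn’t spell out the gesture vocabulary, but a skeletal-tracking pipeline of this kind typically reduces to reading tracked joint positions and classifying the arm’s pose. A minimal sketch under assumed joint names and frame format (not details from the paper):

import math

def arm_bearing_deg(shoulder, wrist):
    """Direction an arm points, in degrees, from two tracked joints given as (x, y)."""
    dx, dy = wrist[0] - shoulder[0], wrist[1] - shoulder[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def decode_gesture(frames, seconds_held):
    """Turn a held pointing pose into a (direction, duration) message.
    frames: [{"shoulder": (x, y), "wrist": (x, y)}, ...] from the skeleton tracker."""
    bearings = [arm_bearing_deg(f["shoulder"], f["wrist"]) for f in frames]
    # Circular mean, so bearings near the 0/360 wrap-around average correctly.
    sx = sum(math.sin(math.radians(b)) for b in bearings)
    sy = sum(math.cos(math.radians(b)) for b in bearings)
    direction = math.degrees(math.atan2(sx, sy)) % 360
    return direction, seconds_held  # how long the pose is held stands in for distance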

Watching with its own depth-sensing camera, the package handling robot ascertains the direction of the package from the orientation of the pattern, and the distance it will have to travel from how long the pattern takes to trace. It then travels in the indicated direction for the indicated amount of time, and uses its object-recognition system to spot the package once it reaches the destination.
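
In other words, the observing robot recovers a heading from the geometry of the trace and a travel time from how long the trace took. A minimal sketch of that decoding step, where the robot speed and the sampled point format are assumptions rather than figures from the paper:

import math

ROBOT_SPEED_M_PER_S = 0.2  # assumed cruise speed of the package handling robot

def decode_dance(trace_points, trace_duration_s):
    """Recover a travel command from an observed floor trace.
    trace_points: [(x, y), ...] messenger-bot positions sampled by the depth camera."""
    (x0, y0), (x1, y1) = trace_points[0], trace_points[-1]
    heading_deg = math.degrees(math.atan2(y1 - y0, x1 - x0))  # orientation of the pattern
    travel_time_s = trace_duration_s           # dance duration maps directly to travel time
    est_distance_m = ROBOT_SPEED_M_PER_S * travel_time_s
    return heading_deg, travel_time_s, est_distance_m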

In tests performed so far, both robots have accurately interpreted (and acted upon) the gestures and waggle dances approximately 93 percent of the time.

The research was led by Prof. Abhra Roy Chowdhury of the Indian Institute of Science and PhD student Kaustubh Joshi of the University of Maryland. It is described in a paper recently published in the journal Frontiers in Robotics and AI.

Source: Frontiers