Vision system automates analysis of bee activity for insight into biologically inspired robot design

Georgia Institute of Technology: 09 December 2003
The animal movement analysis system is part of the BioTracking Project, an effort by Georgia Institute of Technology robotics researchers led by Tucker Balch, an assistant professor of computing. "We believe the language of behavior is common between robots and animals," Balch said. "That means, potentially, that we could videotape ants for a long period of time, learn their 'program' and run it on a robot."
Social insects such as ants and bees demonstrate how successful large-scale, robust behavior can emerge from the interactions of many simple individuals, Balch explained. Such behavior can offer ideas on how to organize a cooperating colony of robots capable of complex operations.

To expedite the understanding of such behavior, Balch's team developed a computer vision system that automates the analysis of animal movement, once an arduous and time-consuming task. Researchers are using the system to analyze the sequential movements that encode information (in bees, for example, the location of distant food sources), Balch said. He will present the research at the Second International Workshop on the Mathematics and Algorithms of Social Insects, Dec. 16-17 at Georgia Tech.

With an 81.5 percent accuracy rate, the system can automatically analyze bee movements and label them based on examples provided by human experts. This level of labeling accuracy is high enough to allow researchers to build a subsequent system to accurately determine the behavior of a bee from its sequence of motions, Balch explained.
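As a rough illustration of this example-based labeling step, the Python sketch below classifies a short trajectory segment by comparing simple motion features against hand-labeled examples. The feature choice and the nearest-neighbor classifier are assumptions made for illustration; the article does not describe the actual method used by Balch's team.

    # Hypothetical sketch: label a short trajectory segment by nearest-neighbor
    # comparison against hand-labeled examples. Features and classifier are
    # illustrative assumptions, not the BioTracking Project's actual method.
    import numpy as np

    def segment_features(xy):
        """Reduce an (n, 2) array of per-frame positions to a feature vector:
        mean speed, mean signed turning rate, and turning variability."""
        v = np.diff(xy, axis=0)                 # per-frame displacement
        speed = np.linalg.norm(v, axis=1)
        heading = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))
        turn = np.diff(heading)                 # signed turning per frame
        return np.array([speed.mean(), turn.mean(), turn.std()])

    def label_segment(xy, examples):
        """examples: list of (feature_vector, label) pairs from hand-labeled
        data. Returns the label of the nearest example (1-nearest-neighbor)."""
        f = segment_features(xy)
        dists = [np.linalg.norm(f - ef) for ef, lab in examples]
        return examples[int(np.argmin(dists))][1]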

For example, one sequence of motions bees commonly perform is the waggle dance: arcing to the right, waggling (walking in a generally straight line while oscillating left and right), arcing to the left, waggling, and so on. These motions encode the locations of distant food sources, according to Cornell University Professor of Biology Thomas Seeley, who has collaborated with Balch on this project. Balch is also working with Professor Deborah Gordon of Stanford University on related work with ants.
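To picture how a dance emerges from a stream of labeled motions, the toy snippet below counts waggle runs flanked by arcs on alternating sides. The label names and the pattern test are assumptions based only on the description above.

    # Toy illustration: count waggle-dance cycles in a motion-label sequence.
    # A cycle here is a waggle run flanked by arcs on alternating sides, per
    # the description above; the label names are assumptions.
    def count_dance_cycles(labels):
        cycles = 0
        for i in range(1, len(labels) - 1):
            if (labels[i] == "waggle"
                    and labels[i - 1] in ("arc_left", "arc_right")
                    and labels[i + 1] in ("arc_left", "arc_right")
                    and labels[i - 1] != labels[i + 1]):
                cycles += 1
        return cycles

    print(count_dance_cycles(
        ["arc_right", "waggle", "arc_left", "waggle", "arc_right"]))  # -> 2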

Balch's animal movement analysis system has several components. First, researchers shoot 15 minutes of videotape of bees, some of which have been marked with bright-colored paint and returned to an observation hive. Computer vision-based tracking software then converts the video of the marked bees into x- and y-coordinate location information for each animal in each frame of the footage. Some segments of this data are hand-labeled by a researcher and then used as motion examples for the automated analysis system.
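To make that data flow concrete, the sketch below shows one way the tracker's per-frame coordinates could be sliced into short windows and handed to a labeler trained on the hand-labeled examples. The window and step lengths, and the idea of passing the labeler in as a function, are assumptions for illustration.

    # Hypothetical glue for the pipeline described above: slice one bee's
    # per-frame (x, y) track into overlapping windows and label each window.
    # Window/step sizes are assumptions; classify is any function mapping a
    # window of positions to a motion label (e.g., the sketch shown earlier).
    def label_track(xy, classify, window=30, step=15):
        """xy: sequence of (x, y) positions, one per video frame.
        Yields (start_frame, label) for each overlapping window."""
        for start in range(0, len(xy) - window + 1, step):
            yield start, classify(xy[start:start + window])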

In future work, Balch and his colleagues will build a system that can learn executable models of these behaviors and then run the models in simulation. These simulations, Balch explained, would reveal the accuracy of the models. Researchers don't yet know if these models will yield better computer programming algorithms, though they are hopeful based on what previous research has revealed.
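One simple form such an executable model could take, purely as an illustration (the article does not say what the team's models look like), is a Markov chain over behavior labels: estimate transition probabilities from labeled sequences, then run the chain forward to generate simulated behavior that can be compared against real data.

    # Purely illustrative: fit a Markov chain over behavior labels and run it
    # forward as a simple "executable model" of behavior.
    import random
    from collections import Counter, defaultdict

    def fit_transitions(sequences):
        """Estimate next-label probabilities from labeled behavior sequences."""
        counts = defaultdict(Counter)
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                counts[a][b] += 1
        return {a: [(b, n / sum(c.values())) for b, n in c.items()]
                for a, c in counts.items()}

    def simulate(model, start, steps, seed=0):
        """Generate a simulated behavior sequence from the fitted chain."""
        rng = random.Random(seed)
        state, out = start, [start]
        for _ in range(steps - 1):
            labels, weights = zip(*model[state])
            state = rng.choices(labels, weights=weights)[0]
            out.append(state)
        return out

    dance = ["arc_right", "waggle", "arc_left", "waggle"] * 5
    model = fit_transitions([dance])
    print(simulate(model, "arc_right", 8))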