Wednesday 15 April 2015

Bee brain simulation used to pilot a drone

The team of researchers working on the Green Brain Project has advanced to the point of being able to use its digital mimic of a honeybee brain to pilot a flying robot drone, at least partially. The aim of the project is to reproduce a bee's brain completely in digital form, so that at some point a flying robot could function like the real thing with onlookers none the wiser.




Despite its small size and limited thinking abilities, the brain of the bee is still an incredibly complicated piece of biology. It allows the bee to see its environment, respond to it, fly around in it and go about its bee activities, such as mating, pollinating and stinging those that interrupt its mission. The people running the Green Brain Project chose the bee because of its remarkable ability to do so much with so little. Thus far, the BBC reports, the team has managed to put together a rudimentary olfactory (smell) and vision system and has begun to test what it has created with robots, drawing on fields such as decision theory and computational neuroscience and on technologies such as parallel computing and, of course, robotics.
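
To give a flavour of the kind of low-level vision model such work builds on, here is a minimal Python sketch of a Hassenstein-Reichardt elementary motion detector, a classic model of insect motion vision. It is an illustration only, not code from the project.

```python
# A minimal sketch (not the project's code) of a Hassenstein-Reichardt
# elementary motion detector, a classic model of insect motion vision.
import numpy as np

def emd_response(left, right, tau=0.9):
    """Correlate a delayed copy of one photoreceptor signal with its
    neighbour; the sign of the output indicates motion direction.
    `left` and `right` are 1-D arrays of photoreceptor samples over time."""
    delayed_left = np.zeros_like(left)
    delayed_right = np.zeros_like(right)
    out = np.zeros_like(left)
    for t in range(1, len(left)):
        # first-order low-pass filters act as the delay lines
        delayed_left[t] = tau * delayed_left[t - 1] + (1 - tau) * left[t]
        delayed_right[t] = tau * delayed_right[t - 1] + (1 - tau) * right[t]
        # opponent correlation: positive for left-to-right motion
        out[t] = delayed_left[t] * right[t] - delayed_right[t] * left[t]
    return out

# Example: a brightness edge sweeping from the left sensor to the right one
t = np.arange(100)
left = (t > 30).astype(float)
right = (t > 35).astype(float)
print(emd_response(left, right).sum() > 0)  # True: net rightward motion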






In their latest effort, the team inserted their digital bee brain into the control system of an ordinary quadcopter drone, allowing it to fly some missions without assistance. In one case the drone flew down a corridor without running into anything; in another it recognized a checkerboard pattern on a corridor wall and used it as a navigation aid. The bee brain is not running the rotors directly just yet, but other teams around the world are busily attempting to build a drone that both looks and acts like a real bee. Once that happens, the digital brain could be set inside, and the drone/bee lookalike could be put through trials until it one day makes its way through cropland pollinating like a natural bee, perhaps taking over for the real thing as the biological kind continues to suffer from colony collapse.
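
A plausible way to reproduce the corridor behaviour, borrowed from classic honeybee experiments on optic-flow balancing, is to steer away from whichever wall appears to slide past faster. The sketch below only illustrates that idea, it is not the Green Brain controller, and it assumes the flow estimates come from the vision model.

```python
# Illustrative sketch of optic-flow "centering": steer away from the wall
# that appears to move faster. Not the Green Brain Project's controller;
# the flow-estimation step is assumed to come from the vision model.
def centering_command(flow_left, flow_right, gain=0.5):
    """Return a yaw-rate command from the average optic-flow magnitudes
    seen in the left and right halves of the visual field."""
    # By our sign convention, a positive command turns toward the side with
    # weaker flow, i.e. away from the nearer wall.
    return gain * (flow_left - flow_right)

# Example: the left wall is closer, so left flow is larger -> turn right
print(centering_command(flow_left=2.0, flow_right=1.2))  # prints 0.4
```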



About the Project


The development of an ‘artificial brain’ is one of the greatest challenges in artificial intelligence, and its success will bring benefits to many diverse fields, from robotics to cognitive psychology. Most research effort is spent on modelling vertebrate brains. Yet smaller brains can display comparable cognitive sophistication while being more experimentally accessible and amenable to modelling.


The ‘Green Brain Project’ combines computational neuroscience modelling, learning and decision theory, modern parallel computing methods, and robotics with data from state-of-the-art neurobiological experiments on cognition in the honeybee Apis mellifera. These various methodologies are used to build and deploy a modular model of the honeybee brain describing detection, classification, and learning in the olfactory and optic pathways as well as multi-sensory integration across these sensory modalities.
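
As a rough structural sketch of what such a modular model might look like in code (with names of our own choosing, not the project's), separate olfactory and optic pathways can feed a shared integration stage:

```python
# A structural sketch, under our own naming assumptions, of the modular
# organisation described above: separate olfactory and optic pathways
# feeding a multi-sensory integration stage. Not the actual Green Brain code.
import numpy as np

class SensoryPathway:
    """Detect and classify stimuli within a single modality."""
    def __init__(self, n_inputs, n_classes, rng):
        self.weights = rng.normal(scale=0.1, size=(n_classes, n_inputs))

    def classify(self, stimulus):
        return self.weights @ stimulus  # class scores for this modality

class MushroomBody:
    """Combine class scores from both modalities (multi-sensory integration)."""
    def integrate(self, olfactory_scores, visual_scores):
        return np.concatenate([olfactory_scores, visual_scores])

rng = np.random.default_rng(0)
olfaction = SensoryPathway(n_inputs=50, n_classes=4, rng=rng)   # e.g. odour channels
vision = SensoryPathway(n_inputs=200, n_classes=4, rng=rng)     # e.g. image features
mb = MushroomBody()
combined = mb.integrate(olfaction.classify(rng.random(50)),
                        vision.classify(rng.random(200)))
print(combined.shape)  # (8,)
```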


Project Goal

Simply put, the goal of the Green Brain Project is to create a robot that thinks, senses, and acts like a honeybee! This is done by building a neuromimetic model and using it to control a flying robot.
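
In practice that means closing a sense-think-act loop between the brain model and the robot. The snippet below is a bare-bones sketch with hypothetical interfaces (`read_sensors`, `step` and `apply_command` are placeholders, not a real API):

```python
# A minimal sense-think-act loop, sketched with hypothetical interfaces.
def run_robot(model, robot, dt=0.02, steps=500):
    """Close the loop between a neuromimetic brain model and a flying robot."""
    for _ in range(steps):
        observation = robot.read_sensors()      # camera frame, odour reading, ...
        command = model.step(observation, dt)   # the brain model produces motor output
        robot.apply_command(command)            # thrust / attitude setpoints
```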


Better Understanding of Cognitive Functions in Animals


It has been well established that the honeybee Apis mellifera has surprisingly advanced cognitive behaviours despite the relative simplicity of its brain when compared to vertebrates. These cognitively sophisticated behaviours are achieved despite the very limited size of the honeybee brain (on the order of 10^6 neurons). In comparison, even rats or mice have brains on the order of 10^8 neurons. The greatly reduced scale and the experimental accessibility of the honeybee brain make thorough neurobiological understanding and subsequent biomimetic exploitation much more practical than with even the simplest vertebrate brain.
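
A quick back-of-envelope calculation shows why that two-orders-of-magnitude gap matters for simulation; the ~100 bytes of state per neuron used here is an assumption for illustration, not a figure from the project.

```python
# Back-of-envelope only: rough state-memory estimate for simulating brains of
# different sizes, assuming ~100 bytes of state per neuron (an assumption).
bytes_per_neuron = 100
for name, neurons in [("honeybee", 10**6), ("mouse/rat", 10**8)]:
    print(f"{name}: ~{neurons * bytes_per_neuron / 1e6:.0f} MB of neuron state")
# honeybee: ~100 MB; mouse/rat: ~10000 MB (a 100x gap before synapses are counted)
```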






Modelling of the honeybee brain will focus on three brain regions:


> The system for olfactory sensing
> The system for visual sensing
> The mushroom bodies for multi-modal sensory integration


These systems were chosen because they support complex cognitive behaviours that are essential to autonomous agents yet are not currently well understood.
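
To illustrate the sort of multi-modal associative learning the mushroom bodies are thought to support, here is a toy reward-gated Hebbian update over combined odour and colour features. It is a deliberately simplified stand-in, not the project's learning rule.

```python
# A hedged sketch of associative learning on combined sensory input, loosely
# in the spirit of mushroom-body models (a simple reward-gated Hebbian rule).
import numpy as np

def learn_association(weights, olfactory, visual, reward, lr=0.05):
    """Strengthen the readout weights for whichever combined sensory pattern
    was active when a reward (e.g. sugar water) arrived."""
    combined = np.concatenate([olfactory, visual])  # multi-modal input
    return weights + lr * reward * combined         # reward-gated update

w = np.zeros(6)
odour = np.array([1.0, 0.0, 0.0])    # hypothetical odour features
colour = np.array([0.0, 1.0, 0.0])   # hypothetical visual features
for _ in range(10):                  # ten rewarded pairings
    w = learn_association(w, odour, colour, reward=1.0)
print(w.round(2))  # weights grow only on the rewarded odour and colour channels
```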


Adaptive and Robust Algorithms for Autonomous, Unmanned Aircraft

Unmanned Aerial Vehicles (UAVs) are increasingly being used to perform missions considered “dull, dirty, and dangerous”. Examples include:


> Nuclear Power Plant Missions
> Search and Rescue
> Weather Forecasting
> Pipeline Monitoring
> Agriculture Operations
> Wildfire Surveillance
> Border/Road Patrols
> Exploratory Operations


As this trend develops, it becomes necessary to allocate and control UAVs effectively. One of the biggest challenges in the field is developing algorithms that not only perform well but are also adaptive and robust; this is essential for UAVs that must operate in a real world that is ever-changing and rarely certain. By extending existing honeybee models and techniques, we plan to demonstrate sophisticated vision-based navigation and cognitive functions on a flying robotic platform.
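
As a generic illustration of what “adaptive” can mean in this setting (and emphatically not the navigation algorithm used on the Green Brain platform), here is a tiny proportional controller that tunes its own gain from the tracking error:

```python
# Illustrative only: a small adaptive proportional controller whose gain is
# adjusted online from the observed tracking error (a generic adaptive-control
# idea, not the Green Brain navigation algorithm).
def adaptive_step(gain, error, prev_error, adapt_rate=0.01,
                  gain_min=0.1, gain_max=5.0):
    """Raise the gain while the error is shrinking too slowly,
    back it off when the response starts to overshoot."""
    if abs(error) > 0.9 * abs(prev_error):      # barely improving: push harder
        gain += adapt_rate
    elif error * prev_error < 0:                # sign flip: overshoot, soften
        gain -= adapt_rate
    gain = max(gain_min, min(gain_max, gain))   # keep the gain in a safe band
    command = gain * error
    return gain, command

gain, command = adaptive_step(gain=1.0, error=0.5, prev_error=0.6)
print(gain, command)  # 1.0 0.5 (error still shrinking well, gain unchanged)
```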






– Credit and Resource –


The Green Brain Project

BBC


