Everything is relative: How flies see the world

The neural code in the fly brain continuously adapts to environmental conditions

Our visual system is extremely good at recognizing objects under the most diverse conditions. For example, we can detect people on the side of the road in bright sunlight as well as on cloudy days. We can also see the direction they are moving in, whether they are walking in front of a white wall or a crowded bus stop. However, what eye and brain accomplish with apparent ease is a great challenge for automated, computer-based image processing systems. Max Planck neurobiologists have now discovered how the fly brain tackles this problem: the neurons constantly change their sensitivity depending on the contrast of the current environment. A comparison between neighboring cells then yields an optimum that allows visual information to be transmitted as well as possible.

A fly brain consists of about 100,000 nerve cells, of which approximately 25,000 are involved in the perception of motion. Compared to brains of vertebrates, the number is rather small. Nevertheless, many parallels have been found between visual systems of such distinct animals as flies and, for example, mice. The great advantage of a fly brain is, however, that neurobiologists are able to decipher the system cell by cell.

Yet, what do individual neurons in the fly brain react to? To investigate this, researchers from Alexander Borst's department at the Max Planck Institute of Neurobiology built a panoramic cinema for fruit flies. While "films" are shown there, the neurobiologists record the activity of neurons in the fly brain. Thanks to such investigations, the circuitry underlying motion perception in flies is among the best understood at the cellular level to date.

Still, computer models of fly motion vision have so far failed to reliably predict responses when photorealistic images of natural environments are used instead of artificial stripe patterns. Natural environments comprise a multitude of distinct objects that can vary considerably in brightness and contrast. This natural complexity presents great challenges for computer models.

To better understand how the fly brain enables the animal to cope with complex natural environments, Michael Drews and Aljoscha Leonhardt used a wide range of modern neurobiological methods, from electrophysiology, imaging, and behavioral studies to model analysis with artificial intelligence.

Information processing is teamwork

As an important part of their investigation, the researchers presented the flies with rotating movies of landscapes at varying contrasts. Thanks to an innate behavior, flies react to the optic flow of such images by turning along with them, matching the speed and direction of the scene. "By observing the rotation of the flies, we can see how well they can resolve the movement and the speed of the surrounding image," explains Drews. "Using this information, we were able to investigate the reactions of individual neurons to different contrast ratios."

The investigation demonstrated that the fly brain has a built-in feedback loop for contrast comparison, implemented right at the beginning of light stimulus processing. When a neuron perceives a contrast, it first compares this value with the values reported by its neighboring cells. If the ambient contrast is low compared to the contrast at the center, the nerve cell responds strongly; if the ambient contrast is higher, the cell's response is weaker.

The fly visual system thus encodes contrast only in relation to the ambient contrast. "Through this mechanism, the visual system constantly adapts its contrast sensitivity to the given environmental contrast," explains Leonhardt. "This results in robust information transmission that works equally well under almost all conditions."
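
The mechanism described here resembles what modelers call divisive normalization: a cell's contrast signal is divided by the pooled contrast of its surroundings. As a rough, purely illustrative sketch (not the authors' published model), the Python snippet below assumes a Gaussian-weighted surround pool and a small semi-saturation constant; the function name normalized_contrast and both parameters are invented for the example.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def normalized_contrast(local_contrast, surround_sigma=5.0, semi_saturation=0.1):
        # Pool the contrast reported by neighboring cells (Gaussian-weighted surround).
        ambient = gaussian_filter(local_contrast, sigma=surround_sigma)
        # Divide the center signal by the pooled surround: the same center contrast
        # gives a strong response when the surroundings are quiet and a weak
        # response when the surroundings are high in contrast.
        return local_contrast / (semi_saturation + ambient)

    # The same local contrast of 0.5 is reported differently depending on its context.
    quiet = normalized_contrast(np.array([0.1, 0.1, 0.5, 0.1, 0.1]))
    busy = normalized_contrast(np.array([0.9, 0.9, 0.5, 0.9, 0.9]))
    print(quiet[2], busy[2])  # the center response is larger in the quiet surround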

Artificial intelligence learns to see

To test the function of the newly discovered circuit, the researchers recreated the visual system in a computer model – once with and once without the feedback loop. Of these artificial neural networks, those that learned to "see" with the extended circuit performed considerably better than those trained with the simple circuit. Crucially, the extended circuit also coped well with images of natural environments.
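
A comparison along these lines can be sketched by plugging the hypothetical normalization stage from the earlier snippet into a textbook correlation-type motion detector (a Hassenstein-Reichardt-style correlator, used here only as a stand-in for the trained networks in the study) and running the same image sequence through both variants; the function names motion_energy and detector_output are likewise invented for the example.

    def motion_energy(frames, delay=1, offset=1):
        # Toy correlation-type detector: correlate each point's delayed signal with
        # its spatial neighbor and subtract the mirror-symmetric arm, so the sign of
        # the output indicates the direction of motion.
        rightward = frames[:-delay, :-offset] * frames[delay:, offset:]
        leftward = frames[delay:, :-offset] * frames[:-delay, offset:]
        return float((rightward - leftward).mean())

    def detector_output(frames, with_normalization):
        # frames: array of shape (time, space) holding local contrast signals;
        # reuses np and the hypothetical normalized_contrast() from the sketch above.
        if with_normalization:
            frames = np.stack([normalized_contrast(row) for row in frames])
        return motion_energy(frames)

Running both variants on the same drifting scene at several contrast levels would then show whether the normalization stage keeps the estimated motion signal stable, which is the kind of with-and-without comparison described above.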

The scientists have thus found a very simple yet effective algorithm that allows the fly to compute motion under varying contrast conditions. Similar circuits are suspected to play a role in the mouse brain as well. The fly can therefore help us to better understand the brains of other animals, or perhaps make artificial, computer-based vision systems even more efficient.
