Circuits - Computation - Models
In our department, we are interested in how the brain computes. In particular, we want to understand neural information processing at the level of individual neurons and small neural circuits. As an example of neural computation, we study visual course control in the fruit fly Drosophila. This system is tractable for several reasons: the computations involved are of modest complexity; they are implemented in circuits containing a rather limited number of neurons (typically fewer than 100); and each of these neurons can be genetically targeted, allowing its activity to be manipulated and recorded.
For a comprehensive picture of information processing in the visual system, we combine precise anatomical reconstructions of the neuronal elements with detailed characterizations of their physiological response properties and their functional role in behavior. Computational modeling allows us to test our findings within a theoretical framework and to make predictions for future experiments. Finally, we use our knowledge to engineer artificial flying vehicles equipped with camera systems that implement fly-inspired motion detection algorithms.
Background and Techniques
Directional motion information is fundamental for survival and, thus, an essential component of visual computation. We are interested in the neural circuits that extract the direction of image motion locally, at each point in visual space.
How is the network of neurons organized across different layers in the visual system to establish the computations required for motion processing? We address this question using serial block-face scanning electron microscopy (SBEM) to construct a functional circuit diagram, or connectome, that provides detailed three-dimensional (3D) knowledge of the neuronal networks involved.
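Conceptually, such a connectome can be treated as a directed, weighted graph in which nodes are neurons (or cell types) and edge weights are synapse counts. The sketch below illustrates this representation with a toy graph; the cell-type names follow fly visual-system nomenclature, but the synapse counts are made-up placeholders, not measured values.

```python
# Toy connectome as a directed, weighted graph.
# Keys are (presynaptic, postsynaptic) cell-type pairs; values are
# synapse counts. The counts below are illustrative placeholders only.
connectome = {
    ("L1", "Mi1"): 25,
    ("L1", "Tm3"): 18,
    ("Mi1", "T4"): 30,
    ("Tm3", "T4"): 12,
}

def upstream_partners(cell):
    """All cells with direct synapses onto `cell`, sorted by synapse count."""
    partners = [(pre, n) for (pre, post), n in connectome.items() if post == cell]
    return sorted(partners, key=lambda p: -p[1])

print(upstream_partners("T4"))  # [('Mi1', 30), ('Tm3', 12)]
```

Queries of this kind (who talks to whom, and how strongly) are what turn a raw anatomical reconstruction into a functional circuit diagram.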
We exploit Drosophila neurogenetic approaches in order to visualize and functionally manipulate specific visual circuit elements in a non-invasive fashion. In combination with suitable readouts such as behavior, whole-cell patch-clamp recordings, or calcium imaging, this approach enables us to study in great detail the functional role and connectivity of individual neuron types in the computation of interest.
We perform electrophysiological recordings from large, motion-sensitive neurons in the lobula plate of the fly while presenting visual stimuli to the eyes. By combining this technique with genetic silencing or optogenetic stimulation of upstream neural elements, we investigate the functional role of specific cell types in motion detection.
In the fly visual system, calcium imaging with fluorescent, genetically encoded calcium indicators is complicated by artifactual stimulation of the photoreceptors by the excitation light. We therefore use a two-photon microscope to confine the excitation light to a precise location along the Z-axis of the objective. With this technique, we study the visual responses of small columnar neurons that are inaccessible to electrophysiological recordings.
The neural activity in the fly visual system represents the outside world and is eventually used to initiate an appropriate behavioral response. To precisely investigate how visual stimulation is mapped onto behavior, we place the flies in a virtual environment and measure their walking dynamics or track their wing movements.
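In a closed-loop virtual environment, the animal's own measured rotation updates the visual panorama it sees. The sketch below shows this coupling in its simplest form; `read_turning_rate` and the gain value are assumptions standing in for a real tracking setup (e.g. an air-supported ball or wing-beat analyzer), not our actual apparatus.

```python
# Minimal closed-loop sketch: the fly's measured turning rotates the
# virtual panorama, so the animal controls its visual surroundings.

def read_turning_rate():
    """Placeholder for a real sensor reading (deg/s)."""
    return 5.0

def run_closed_loop(steps=100, dt=0.01, gain=-1.0):
    """Update the panorama orientation each frame from the fly's rotation.
    A negative gain mimics free flight: turning right shifts the
    panorama to the left."""
    panorama_deg = 0.0
    for _ in range(steps):
        panorama_deg += gain * read_turning_rate() * dt
        panorama_deg %= 360.0
    return panorama_deg

final = run_closed_loop()  # 100 steps of -0.05 deg -> 355.0 deg
```

Comparing behavior under open-loop (fixed stimulus) and closed-loop conditions reveals how visual feedback shapes the motor response.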
Extracting information about motion represents one of the most fundamental computations carried out by the visual system. The canonical Hassenstein-Reichardt detector offers a remarkably powerful algorithmic solution. We study how biophysical circuits operating at the level of neurons and synapses implement such high-level models composed of purely mathematical operations.
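The core operations of the Hassenstein-Reichardt detector are simple enough to sketch numerically: two neighboring input signals are each multiplied with a delayed (low-pass-filtered) copy of the other, and the two mirror-symmetric products are subtracted, so that the sign of the mean output indicates the direction of motion. The implementation below is a minimal illustration of that algorithm; the filter time constant and stimulus parameters are arbitrary choices, not values from our experiments.

```python
import numpy as np

def lowpass(signal, tau, dt=1.0):
    """First-order low-pass filter, serving as the detector's delay line."""
    out = np.zeros_like(signal, dtype=float)
    alpha = dt / (tau + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def reichardt_detector(left, right, tau=10.0):
    """Hassenstein-Reichardt correlator: delay-and-multiply in two
    mirror-symmetric arms, then subtract. A positive mean output
    indicates motion from `left` toward `right`."""
    return lowpass(left, tau) * right - left * lowpass(right, tau)

# A drifting sinusoidal grating sampled at two neighboring photoreceptors.
# For rightward motion, the signal at the right input lags the left one.
t = np.arange(500)
left = np.sin(0.1 * t)
right = np.sin(0.1 * t - 0.3)

rightward = reichardt_detector(left, right).mean()  # positive
leftward = reichardt_detector(right, left).mean()   # negative
```

Because each arm is only a filter and a multiplication, the model maps naturally onto candidate neurons and synapses, which is exactly the biophysical implementation question we study.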
We apply our knowledge of the fly's visual course control system to the design of autonomously flying robots, which have many practical applications. Conversely, such 'miniature airborne vehicles' (MAVs) also allow us to test many aspects of the fly's neural circuits under more natural conditions.