Decoding Drosophila photo-taxis and odor-taxis using natural and optogenetic stimuli



Marc H. Gershow

Department of Physics, New York University


We use the Drosophila larva as a model to understand how brains make decisions on the basis of temporally varying multi-sensory input. The larva's navigational behavior is ideal for study because the stimulus inputs are controllable and the motor outputs quantifiable. Whether responding to light, heat, or gaseous chemical cues, the larva uses a biased random walk strategy, modulating the frequency, size, and direction of the reorienting turns that interrupt its forward runs.
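
To make the strategy concrete, the toy Python simulation below sketches a biased random walk in a linear stimulus gradient. All names and values here (base_rate, gain, grad, the turn-size distribution) are illustrative assumptions, not the model or parameters from the work described in this talk.

import numpy as np

rng = np.random.default_rng(0)

def simulate(n_steps=2000, dt=0.1, speed=1.0, base_rate=0.3, gain=5.0, grad=0.05):
    """Biased random walk: forward runs interrupted by turns whose rate depends on
    the sensed change of a signal (a linear gradient of strength grad along +x)."""
    x = y = 0.0
    heading = rng.uniform(0.0, 2.0 * np.pi)
    for _ in range(n_steps):
        # rate of change of the sensed signal along the current heading
        dsignal = speed * grad * np.cos(heading)
        # turns become rarer when the signal is improving, more frequent when it worsens
        turn_rate = base_rate * np.exp(-gain * dsignal)
        if rng.random() < min(turn_rate * dt, 1.0):
            heading += rng.normal(0.0, np.pi / 2)   # reorienting turn
        x += speed * np.cos(heading) * dt
        y += speed * np.sin(heading) * dt
    return x, y

print(simulate())  # final position drifts toward higher signal (positive x)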

To understand the exact transformations from sensory input to motor output that underlie navigation, we identified computations made by Drosophila larvae responding to visual and optogenetically induced fictive olfactory stimuli. We modeled the larva's decision to initiate a turn as the output of a Linear-Nonlinear-Poisson (LNP) cascade. We used reverse correlation to fit the model's parameters, and the parameterized model predicted the larvae's responses to novel stimulus patterns. We then characterized the computations the larva uses to bias turn size and direction.
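
As a rough illustration of this approach, the sketch below generates turns from an LNP cascade and then recovers the linear filter by reverse correlation (the turn-triggered average of the stimulus). The white-noise stimulus, exponential nonlinearity, filter shape, and rates are all illustrative choices, not the stimuli or fitted parameters from the experiments.

import numpy as np

rng = np.random.default_rng(1)
dt, T, L = 0.05, 60_000, 40              # time step (s), number of samples, filter length
stim = rng.standard_normal(T)            # randomly varying stimulus (e.g., light intensity)

# Linear stage: convolve the stimulus with a temporal filter
t = np.arange(L) * dt
true_filter = np.exp(-t / 0.5) * np.sin(2 * np.pi * t)
true_filter /= np.linalg.norm(true_filter)
drive = np.convolve(stim, true_filter, mode="full")[:T]

# Nonlinear stage: a static nonlinearity maps the filtered stimulus to a turn rate (Hz)
rate = 0.5 * np.exp(drive)

# Poisson stage: in each bin a turn is initiated with probability rate * dt
turns = rng.random(T) < np.minimum(rate * dt, 1.0)

# Reverse correlation: for white-noise input, the turn-triggered average of the
# stimulus recovers the linear filter up to a scale factor
idx = np.flatnonzero(turns)
idx = idx[idx >= L]
tta = np.mean([stim[i - L + 1:i + 1][::-1] for i in idx], axis=0)
print(turns.sum(), "turns; filter recovery r =",
      round(float(np.corrcoef(tta, true_filter)[0, 1]), 2))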

How does the larva integrate information arriving from distinct sensory organs? We provided larvae with superposed independent randomly varying light and fictive odor signals and again used reverse correlation to fit model parameters. We found that larvae linearly combine olfactory and visual signals upstream of the decision to turn. We verified this prediction by measuring larvae's responses to coordinated changes in odor and light. We studied other navigational decisions and found that larvae integrated odor and light according to the same rule in all cases. These results suggest that photo-taxis and odor-taxis are mediated by a shared computational pathway. The methods we developed are broadly applicable to understanding how perturbations in activity in any set of genetically labeled neurons are interpreted behaviorally.
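
A minimal sketch of the linear-summation picture follows: two independent white-noise channels standing in for light and fictive odor are each filtered, summed upstream of a single shared nonlinearity, and drive Poisson turn events. The filter shapes, signs, and rates are invented for illustration and are not the fitted filters from the study.

import numpy as np

rng = np.random.default_rng(2)
dt, T, L = 0.05, 60_000, 40
light = rng.standard_normal(T)           # independent white-noise light signal
odor = rng.standard_normal(T)            # independent fictive-odor (optogenetic) signal

t = np.arange(L) * dt
k_light = t * np.exp(-t / 0.4)           # illustrative filters with opposite signs
k_odor = -t * np.exp(-t / 0.4)
k_light /= np.linalg.norm(k_light)
k_odor /= np.linalg.norm(k_odor)

# Linear combination of the two filtered channels happens before the nonlinearity
drive = (np.convolve(light, k_light, mode="full")[:T]
         + np.convolve(odor, k_odor, mode="full")[:T])
rate = 0.5 * np.exp(drive)               # one shared static nonlinearity
turns = rng.random(T) < np.minimum(rate * dt, 1.0)

# Because the channels are independent, reverse correlation against each one
# recovers that channel's filter from the same turn train
idx = np.flatnonzero(turns)
idx = idx[idx >= L]
def tta(signal):
    return np.mean([signal[i - L + 1:i + 1][::-1] for i in idx], axis=0)
print("light filter r =", round(float(np.corrcoef(tta(light), k_light)[0, 1]), 2),
      "| odor filter r =", round(float(np.corrcoef(tta(odor), k_odor)[0, 1]), 2))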

