Optimal foraging strategies using ambiguous cues (ongoing)

How do animals use information that may be an important, yet unreliable, cue for something they are searching for? To answer this question I am studying the behavioral responses of freely walking and flying fruit flies, Drosophila melanogaster, to a range of odors with different levels of specificity to fermenting fruit (a fruit fly's primary food source and social gathering space). Thanks to the genetic tools available in the fly, I am able to silence and activate individual neurons in the brain with high temporal and spatial specificity, making it possible to probe which parts of the brain are responsible for these behavioral choices.

Behavior is a complex subject: even with over 16,000 fly-hours of detailed data collected from more than 1,000 individual animals, some questions remain unresolved. Stay tuned for results soon!

Diving Flies of Mono Lake (ongoing)

Alkali fly (Ephydra hians) underwater at Mono Lake, CA.

In late summer, the shores of Mono Lake, California, are bustling with small alkali flies, Ephydra hians, which dive underwater inside small air bubbles to feed. Despite Mark Twain's charismatic description of them in his book Roughing It, we still do not understand how they are able to perform this entertaining and miraculous feat.

“You can hold them under water as long as you please–they do not mind it–they are only proud of it. When you let them go, they pop up to the surface as dry as a patent office report, and walk off as unconcernedly as if they had been educated especially with a view to affording instructive entertainment to man in that particular way.”

Using a combination of high speed videography, force measurements, scanning electron microscopy, and manipulations of water chemistry I am working towards understanding what makes these flies so unique. See this recent press article for a more detailed description: Fly makes air ‘submarine’ to survive deadly lake (Science, 2016).


Multi-modal sensory integration in Mosquitoes (2015)

Mosquitoes find human hosts through the iteration of several independent behavioral modules, which are linked together through environmental interactions.

To find human hosts, mosquitoes must integrate sensory cues that are separated in space and time. To solve this challenge my collaborators Michael Dickinson, Jeff Riffell, and Adrienne Fairhall and I showed that mosquitoes respond to exhaled CO2 by exploring visual features they otherwise ignore. This guides them to potential hosts, where they use cues such as heat and humidity to locate a landing site.

Coauthored with: Jeff Riffell, Adrienne Fairhall, Michael Dickinson. Read more about our work in Current Biology.

Animation: The above animations show a collection of 200-500 mosquito trajectories, each aligned to the moment when they last passed through a CO2 plume. Only trajectories that approach either the room-temperature (blue) or the 37°C (orange) object are shown. Note that the mosquito trajectories were recorded at different times and superimposed for presentation purposes. Only 20 mosquitoes were released into the wind tunnel (1.5 × 0.3 × 0.3 m) at a time, and rarely were there more than a few flying simultaneously; in-flight interactions were rare. Note how the mosquitoes spend more time near the warm object.

Nonlinear Distance Estimation (2014)

Geometric relationship of the moving camera and a reference object.

Vision is arguably the most widely used sense for position and velocity estimation in animals, and it is increasingly used in robotic systems as well. Many animals use stereopsis and object recognition in order to make a true estimate of distance. For a tiny insect such as a fruit fly or honeybee, however, these methods fall short. Instead, an insect must rely on calculations of optic flow, which can provide a measure of the ratio of velocity to distance, but not either parameter independently. Nevertheless, flies and other insects are adept at landing on a variety of substrates, a behavior that inherently requires some form of distance estimation in order to trigger distance-appropriate motor actions such as deceleration or leg extension. Previous studies have shown that these behaviors are indeed under visual control, raising the question: how does an insect estimate distance solely using optic flow? In this paper we use a nonlinear control-theoretic approach to propose a solution for this problem. Our algorithm takes advantage of visually controlled landing trajectories that have been observed in flies and honeybees. Finally, we implement our algorithm, which we term dynamic peering, using a camera mounted to a linear stage to demonstrate its real-world feasibility. Read more in Bioinspiration & Biomimetics.
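
The central constraint above, that optic flow gives only the ratio of velocity to distance, can be sketched in a few lines. This is an illustrative toy, not the paper's estimator; the numbers are made up, and the constant-flow landing rule is just the well-known consequence that holding v/d fixed produces an exponential approach.

```python
import math

def optic_flow(velocity, distance):
    """Translational optic flow of a feature directly ahead: v / d (units: 1/s)."""
    return velocity / distance

# The scale ambiguity: a fly approaching a wall at 0.5 m/s from 1 m away
# sees exactly the same flow as one at 5 m/s from 10 m away.
f_small = optic_flow(0.5, 1.0)
f_large = optic_flow(5.0, 10.0)
assert f_small == f_large

# Under a constant-flow strategy (hold v/d = r), distance decays
# exponentially: d(t) = d0 * exp(-r * t). Simple Euler integration:
r, d0, dt = 0.1, 1.0, 0.001
d = d0
for _ in range(int(1.0 / dt)):  # integrate for 1 s
    v = r * d                   # velocity commanded to keep v/d = r
    d -= v * dt
assert abs(d - d0 * math.exp(-r * 1.0)) < 1e-3
```

The estimator in the paper exploits exactly this structure: by actively modulating its own motion, the observer can make the otherwise unobservable distance recoverable from the flow signal.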

Movie: Real-time performance of the dynamic peering estimation algorithm. The video shows the data from figures 4 and 5 as an animation. Bottom left: camera image sequence showing the visual target and region of interest (red box). Bottom right: optic flow as a function of camera pixel, calculated from the current and previous frames using OpenCV's Lucas-Kanade algorithm. For the purposes of control, we calculated a linear fit of the data (red line) over the region of interest. Top row: dynamic peering performance (red) compared with the ground-truth values (blue) for position, velocity, and optic flow estimates, as well as the applied control effort. After an initial period where the robot accelerates to the steady-state optic flow rate of -0.1 1/s, the estimates lock on to the actual values.

Neuromodulation of Flight Speed Regulation (2014)

Confocal image of a Drosophila brain and ventral nerve cord showing the GFP-labeled octopamine neurons (green), which we genetically silenced with Kir2.1. Red labels neuropil. Image taken by my colleague Marie Suver.

Recent evidence suggests that flies’ sensitivity to large-field optic flow is increased by the release of octopamine during flight. This increase in gain presumably enhances visually mediated behaviors such as the active regulation of forward speed, a process that involves the comparison of a vision-based estimate of velocity with an internal set point. To determine where in the neural circuit this comparison is made, we selectively silenced the octopamine neurons in the fruit fly Drosophila, and examined the effect on vision-based velocity regulation in free-flying flies. We found that flies with inactivated octopamine neurons accelerated more slowly in response to visual motion than control flies, but maintained nearly the same baseline flight speed. Our results are parsimonious with a circuit architecture in which the internal control signal is injected into the visual motion pathway upstream of the interneuron network that estimates groundspeed.
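
The core observation, lower visual gain slows acceleration but leaves baseline speed intact, falls out of a simple proportional-feedback sketch. This is a hedged toy model, not the paper's circuit: the set point (0.5 m/s) is an arbitrary illustrative value, and the gains match only the model fits quoted in the figure caption below.

```python
def simulate(gain, set_point=0.5, dt=0.01, steps=2000):
    """Proportional velocity regulation: dv/dt = gain * (set_point - v).
    Returns the velocity trace (m/s) starting from rest."""
    v, trace = 0.0, []
    for _ in range(steps):
        v += gain * (set_point - v) * dt
        trace.append(v)
    return trace

control = simulate(gain=5.5)   # parental controls
silenced = simulate(gain=2.2)  # octopamine neurons inactivated

# Lower gain -> slower acceleration early in the response...
assert silenced[50] < control[50]
# ...but nearly the same final (baseline) flight speed.
assert abs(control[-1] - silenced[-1]) < 0.01
```

In a loop of this form the steady-state speed equals the set point regardless of gain, which is why reduced octopaminergic gain can slow the transient without shifting the baseline.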

Coauthored with Marie Suver, Michael Dickinson. Read more at the Journal of Experimental Biology.

Preferred visual motion set point is either modulated by changes in gain identical to those applied in the lobula plate tangential cell (LPTC) network (H2a), or the set point enters the visual sensory-motor cascade upstream of the LPTC network (H2b). (A) Block diagram showing the models under consideration. (B) Gain versus temporal frequency curve used for the low pass filter in the visual sensory system of our model. The data points are drawn from the temporal frequency tuning curve given in fig. 1D in Suver et al. (Suver et al., 2012), which summarizes the responses of electrophysiological recordings of vertical system (VS) cells in response to vertical motion. The original data were scaled such that the gain at a temporal frequency of 1 Hz is 1. The line shows a third-order polynomial fit. Note that this results in a transfer function defined in the linear temporal frequency domain, rather than the oscillatory temporal frequency domain. In order to implement this type of filter in our control model, we calculate the gain based on the linear temporal frequency of the stimulus. (C) Baseline subtracted membrane potential of VS cells in response to a downward 8 Hz visual motion stimulus during flight; data repeated, and magnified, from Suver et al. (Suver et al., 2012). The gray traces show the mean responses of each of 19 individual flies, and the bold trace shows the group mean. (D) Model predictions compared with our results from Fig. 3E. The solid blue line shows the model prediction for the parental controls (gain=5.5) with the biomechanical saturation, whereas the dotted blue line shows the prediction without saturation. The solid red line shows the model prediction for the flies with inactivated octopamine neurons (gain=2.2). Note that the models H1, H2a and H2b all give identical acceleration responses.
(E) Model predictions (color coded consistently with A) compared with mean velocity versus time responses for parental controls (left) and flies with inactivated octopamine neurons (right). The data traces are repeated from Fig. 3A. Note that H2 is a better fit.

Odor plume tracking in Drosophila (2014)

A long exposure of a fruit fly illuminated by a shaft of light as the fly approaches a fermenting strawberry in a wind tunnel.

Background: For a fruit fly, locating fermenting fruit where it can feed, find mates, and lay eggs is an essential and difficult task requiring the integration of olfactory and visual cues. Here, we develop an approach to correlate flies’ free-flight behavior with their olfactory experience under different wind and visual conditions, yielding new insight into plume tracking based on over 70 hr of data.

Results: To localize an odor source, flies exhibit three iterative, independent, reflex-driven behaviors, which remain constant through repeated encounters of the same stimulus: (1) 190 ± 75 ms after encountering a plume, flies increase their flight speed and turn upwind, using visual cues to determine wind direction. Due to this substantial response delay, flies pass through the plume shortly after entering it. (2) 450 ± 165 ms after losing the plume, flies initiate a series of vertical and horizontal casts, using visual cues to maintain a crosswind heading. (3) After sensing an attractive odor, flies exhibit an enhanced attraction to small visual features, which increases their probability of finding the plume's source.
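
The first two reflexes amount to a delayed, event-driven state machine: plume entry schedules an upwind surge, plume loss schedules crosswind casting. The sketch below is purely illustrative; the latencies come from the results above, but the state names and event stream are invented for the example.

```python
# Latencies from the results above (means, in milliseconds).
SURGE_DELAY_MS = 190  # plume entry -> upwind surge
CAST_DELAY_MS = 450   # plume loss -> crosswind casting

def plume_tracker(events):
    """events: list of (time_ms, 'enter' | 'exit') plume crossings.
    Returns the scheduled (time_ms, behavior) transitions."""
    transitions = []
    for t, kind in events:
        if kind == 'enter':
            transitions.append((t + SURGE_DELAY_MS, 'surge upwind'))
        elif kind == 'exit':
            transitions.append((t + CAST_DELAY_MS, 'cast crosswind'))
    return transitions

# One brief encounter: enter the plume at t = 1000 ms, lose it at 1100 ms.
log = plume_tracker([(1000, 'enter'), (1100, 'exit')])
assert log == [(1190, 'surge upwind'), (1550, 'cast crosswind')]
```

Because the surge latency (190 ms) exceeds a typical plume transit, the fly is usually already out of the plume when the surge begins, which is exactly why the delayed casting onset matters.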

Conclusions: Due to plume structure and sensory-motor delays, a fly's olfactory experience during foraging flights consists of short bursts of odor stimulation. As a consequence, delays in the onset of crosswind casting and the increased attraction to visual features are necessary behavioral components for efficiently locating an odor source. Our results provide a quantitative behavioral background for elucidating the neural basis of plume tracking using genetic and physiological approaches.

Coauthored with Michael Dickinson. Read more about my work in Current Biology.

Visual control of landing in Drosophila (2012)

A Landing Fly.

Landing behavior is one of the most critical, yet least studied, aspects of insect flight. In order to land safely, an insect must recognize a visual feature, navigate towards it, decelerate, and extend its legs in preparation for touchdown. Although previous studies have focused on the visual stimuli that trigger these different components, the complete sequence has not been systematically studied in a free-flying animal. Using a real-time 3D tracking system in conjunction with high speed digital imaging, we were able to capture the landing sequences of fruit flies (Drosophila melanogaster) from the moment they first steered toward a visual target, to the point of touchdown. This analysis was made possible by a custom-built feedback system that actively maintained the fly in the focus of the high speed camera. The results suggest that landing is composed of three distinct behavioral modules. First, a fly actively turns towards a stationary target via a directed body saccade. Next, it begins to decelerate at a point determined by both the size of the visual target and its rate of expansion on the retina. Finally, the fly extends its legs when the visual target reaches a threshold retinal size of approximately 60°. Our data also let us compare landing sequences with flight trajectories that, although initially directed toward a visual target, did not result in landing. In these 'fly-by' trajectories, flies steer toward the target but then exhibit a targeted aversive saccade when the target subtends a retinal size of approximately 33°. Collectively, the results provide insight into the organization of sensorimotor modules that underlie the landing and search behaviors of insects.
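
The retinal-size triggers above are simple geometry: an object of radius r at distance d subtends an angle of 2·atan(r/d) on the retina. The sketch below is a back-of-the-envelope illustration only; the 60° and 33° thresholds come from the results, while the post geometry is made up.

```python
import math

def retinal_size_deg(radius_m, distance_m):
    """Angular size (degrees) subtended by an object of given radius at a distance."""
    return math.degrees(2.0 * math.atan(radius_m / distance_m))

LEG_EXTENSION_DEG = 60.0    # landing approach: extend legs
AVERSIVE_SACCADE_DEG = 33.0  # fly-by: turn away

# A hypothetical 5 mm radius post seen from 20 mm away subtends ~28 deg,
# below both thresholds, so neither reflex would fire yet.
assert retinal_size_deg(0.005, 0.020) < AVERSIVE_SACCADE_DEG
# From 4 mm away the same post subtends ~103 deg, past the
# leg-extension threshold on a landing approach.
assert retinal_size_deg(0.005, 0.004) > LEG_EXTENSION_DEG
```

Because angular size grows nonlinearly as distance shrinks, a fixed angular threshold translates into a distance trigger that automatically scales with target size, a convenient property for an animal that cannot measure distance directly.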

Coauthored with Michael Dickinson. Read more at the Journal of Experimental Biology.

High speed video: A fruit fly (Drosophila melanogaster) approaches and lands on a vertical post, filmed at 5,000 frames per second. To keep the fly in focus I built a custom feed-forward focus control system for the camera, which used 3D information from a real-time, computer-vision-based tracking system.