Our brain is constantly confronted with sensory information, yet it manages to extract the relevant bits to produce appropriate behavior. Our lab is interested in the neural computations that allow brains to process sensory information and drive behavior.
We approach this problem by studying acoustic communication in insects using computational tools.
- During acoustic communication for courtship, the sender produces a temporally structured signal - a song. The receiver parses that song to infer the species identity or quality of the sender and acts on that information. We want to know how this interaction unfolds in the brain - how song is produced by the sender and processed by the receiver.
- Insects are ideal model systems for understanding basic principles of neural computation: they have small, hard-wired brains, which makes them simpler to understand. Yet they are capable of sophisticated behaviors - including acoustic communication.
- Computational tools serve a two-fold role in the lab: first, as methods for analyzing large behavioral and neuronal data sets; second, as glue that connects descriptions of the nervous system across multiple levels - genes and molecules, small neural networks, and the behavior of the whole organism.
Currently, we’re focusing on the following questions:
- How are acoustic signals efficiently filtered and encoded in the sensory periphery?
- How does genetic variability within and across populations affect the neural processing of song?
- How do the neural networks that recognize song evolve?
- How is sensory information combined over time to inform behavior?
- How is information routed to motor circuits to drive state-dependent or sex-specific behavioral responses to the same sensory stimulus?
- How do acoustic signals shape group dynamics?