Research

We study how animals produce, process, and act on acoustic signals. Our work combines behavioral experiments, neural recordings, computational modeling, and machine learning tools that make complex communication and navigation behavior measurable.

Audio AI / ML tools

Building tools for animal sound analysis.

We develop machine learning tools for detecting, segmenting, and quantifying animal sounds. Our deep-learning tool DAS (Deep Audio Segmenter) is used by labs studying flies, zebra finches, nightingales, parrots, gerbils, mice, bats, and chimpanzees, helping researchers turn large audio recordings into structured data for behavioral and neural analysis.
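The core step these tools automate is turning a continuous recording into a structured list of sound events with onset and offset times. The sketch below is purely illustrative and is not DAS's API (DAS uses trained deep networks): a minimal amplitude-threshold segmenter over a smoothed envelope, with made-up parameter values.

```python
import numpy as np

def segment_sounds(audio, rate, win_s=0.01, thresh=0.1, min_dur_s=0.02):
    """Return (onset, offset) times, in seconds, of segments where the
    smoothed amplitude envelope exceeds `thresh`."""
    win = max(1, int(win_s * rate))
    # moving RMS envelope of the signal
    env = np.sqrt(np.convolve(audio**2, np.ones(win) / win, mode="same"))
    above = env > thresh
    # indices where the boolean trace flips between silence and sound
    edges = np.flatnonzero(np.diff(above.astype(int)))
    if above[0]:
        edges = np.r_[0, edges]
    if above[-1]:
        edges = np.r_[edges, len(above) - 1]
    segments = edges.reshape(-1, 2) / rate
    # discard blips shorter than min_dur_s
    return [(on, off) for on, off in segments if off - on >= min_dur_s]

# toy example: two 100 ms tone bursts embedded in 1 s of silence
rate = 10_000
t = np.arange(rate) / rate
audio = np.zeros_like(t)
for start in (0.2, 0.6):
    idx = (t >= start) & (t < start + 0.1)
    audio[idx] = np.sin(2 * np.pi * 440 * t[idx])
segs = segment_sounds(audio, rate)  # two segments, near 0.2 s and 0.6 s
```

Real tools replace the fixed threshold with a learned model, which is what makes them robust to noise and species-specific signal structure, but the output has the same shape: a table of annotated segments ready for downstream analysis.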

Neural basis of communication

Linking acoustic signals to neural computation and behavior.

We investigate how nervous systems encode multimodal social cues, extract behaviorally relevant features, and route sensory information to circuits that shape social decisions. Using genetic manipulations, optogenetics, and optical physiology, we link stimulus structure, neural responses, and behavior. To test how these computations arise, we build connectome-constrained network models, enabling direct comparison of candidate mechanisms with anatomy, neural activity, and behavior.
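In its simplest form, a connectome-constrained network model is a rate network whose weight matrix is masked by the measured wiring diagram: a weight can be nonzero only where the connectome reports a synapse. The toy sketch below uses a random six-neuron "connectome" and random weights purely as illustrative assumptions, not real data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # toy circuit of six neurons

# hypothetical binary connectome: mask[i, j] = 1 if neuron j synapses onto neuron i
mask = (rng.random((n, n)) < 0.3).astype(float)
np.fill_diagonal(mask, 0)  # no self-connections

# free parameters exist only where the connectome reports a synapse
weights = rng.normal(scale=0.5, size=(n, n)) * mask

def step(rate, stim, dt=0.1, tau=1.0):
    """One Euler step of a rate network: tau * dr/dt = -r + tanh(W r + s)."""
    return rate + dt / tau * (-rate + np.tanh(weights @ rate + stim))

# stimulate neuron 0 and let activity propagate through the constrained circuit
rate = np.zeros(n)
stim = np.zeros(n)
stim[0] = 1.0
for _ in range(100):
    rate = step(rate, stim)
```

Because the mask pins the model to anatomy, any remaining free parameters can be fit to neural recordings or behavior, and competing mechanisms can be compared while all of them respect the same wiring diagram.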

Evolution of acoustic communication

Comparing the circuits for acoustic communication across species.

Across species, communication systems evolve through changes in signals, receivers, and the neural networks that link them. We study how the networks shaping social interactions, song structure, and song recognition differ between species, and how such variation can give rise to new acoustic communication systems.

Group behavior

From courtship interactions to collective neural data.

We study behavior in groups to understand how communication is modulated by social context. By combining group-level behavioral tracking with genetic manipulations and neural data, we ask how animals coordinate their interactions in the complex social environments that groups create.

Animal navigation

Modeling navigation circuits for Navisense and robotics.

As part of Navisense, we work on models of the insect central complex and their application to robots. This project uses compact neural circuits to understand how animals orient and navigate, and to test the underlying principles in embodied systems.
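A canonical model of the insect head-direction system in the central complex is a ring attractor: heading neurons arranged on a ring, with excitation between similarly tuned cells and inhibition between opposing ones, sustain a bump of activity that stores the animal's orientation even after sensory cues disappear. The minimal sketch below illustrates this idea; all parameter values are illustrative, not fitted to data.

```python
import numpy as np

n = 16  # heading cells tiling 360 degrees, like the fly's EPG neurons
prefs = np.linspace(0, 2 * np.pi, n, endpoint=False)

# ring connectivity: cosine profile gives local excitation
# and inhibition between oppositely tuned cells
w = 4.0 * np.cos(prefs[:, None] - prefs[None, :])

def decode(rate):
    """Population-vector readout of the stored heading, in radians."""
    return np.angle(np.sum(rate * np.exp(1j * prefs)))

rate = np.zeros(n)
cue = np.pi / 2  # transient visual cue at 90 degrees
for step in range(200):
    # cue input for the first 50 steps, then darkness
    inp = np.exp(np.cos(prefs - cue)) if step < 50 else 0.0
    rate += 0.1 * (-rate + np.tanh(w @ rate / n + inp))

heading = np.degrees(decode(rate))  # activity bump persists near the cue direction
```

Because such circuits are compact and their computation is explicit, the same model can be dropped into a robot controller, which is how embodied tests of the underlying principles become possible.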