Abstract: Understanding how the brain gives rise to behavior is a central question in neuroscience, but this endeavor is hampered by the difficulty of quantitatively measuring and modeling complex behavior. This problem is especially pronounced in studies of social behavior using animal models, which often employ freely moving and naturalistic behavioral paradigms. In this talk, I will highlight some of our recent work building computational tools, driven by deep learning and computer vision, for robust motion capture of socially interacting animals, including flies, bees, mice, marmosets, and humans. The postural dynamics captured by these technologies enable highly quantitative behavioral analyses at unprecedented resolution. Toward the end of the talk, I will outline our current efforts to use these kinds of data to model the brain, as well as how we have been applying these tools to other domains such as plant biology. <a href="https://talmolab.org/">https://talmolab.org/</a>