Upcoming ION Seminars

Abstract: “A central goal shared by neuroscience and robotics is to understand how systems can navigate and act autonomously in complex environments. Although extensive research has revealed how the visual system segments natural scenes into distinct components—insights that have inspired advances in computer vision and robotics—the next crucial challenge remains: learning the properties of these objects and responding appropriately. In this talk, I will present our work using the fruit fly Drosophila melanogaster to investigate how the brain learns about objects and other animals in its environment, and how it uses that information to guide behavior. By integrating quantitative behavioral analysis, genetic manipulation, connectomics, and neural recordings, we aim to uncover the neural mechanisms that enable flexible, adaptive interactions with the world.”

Abstract
Around 380 million years ago, as vertebrates ventured onto land, vision changed dramatically. Air is far more transparent than water, and at the water-to-land transition we also see a marked increase in eye size. Together these factors yielded an enormous extension of visual range, resulting in a million-fold expansion in the volume of space that could be visually monitored. With objects detectable much farther away, animals suddenly had more time to act.
I argue that this elongated sensory horizon shifted the advantage from fast, reflexive responses (effective underwater, when threats emerge at about a body length) to multi-step action sequences that are planned ahead. In particular, partially cluttered terrestrial settings (e.g., savanna-like mixes of open zones and cover) create many viable future paths, some of which will avoid mortal threat while others will lead to death. In such environments, selecting among imagined futures (planning) should pay off.
I will first present motivating simulation results showing that planning yields large benefits specifically in mid-clutter terrestrial regimes, whereas in very simple or very cluttered spaces habit-based action control is equally or more effective. To test these predictions behaviorally, we have built a robot–rodent interaction arena with reconfigurable obstacles that let us dial spatial complexity up or down. An autonomous robot acts as a mobile threat while mice navigate toward safety. I will share initial behavioral findings from this paradigm, including path diversification and pauses that appear to support look-ahead, as well as preliminary hippocampal recordings acquired during behavior. I will also present initial work comparing state-of-the-art reinforcement learning algorithms to animal behavior, with interesting implications for improving AI. Together, these results outline a tractable experimental program for linking expanded terrestrial sensory horizons to the emergence of planning and, potentially, to key components of mind.
This academic year will feature a series of virtual and in-person seminars with live, remote access via Zoom. ION Seminars are open to the University of Oregon community, and in-person attendance is welcome. In-person seminars will be held in Willamette 110 at 4 PM PT.
To accommodate remote speakers and time-zone differences, some seminars may be offered at noon PT or another agreed-upon time. Students taking BI 407/507 Neuroscience Seminar should contact the course instructor to access recordings as needed.
Details for upcoming seminars will be shared here on the ION website as well as through the ION mailing lists. Zoom links for remote access will be distributed only through the ION Seminar mailing list; those not on the list can request access by contacting Jenna Penny from their uoregon.edu email address.