Upcoming Events!



Abstract: Since the discovery of rapid eye movement (REM) sleep, the nature of the eye movements that characterize this sleep phase has remained elusive. Do they reveal gaze shifts in the virtual environment of dreams or simply reflect random brainstem activity? We harnessed the head direction (HD) system of the mouse thalamus, a neuronal population whose activity reports, in awake mice, their actual HD as they explore their environment and, in sleeping mice, their virtual HD. We discovered that the direction and amplitude of rapid eye movements during REM sleep reveal the direction and amplitude of the ongoing changes in virtual HD. Thus, rapid eye movements disclose gaze shifts in the virtual world of REM sleep, thereby providing a window into the cognitive processes of the sleeping brain. Such coordination between eye movements and virtual HD suggests the involvement of the deep layers of the superior colliculus (dSC), a midbrain motor command center that drives eye and head movements to shift gaze during awake navigation. Here, we show that the dSC issues motor commands during REM sleep, e.g., turn left, that are similar to those issued in the awake behaving animal. Strikingly, these motor commands, despite not being executed, shift the internal representation of HD as if the animal had turned. Thus, during REM sleep, the brain simulates actions by issuing motor commands that, while not executed, have consequences as if they had been. These studies suggest that the sleeping brain, while disengaged from the external world, uses its internal model of the world to simulate interactions with it.
Past Events

Abstract: From serving a volleyball to playing the piano - one of our brain’s most remarkable feats is the ability to learn a sheer endless number of motor skills. Despite their importance, how our brain learns and generates such skills is poorly understood. While many nodes of the brain’s distributed motor network have been identified, their functions and interactions remain often unclear. We probe this network through the lens of complex, highly stereotyped and spatiotemporally precise movement patterns trained in rats. We have found that the basal ganglia play critical, unexpected roles in both skill learning and execution, and regulate the transition from variable to stereotyped movement patterns throughout learning. Furthermore, we are exploring how the brain solves the challenge to form, store and recall the memories for our countless skills, using the same neural substrates. Together, our results shine new light on the mechanisms and circuits underlying our motor skills.

Abstract
Around 380 million years ago, as vertebrates ventured onto land, vision changed dramatically. Air is far more transparent than water, and at the water-to-land transition we also see a marked increase in eye size. Together these factors yielded an enormous extension of visual range, resulting in a million-fold expansion in the volume of space that could be visually monitored. With objects detectable much farther away, animals suddenly had more time to act.
I argue that this elongated sensory horizon shifted the advantage from fast, reflexive responses (effective underwater, where threats emerge at about a body length) to multi-step action sequences that are planned ahead. In particular, partially cluttered terrestrial settings (e.g., savanna-like mixes of open zones and cover) create many viable future paths, some of which will avoid mortal threat while others will lead to death. In such environments, selecting among imagined futures (planning) should pay off.
I will first present motivating simulation results showing that planning yields large benefits specifically in mid-clutter terrestrial regimes, whereas in very simple or very cluttered spaces habit-based action control is equally or more effective. To test these predictions behaviorally, we've built a robot–rodent interaction arena with reconfigurable obstacles that let us dial spatial complexity up or down. An autonomous robot acts as a mobile threat while mice navigate toward safety. I will share initial behavioral findings from this paradigm, including path diversification and pauses that appear to support look-ahead, as well as preliminary hippocampal recordings acquired during behavior. We also show some initial work comparing state-of-the-art reinforcement learning algorithms to animal behavior, with interesting implications for improving AI. Together, these results outline a tractable experimental program for linking expanded terrestrial sensory horizons to the emergence of planning and, potentially, to key components of mind.

Abstract: A central goal shared by neuroscience and robotics is to understand how systems can navigate and act autonomously in complex environments. Although extensive research has revealed how the visual system segments natural scenes into distinct components—insights that have inspired advances in computer vision and robotics—the next crucial challenge remains: learning the properties of these objects and responding appropriately. In this talk, I will present our work using the fruit fly Drosophila melanogaster to investigate how the brain learns about objects and other animals in its environment, and how it uses that information to guide behavior. By integrating quantitative behavioral analysis, genetic manipulation, connectomics, and neural recordings, we aim to uncover the neural mechanisms that enable flexible, adaptive interactions with the world.

The ION Retreat is scheduled for Saturday, September 27 at noon, including lunch and dinner, with guest speaker Anna Gillespie speaking after dinner. The retreat continues on Sunday, September 28 from 9am to noon with brunch.
RSVP for the retreat via the ION mailing list announcement. Contact host Matt Smear for more details.

This is a rescheduled remote seminar, replacing Eugenia's seminar from May 8. If you are not on the regular mailing lists, contact ionseminars@uoregon.edu from your uoregon email address to receive the Zoom link.



This visit, originally scheduled for 5/29, is postponed; we hope to reschedule in the 2025-26 academic year.
Abstract - The superior colliculus (SC) is an evolutionarily conserved structure that receives direct retinal input in all vertebrates. It was the most sophisticated visual center until the neocortex evolved in mammals. Even in mice and tree shrews, mammalian species that are increasingly used in vision research, the vast majority of retinal ganglion cells project to the SC, making it a prominent visual structure in these animals. In this talk, I will review our recent functional studies of the mouse SC and describe our current efforts in linking functional properties to genetically identified cell types in both mice and tree shrews.