Terry Takahashi

Professor, Department of Biology
Member & Co-Director, ION

Ph.D. State University of New York, Downstate Medical Center 
B.S. University of California, Irvine

Office: 224 Huestis
Phone: 541-346-4544

Research Interests: Coding and integration of sensory information; neuroethology

Overview: The goal of my research program is to understand how perception arises from physical stimuli. To address this broad question, my laboratory uses electrophysiological, anatomical, behavioral, and computational modeling methods to study spatial hearing in the barn owl, a nocturnal predator that can hunt by auditory cues alone. Not only is the barn owl an adept auditory predator, but its midbrain contains a retina-like map of auditory space. The neurons that compose this map have spatial receptive fields, like visual neurons, and are arrayed topographically so that each neuron represents a point in auditory space. Using this auditory space map, my lab has been analyzing the auditory system's ability to resolve and image the sounds from a particular source in the presence of echoes and other competing sounds.
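
To make the space-map idea concrete, the sketch below estimates an interaural time difference (ITD), the owl's principal cue for azimuth, by finding the lag that best aligns the two ear signals. It is only an illustration of the kind of binaural computation the map is built on, not code from the laboratory; the function name, the 48 kHz sample rate, and the toy noise stimulus are all invented for this example.

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference between two ear signals by
    cross-correlation: find the lag that best aligns them. A positive value
    means the right-ear signal lags (the sound reached the left ear first)."""
    corr = np.correlate(right, left, mode="full")
    lags = np.arange(-(len(left) - 1), len(right))  # lag of right relative to left, in samples
    return lags[np.argmax(corr)] / fs

# Toy check: broadband noise arriving at the right ear about 100 microseconds late.
fs = 48000                                 # sample rate in Hz (illustrative choice)
rng = np.random.default_rng(0)
noise = rng.standard_normal(fs // 10)      # 100 ms of noise
d = round(100e-6 * fs)                     # ~5 samples of delay at 48 kHz
left = noise
right = np.concatenate([np.zeros(d), noise[:-d]])
print(f"estimated ITD: {estimate_itd(left, right, fs) * 1e6:.0f} microseconds")
```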

RECENT PUBLICATIONS

A Neural Model of Auditory Space Compatible with Human Perception under Simulated Echoic Conditions.

PLoS One. 2015;10(9):e0137900

Authors: Nelson BS, Donovan JM, Takahashi TT

Abstract
In a typical auditory scene, sounds from different sources and reflective surfaces summate in the ears, causing spatial cues to fluctuate. Prevailing hypotheses of how spatial locations may be encoded and represented across auditory neurons generally disregard these fluctuations and must therefore invoke additional mechanisms for detecting and representing them. Here, we consider a different hypothesis in which spatial perception corresponds to an intermediate or sub-maximal firing probability across spatially selective neurons within each hemisphere. The precedence or Haas effect presents an ideal opportunity for examining this hypothesis, since the temporal superposition of an acoustical reflection with sounds arriving directly from a source can cause otherwise stable cues to fluctuate. Our findings suggest that subjects' experiences may simply reflect the spatial cues that momentarily arise under various acoustical conditions and how these cues are represented. We further suggest that auditory objects may acquire "edges" under conditions when interaural time differences are broadly distributed.

PMID: 26355676 [PubMed - indexed for MEDLINE]

Spike timing precision changes with spike rate adaptation in the owl's auditory space map.

J Neurophysiol. 2015 Oct;114(4):2204-19

Authors: Keller CH, Takahashi TT

Abstract
Spike rate adaptation (SRA) is a continuing change of responsiveness to ongoing stimuli, which is ubiquitous across species and levels of sensory systems. Under SRA, auditory responses to constant stimuli change over time, relaxing toward a long-term rate often over multiple timescales. With more variable stimuli, SRA causes the dependence of spike rate on sound pressure level to shift toward the mean level of recent stimulus history. A model based on subtractive adaptation (Benda J, Hennig RM. J Comput Neurosci 24: 113-136, 2008) shows that changes in spike rate and level dependence are mechanistically linked. Space-specific neurons in the barn owl's midbrain, when recorded under ketamine-diazepam anesthesia, showed these classical characteristics of SRA, while at the same time exhibiting changes in spike timing precision. Abrupt level increases of sinusoidally amplitude-modulated (SAM) noise initially led to spiking at higher rates with lower temporal precision. Spike rate and precision relaxed toward their long-term values with a time course similar to SRA, results that were also replicated by the subtractive model. Stimuli whose amplitude modulations (AMs) were not synchronous across carrier frequency evoked spikes in response to stimulus envelopes of a particular shape, characterized by the spectrotemporal receptive field (STRF). Again, abrupt stimulus level changes initially disrupted the temporal precision of spiking, which then relaxed along with SRA. We suggest that shifts in latency associated with stimulus level changes may differ between carrier frequency bands and underlie decreased spike precision. Thus SRA is manifest not simply as a change in spike rate but also as a change in the temporal precision of spiking.

PMID: 26269555 [PubMed - indexed for MEDLINE]
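
For readers unfamiliar with the subtractive-adaptation framework cited in this abstract (Benda and Hennig, 2008), the sketch below shows the basic idea in a few lines: the firing rate is a threshold-linear function of the stimulus level minus an adaptation variable that tracks recent output with some time constant. The function name and all parameter values are arbitrary illustrative choices, not the model or the values fitted in the paper.

```python
import numpy as np

def subtractive_sra(level_db, dt=0.001, tau=0.2, gain=2.0, alpha=0.9, threshold=20.0):
    """Minimal sketch of subtractive spike-rate adaptation: rate is a
    threshold-linear function of stimulus level minus an adaptation variable
    that tracks recent output with time constant tau. Illustrative parameters."""
    a = 0.0                                         # adaptation state, in level-equivalent units
    rates = np.zeros(len(level_db))
    for i, level in enumerate(level_db):
        r = gain * max(level - a - threshold, 0.0)  # instantaneous rate (spikes/s)
        rates[i] = r
        a += dt * (alpha * r - a) / tau             # adaptation relaxes toward a fraction of the output
    return rates

# A level step from 40 to 60 dB: the rate jumps at the step and then relaxes
# toward a lower steady-state value as the adaptation variable catches up,
# the classic SRA time course described in the abstract.
t = np.arange(0.0, 2.0, 0.001)
level = np.where(t < 1.0, 40.0, 60.0)
rates = subtractive_sra(level)
print(f"just after the step: {rates[1000]:.0f} sp/s; 0.8 s later: {rates[1800]:.0f} sp/s")
```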

The contributions of onset and offset echo delays to auditory spatial perception in human listeners.

J Acoust Soc Am. 2012 Dec;132(6):3912-24

Authors: Donovan JM, Nelson BS, Takahashi TT

Abstract
In echoic environments, direct sounds dominate perception even when followed by their reflections. As the delay between the direct (lead) source and the reflection (lag) increases, the reflection starts to become localizable. Although this phenomenon, which is part of the precedence effect, is typically studied with brief transients, leading and lagging sounds often overlap in time and are thus composed of three distinct segments: the "superposed" segment, when both sounds are present together, and the "lead-alone" and "lag-alone" segments, when leading and lagging sounds are present alone, respectively. Recently, it was shown that the barn owl (Tyto alba) localizes the lagging sound when the lag-alone segment, not the lead-alone segment, is lengthened. This was unexpected given the prevailing hypothesis that a leading sound may briefly desensitize the auditory system to sounds arriving later. The present study confirms this finding in humans under conditions that minimized the role of the superposed segment in the localization of either source. Just as lengthening the lag-alone segment caused the lagging sound to become more salient, lengthening the lead-alone segment caused the leading sound to become more salient. These results suggest that the neural representations of the lead and lag are independent of one another.

PMID: 23231121 [PubMed - indexed for MEDLINE]
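
The three stimulus segments described in this abstract can be made concrete with a short sketch that builds a leading noise burst plus a delayed copy (the simulated reflection) and reports the extents of the lead-alone, superposed, and lag-alone segments. The function name, the 25 ms durations, and the 3 ms delay below are illustrative defaults, not the stimuli used in the study.

```python
import numpy as np

def lead_lag_stimulus(fs=48000, lead_dur=0.025, lag_dur=0.025, delay=0.003, seed=0):
    """Construct a leading noise burst plus a delayed copy (the simulated
    reflection) and return the summed waveform together with the sample ranges
    of the lead-alone, superposed, and lag-alone segments."""
    rng = np.random.default_rng(seed)
    n_lead, n_lag = round(lead_dur * fs), round(lag_dur * fs)
    n_delay = round(delay * fs)
    total = max(n_lead, n_delay + n_lag)

    src = rng.standard_normal(max(n_lead, n_lag))   # common carrier for lead and lag
    lead = np.zeros(total)
    lead[:n_lead] = src[:n_lead]
    lag = np.zeros(total)
    lag[n_delay:n_delay + n_lag] = src[:n_lag]      # the reflection: a delayed copy

    mix = lead + lag                                # the waveform that reaches the ear
    segments = {
        "lead_alone": (0, n_delay),                            # only the direct sound present
        "superposed": (n_delay, min(n_lead, n_delay + n_lag)), # direct sound and reflection overlap
        "lag_alone": (min(n_lead, n_delay + n_lag), total),    # only the reflection present
    }
    return mix, segments

mix, segments = lead_lag_stimulus()
for name, (start, stop) in segments.items():
    print(f"{name}: {(stop - start) / 48000 * 1000:.1f} ms")
# Lengthening lag_dur extends only the lag-alone segment; lengthening the delay
# shortens the superposed segment and lengthens both "alone" segments.
```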

The role of envelope shape in the localization of multiple sound sources and echoes in the barn owl.

J Neurophysiol. 2013 Feb;109(4):924-31

Authors: Baxter CS, Nelson BS, Takahashi TT

Abstract
Echoes and sounds of independent origin often obscure sounds of interest, but echoes can go undetected under natural listening conditions, a perception called the precedence effect. How does the auditory system distinguish between echoes and independent sources? To investigate, we presented two broadband noises to barn owls (Tyto alba) while varying the similarity of the sounds' envelopes. The carriers of the noises were identical except for a 2- or 3-ms delay. Their onsets and offsets were also synchronized. In owls, sound localization is guided by neural activity on a topographic map of auditory space. When there are two sources concomitantly emitting sounds with overlapping amplitude spectra, space map neurons discharge when the stimulus in their receptive field is louder than the one outside it and when the averaged amplitudes of both sounds are rising. A model incorporating these features calculated the strengths of the two sources' representations on the map (B. S. Nelson and T. T. Takahashi; Neuron 67: 643-655, 2010). The target localized by the owls could be predicted from the model's output. The model also explained why the echo is not localized at short delays: when envelopes are similar, peaks in the leading sound mask corresponding peaks in the echo, weakening the echo's space map representation. When the envelopes are dissimilar, there are few or no corresponding peaks, and the owl localizes whichever source is predicted by the model to be less masked. Thus the precedence effect in the owl is a by-product of a mechanism for representing multiple sound sources on its map.

PMID: 23175801 [PubMed - indexed for MEDLINE]
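
The sketch below paraphrases the two discharge conditions summarized in this abstract: a source is credited at moments when its envelope exceeds the competitor's and the averaged envelope is rising. It operates on precomputed amplitude envelopes, and the function name and toy modulated stimuli are invented for illustration; it is a schematic simplification, not the published model.

```python
import numpy as np

def representation_strength(env_a, env_b):
    """Credit each source at time points where (1) its envelope is louder than
    the competing source's and (2) the average of the two envelopes is rising.
    Returns the number of credited samples for each source (illustrative only)."""
    mean_env = 0.5 * (env_a + env_b)
    rising = np.diff(mean_env, prepend=mean_env[0]) > 0   # where the averaged amplitude increases
    strength_a = np.sum((env_a > env_b) & rising)
    strength_b = np.sum((env_b > env_a) & rising)
    return strength_a, strength_b

# Toy example: two amplitude-modulated envelopes with different modulation phases.
fs = 48000
t = np.arange(0, 0.1, 1 / fs)
env_a = 1.0 + np.sin(2 * np.pi * 50 * t)              # source A, 50 Hz modulation
env_b = 1.0 + np.sin(2 * np.pi * 50 * t + np.pi / 2)  # source B, phase-shifted modulation
a, b = representation_strength(env_a, env_b)
print(f"samples credited to A: {a}, to B: {b}")   # the more-credited source would be localized
```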

How the owl tracks its prey--II.

J Exp Biol. 2010 Oct 15;213(Pt 20):3399-408

Authors: Takahashi TT

Abstract
Barn owls can capture prey in pitch darkness or by diving into snow, while homing in on the sounds made by their prey. First, the neural mechanisms by which the barn owl localizes a single sound source in an otherwise quiet environment will be explained. The ideas developed for the single source case will then be expanded to environments in which there are multiple sound sources and echoes--environments that are challenging for humans with impaired hearing. Recent controversies regarding the mechanisms of sound localization will be discussed. Finally, the case in which both visual and auditory information are available to the owl will be considered.

PMID: 20889819 [PubMed - indexed for MEDLINE]

Spatial hearing in echoic environments: the role of the envelope in owls.

Neuron. 2010 Aug 26;67(4):643-55

Authors: Nelson BS, Takahashi TT

Abstract
In the precedence effect, sounds emanating directly from the source are localized preferentially over their reflections. Although most studies have focused on the delay between the onset of a sound and its echo, humans still experience the precedence effect when this onset delay is removed. We tested in barn owls the hypothesis that an ongoing delay, equivalent to the onset delay, is discernible from the envelope features of amplitude-modulated stimuli and may be sufficient to evoke this effect. With sound pairs having only envelope cues, owls localized direct sounds preferentially, and neurons in their auditory space-maps discharged more vigorously to them, but only if the sounds were amplitude modulated. Under conditions that yielded the precedence effect, acoustical features known to evoke neuronal discharges were more abundant in the envelopes of the direct sounds than of the echoes, suggesting that specialized neural mechanisms for echo suppression were unnecessary.

PMID: 20797540 [PubMed - indexed for MEDLINE]

Independence of echo-threshold and echo-delay in the barn owl.

PLoS One. 2008;3(10):e3598

Authors: Nelson BS, Takahashi TT

Abstract
Despite their prevalence in nature, echoes are not perceived as events separate from the sounds arriving directly from an active source, until the echo's delay is long. We measured the head-saccades of barn owls and the responses of neurons in their auditory space-maps while presenting a long duration noise-burst and a simulated echo. Under this paradigm, there were two possible stimulus segments that could potentially signal the location of the echo. One was at the onset of the echo; the other, after the offset of the direct (leading) sound, when only the echo was present. By lengthening the echo's duration, independently of its delay, spikes and saccades were evoked by the source of the echo even at delays that normally evoked saccades to only the direct source. An echo's location thus appears to be signaled by the neural response evoked after the offset of the direct sound.

PMID: 18974886 [PubMed - indexed for MEDLINE]

Object localization in cluttered acoustical environments.

Biol Cybern. 2008 Jun;98(6):579-86

Authors: Takahashi TT, Keller CH, Nelson BS, Spitzer MW, Bala AD, Whitchurch EA

Abstract
In nature, sounds from objects of interest arrive at the ears accompanied by sound waves from other actively emitting objects and by reflections off of nearby surfaces. Despite the fact that all of these waveforms sum at the eardrums, humans with normal hearing effortlessly segregate one sound source from another. Our laboratory is investigating the neural basis of this perceptual feat, often called the "cocktail party effect", using the barn owl as an animal model. The barn owl, renowned for its ability to localize sounds and its spatiotopic representation of auditory space, is an established model for spatial hearing. Here, we briefly review the neural basis of sound-localization of a single sound source in an anechoic environment and then generalize the ideas developed therein to cases in which there are multiple, concomitant sound sources and acoustical reflection.

PMID: 18491167 [PubMed - indexed for MEDLINE]