Auditory processing in a complex spatial environment
When there is a single, stationary object in the auditory environment, it activates space-selective neurons in the brain, which in turn direct orienting movements towards the object. However, the natural auditory environment is typically complex. Auditory objects can move through space, multiple objects can occur simultaneously, and auditory information must be integrated appropriately with the corresponding information from other sensory modalities to create multimodal representations of objects.
This complexity poses challenges that the brain must overcome in order to localize sounds correctly. For instance, when a sound moves through space, neural activity must predict the sound's future location in order to compensate for the sensorimotor delays involved in auditory orienting behavior. In the second chapter of this thesis, I describe a neural circuit that can perform this computation. Additionally, when multiple sounds occur simultaneously in the environment, the animal must decide whether to group them perceptually into a single object and, if they are grouped, where to localize them. In the third chapter of this thesis, I describe the rules by which multiple simultaneous sounds are localized, as well as the neural processing underlying this behavior. Finally, when an animal is faced with conflicting localization information from the auditory and visual modalities, it must employ learning rules that can appropriately resolve the conflict and reinstate crossmodal alignment. In the last chapter of this thesis, I describe a computational learning rule that can implement appropriate crossmodal learning.
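To make the delay-compensation problem concrete, consider a minimal sketch of the computation it requires: if orienting movements lag the stimulus by a fixed delay, aiming at the sound's current location will always miss a moving target, so the target location must be extrapolated forward in time. The function below is purely illustrative, assuming linear motion, a single fixed delay, and azimuth measured in degrees; it is not a description of the neural circuit discussed in Chapter 2.

```python
def predicted_azimuth(position_deg, velocity_deg_per_s, delay_s):
    """Linearly extrapolate a moving sound's azimuth to compensate for a
    fixed sensorimotor delay.

    Illustrative assumptions: constant velocity over the delay interval,
    and a single lumped delay between stimulus and movement. The function
    name and units are hypothetical, chosen only for this example.
    """
    return position_deg + velocity_deg_per_s * delay_s

# A sound at 10 deg azimuth, moving at 50 deg/s, with a 100 ms delay:
# an orienting movement aimed at the extrapolated azimuth (15 deg)
# arrives where the sound will be, not where it was.
print(predicted_azimuth(10.0, 50.0, 0.100))
```

Even this toy version shows why prediction matters: with a stationary sound (velocity zero) the correction vanishes, while for a rapidly moving sound the error from aiming at the sensed location grows linearly with the delay.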