  • Funda Yildirim
  • Didem Katircilar

Summary:
Information arriving from our different senses, such as sight, hearing, or touch, is combined by various mechanisms in the human brain to form sensory perception; this process is called "multisensory integration". Many of our everyday experiences arise from the combination of information from more than one sensory modality. For example, when we watch a person speaking and listen to what they say, we perceive them by combining information from vision and hearing. This combined information also guides how we sense and perceive individual features: the temporal, spatial, or semantic coincidence of different signals plays an important role in how we perceive them. Moreover, the effective strength of information from one sensory modality can be increased by combining it with information from other modalities. Beyond improving perception itself, this can also improve how we localize and respond to stimuli. Accurately identifying the location of a sound source is a crucial skill for guiding people's actions and decisions, and it is one of the first steps toward responding appropriately to a perceived sound.

This study will examine the effects of auditory, visual, and audio-visual training types on sound localization performance. In the literature, studies on how information from different senses affects sound localization performance are considerably scarcer than sound source localization studies that focus on variables such as azimuth and the interaural time difference. For this reason, this study is expected to bring a new perspective to the literature on this topic.
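The interaural time difference mentioned above can be illustrated with a minimal sketch using Woodworth's spherical-head approximation, which relates a source's azimuth to the time delay between the two ears. This is a generic textbook model, not the method of the study; the head radius and speed of sound below are illustrative assumed constants.

```python
import math

# Assumed illustrative constants (not values from the study):
HEAD_RADIUS_M = 0.0875   # typical average head radius in meters
SPEED_OF_SOUND = 343.0   # speed of sound in air at ~20 °C, in m/s

def itd_woodworth(azimuth_deg: float) -> float:
    """Approximate interaural time difference (ITD) in seconds.

    Woodworth's spherical-head formula:
        ITD = (r / c) * (theta + sin(theta))
    where theta is the azimuth in radians, valid for frontal
    azimuths between -90 and +90 degrees.
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source straight ahead (0 degrees) produces no ITD; a source at
# 90 degrees to one side produces the maximum delay, roughly 0.65 ms
# for these assumed constants.
print(itd_woodworth(0))
print(itd_woodworth(90))
```

Under this model the ITD grows monotonically with azimuth, which is one reason azimuth and interaural timing are so often the variables of interest in sound localization studies.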

START: 2021
FINISH: 2021
PROJECT NO: 121K715
INSTITUTION: TUBITAK