
Cooperation of sight and sound

As a child you were probably taught how to tell how far away lightning is. When there is a flash, you count with a particular rhythm until you hear the thunder, and that count gives the number of miles between you and the lightning. Parents are not going to stop teaching this, because it gives a nervous child something to do in a thunderstorm and convinces them that they are usually a safe distance from danger. But it only works for distant events.
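As a rough sketch of the arithmetic behind the counting rule (my own numbers, not part of the original advice), sound travels at about 343 m per second in air, so roughly five seconds of counting corresponds to a mile:

```python
SPEED_OF_SOUND_M_S = 343.0   # approximate speed of sound in air at ~20 °C
METERS_PER_MILE = 1609.34

def lightning_distance_miles(delay_seconds: float) -> float:
    """Estimate distance to a lightning strike from the flash-to-thunder delay."""
    return delay_seconds * SPEED_OF_SOUND_M_S / METERS_PER_MILE

# About five seconds of delay per mile of distance:
print(round(lightning_distance_miles(5.0), 2))   # ~1.07 miles
print(round(lightning_distance_miles(15.0), 2))  # ~3.2 miles
```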

Events that are close by are synchronized by the brain: we consciously collapse the visual and auditory cues, in both time and space, into a single event. We are not aware of any difference in timing or of any slight difference in where the event is placed. A particular region of the brain does this aligning - "the superior colliculus, a midbrain region that functions imperatively for integrating auditory and visual signals for attending to and localizing audiovisual stimuli". But if the difference between vision and hearing is too large, the collapse into a single event does not happen.

However, we know that, even though it is not consciously experienced, information about small differences in sound arrival can be used: blind humans can echo-locate by making continuous little clicking noises. Could the discrepancy between sound and sight be used in other ways? A recent paper (Jaekl P, Seidlitz J, Harris LR, Tadin D (2015) Audiovisual Delay as a Novel Cue to Visual Distance. PLoS ONE 10(10): e0141125. doi:10.1371/journal.pone.0141125) studies the effect of sound delays on the perception of distance. It is like the lightning calculation, but done unconsciously.

Here is the abstract:

For audiovisual sensory events, sound arrives with a delay relative to light that increases with event distance. It is unknown, however, whether humans can use these ubiquitous sound delays as an information source for distance computation. Here, we tested the hypothesis that audiovisual delays can both bias and improve human perceptual distance discrimination, such that visual stimuli paired with auditory delays are perceived as more distant and are thereby an ordinal distance cue. In two experiments, participants judged the relative distance of two repetitively displayed three-dimensional dot clusters, both presented with sounds of varying delays. In the first experiment, dot clusters presented with a sound delay were judged to be more distant than dot clusters paired with equivalent sound leads. In the second experiment, we confirmed that the presence of a sound delay was sufficient to cause stimuli to appear as more distant. Additionally, we found that ecologically congruent pairing of more distant events with a sound delay resulted in an increase in the precision of distance judgments. A control experiment determined that the sound delay duration influencing these distance judgments was not detectable, thereby eliminating decision-level influence. In sum, we present evidence that audiovisual delays can be an ordinal cue to visual distance.
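To put the abstract's "sound arrives with a delay relative to light that increases with event distance" in concrete terms, here is a small sketch (my own illustrative numbers, not the paper's actual stimuli) of how large the physical delay is for nearby events. Light travel time is negligible at these distances, so the delay is essentially the sound travel time, about 3 ms per metre:

```python
SPEED_OF_SOUND_M_S = 343.0        # approximate, air at ~20 °C
SPEED_OF_LIGHT_M_S = 299_792_458.0

def audiovisual_delay_ms(distance_m: float) -> float:
    """Physical sound-after-light delay for an event at the given distance."""
    return (distance_m / SPEED_OF_SOUND_M_S - distance_m / SPEED_OF_LIGHT_M_S) * 1000.0

for d in (1, 5, 20, 100):
    print(f"{d:>4} m -> {audiovisual_delay_ms(d):6.1f} ms")
# 1 m ~ 2.9 ms, 5 m ~ 14.6 ms, 20 m ~ 58.3 ms, 100 m ~ 291.5 ms
```

Delays of this size are too small to be noticed consciously, which is consistent with the paper's control experiment showing the delays were not detectable.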

Seeing clearly

Why do we not notice the limitations of our eyes or any time lag in perception? A recent paper by A. Herwig, reported in ScienceDaily (here), looks at the mechanics of vision.

Only one portion of the retina, the fovea, has detailed vision. If we hold an arm out, the area seen clearly by the fovea is only about the size of a thumbnail at that distance. The rest of the visual field is not sharp, and yet we seem to have clear vision over a much larger area.

This paper puts forward a model in which memory stores pairs of blurred and detailed images. When there is a blurred object in the visual field (but not in the fovea), it is replaced in the visual system by a detailed image of an object that fits the blurred image coming from the eyes. This happens so quickly that a person never notices the blurred object. The pairings of blurred and detailed objects are continually updated.
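As a purely illustrative sketch of the kind of pairing the model describes (the paper does not give code, and the class name, feature representation, and nearest-neighbour lookup here are invented for the example), one can think of an associative memory that maps a coarse peripheral impression to the detailed foveal image last experienced with it, updated after every saccade:

```python
import numpy as np

class FovealPredictionMemory:
    """Toy associative memory pairing blurred peripheral input with detailed foveal input.

    Hypothetical illustration only: blurred peripheral features act as a key,
    and the value is the detailed image most recently seen at the fovea for
    that key.
    """

    def __init__(self):
        self.keys = []    # blurred peripheral feature vectors
        self.values = []  # detailed foveal images associated with them

    def predict(self, blurred):
        """Return the stored detailed image whose blurred key best matches the input."""
        if not self.keys:
            return blurred  # nothing learned yet: fall back to the coarse input
        distances = [np.linalg.norm(blurred - k) for k in self.keys]
        return self.values[int(np.argmin(distances))]

    def update(self, blurred_before_saccade, foveal_after_saccade):
        """After a saccade, store the new blurred-to-detailed pairing."""
        self.keys.append(blurred_before_saccade)
        self.values.append(foveal_after_saccade)

# Tiny usage example with made-up 4-pixel "images":
memory = FovealPredictionMemory()
memory.update(np.array([0.5, 0.5, 0.4, 0.4]),   # blurred peripheral view
              np.array([1.0, 0.0, 0.9, 0.1]))   # detailed foveal view
print(memory.predict(np.array([0.52, 0.48, 0.41, 0.39])))  # -> stored detailed view
```

In the saccade-contingent change experiments described next, the analogue would be that the prediction keeps returning the detailed image learned from earlier fixations even after the peripheral object has been swapped.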

The researchers used a very fast camera to follow subjects' eye movements. During the extremely fast movements from one fixation to another (saccades), they changed the object being viewed. The subjects did not see the new object but rather the detailed image paired with the old blurred one.

The experiments show that "our perception depends in large measure on stored visual experiences in our memory. … these experiences serve to predict the effect of future actions ('What would the world look like after a further eye movement?'). In other words: we do not see the actual world, but our predictions."

This gives us a clear visual picture that appears correct and immediate.

Here is the abstract (A. Herwig, W. Schneider; Predicting object features across saccades: Evidence from object recognition and visual search; Journal of Experimental Psychology: General (2014) 143(5)):

When we move our eyes, we process objects in the visual field with different spatial resolution due to the nonhomogeneity of our visual system. In particular, peripheral objects are only coarsely represented, whereas they are represented with high acuity when foveated. To keep track of visual features of objects across eye movements, these changes in spatial resolution have to be taken into account. Here, we develop and test a new framework proposing a visual feature prediction mechanism based on past experience to deal with changes in spatial resolution accompanying saccadic eye movements. In 3 experiments, we first exposed participants to an altered visual stimulation where, unnoticed by participants, 1 object systematically changed visual features during saccades. Experiments 1 and 2 then demonstrate that feature prediction during peripheral object recognition is biased toward previously associated postsaccadic foveal input and that this effect is particularly associated with making saccades. Moreover, Experiment 3 shows that during visual search, feature prediction is biased toward previously associated presaccadic peripheral input. Together, these findings demonstrate that the visual system uses past experience to predict how peripheral objects will look in the fovea, and what foveal search templates should look like in the periphery. As such, they support our framework based on ideomotor theory and shed new light on the mystery of why we are most of the time unaware of acuity limitations in the periphery and of our ability to locate relevant objects in the periphery.