PS1061: Sensation and Perception 2014-15
Term 2, Thursday 11 am - 1 pm (Windsor Auditorium)

Lecture 10: Revision : Making sense of the world

Course co-ordinator: Johannes M. Zanker, j.zanker@rhul.ac.uk, (Room W 246)


Lecture Topics


comparing the senses

5 senses (proverbial, Aristotelian tradition)

sense      organ     physical quality of stimuli
vision     eye       light (electromagnetic waves)
hearing    ear       sound (pressure waves)
touch      skin      deformation (by physical objects)
smell      nose      odour / scent (airborne chemicals)
taste      tongue    aroma / flavour (dissolved chemicals)

Super(?)-human senses


why don’t we have X-ray vision ?

who does ?

the limitations of human sensory systems are best demonstrated by looking at 'sensory ecology': in their particular habitats, animals have evolved senses that are unknown to humans


The Ecology of Ultraviolet Vision in Birds & Insects

UV light (wavelength < 400 nm) exposure can have immediate and/or long-term effects on human eyes.
Photokeratitis: something like a painful sunburn on the front surface of the eye ('snow blindness')
many animals, however, use UV vision (with protective screening mechanisms in the retina guarding against damage)

>> birds see the world in a very different way to humans !!! 

peacock feather in colour & UV

(from Bristol Ecology of Vision group)

the ability to detect UV light affects their behaviour
recognition of plumage patterns that are invisible to humans

this ability is used in communication,
in particular mating systems

 (Bennett et al. 1996)


in this context, various theories based on co-evolution have been developed concerning animal and plant coloration (mimicry, fruit colours etc.)

similar abilities exist in many insects, where they are important in the context of food selection and orientation:
for example, this honey bee is homing in on a colt's foot (Tussilago farfara L.) flower (which appears plain bright yellow to the human eye);
the bee is guided by the prominent UV colour patterns to the centre of the flower head
the flower uses this pattern as an advertising sign, because it needs the bee to transport its pollen
 

photographed in UV flash (c) Bjørn Rørslett

we find the same story underwater, where UV can help to counteract poor lighting conditions:

In normal daylight, these two species of Damsel fish, Pomacentrus amboinensis and Pomacentrus moluccensis, appear plain yellow, and are difficult to distinguish

In bright UV light, using a UV sensitive camera, and to any animal with UV vision, these two species show very distinct facial patterns

(by courtesy of Uli Siebeck, Brisbane)

some interesting adaptations to underwater vision can be found in humans as well, but no UV sensitivity!


The Visual Ecology of Polarized Light

many animals, particularly invertebrates, are sensitive to the plane of polarized light (preferred orientation of electromagnetic waves)
– humans are completely blind to this property !
this provides these animals with an extra dimension of visual information: functional significance poorly understood


example : dungbeetles
living in burrows on firm coastal sands, the dung beetle returns from extended foraging trips to its burrow along the shortest possible route; in this environment visual landmarks are scarce

>> they rely on cues in the sky to find their way back, using polarisation-sensitive photoreceptors in the dorsal parts of their eyes (see also Dacke et al. 1999)

what do humans do?  
humans use tools, such as sports sunglasses: advanced multi-layer lens technology in polarising sunglasses eliminates surface glare and reduces unwanted reflections (useful for fishing!)


Infrared Vision


 rattlesnakes and other pit vipers are visualising heat !
  •   heat-sensing pits behind each nostril 
  •   highly effective in detecting differences in temperature at large distances (metres)
  •   crucial for hunting warm-blooded animals in the dark and as protection against large predators!
  •   integrated with visual images in the brain

(Newman & Hartline 1982)

 

and humans?
humans again need tools, for example thermal imaging cameras, which are often used to detect insulation, electrical and mechanical problems, as indicated by heat loss ...

(here: a thermal image of a dog)

  this technology is used in particular for night vision equipment !!!


Sensing magnetic fields

humans also use tools to exploit the magnetic field of the earth for navigation purposes:
compass (magnetic needle passively orienting parallel to magnetic field)
navigating birds have been demonstrated to use the earth's magnetic field for orientation on their extended journeys
(thousands of miles)
little is known about the location and function of magnetic sensors & the strategies to exploit such signals

other suspects for the use of magnetic information:


Electric field

some fish can generate electric fields (electrogenic) and/or detect electric fields (electroreceptive)
which are used for active or passive orientation mechanisms: electrolocation


  • variety in the location of the electric organ and in the waveform of the electric organ discharge

(see Nelson & MacIver 1999)


Apteronotus albifrons (black ghost knifefish)

electrolocation is also found in the Australian platypus



Echolocation in Humans

bats are known to use echo signals to fly in the dark >> humans?
a California teenager who is blind overcomes his disability by experiencing life with all his other senses...


http://www.cbsnews.com/stories/2006/09/06/eveningnews/main1977730.shtml

 


Sensory processing strategies

>>> sensory integration

 

how can cross-modal combination of information be investigated experimentally ?

(an example of vision-touch integration: recognising objects by vision and/or haptics)

(large interest from engineering: sensory fusion in robots, VR, video games) >> how do nervous systems combine sensory information ?
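
one widely used quantitative answer to this question (not derived in this lecture, so treat the Gaussian-noise assumption, the function name and all numbers below as illustrative) is reliability-weighted averaging: each sense contributes an estimate weighted by the inverse of its noise variance. A minimal sketch for the vision-touch size judgement mentioned above:

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue combination,
# assuming independent Gaussian noise in each sensory channel.
# All values are invented for illustration, not data from the lecture.

def combine(est_a, var_a, est_b, var_b):
    """Combine two noisy estimates, weighting each by its reliability (1/variance)."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    combined = w_a * est_a + w_b * est_b
    combined_var = 1 / (1 / var_a + 1 / var_b)   # always smaller than either input variance
    return combined, combined_var

# hypothetical object-size estimates (cm): vision is more reliable than touch here
size, var = combine(est_a=5.0, var_a=0.2, est_b=5.6, var_b=0.8)
print(f"combined estimate: {size:.2f} cm, variance: {var:.2f}")
```

the combined estimate ends up closer to the more reliable cue, and its variance is lower than that of either single cue - one possible signature of true sensory integration.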


Multisensory regions in the human brain

how and where are the different senses integrated in the human brain ?

fMRI is the tool to study some cases of multisensory integration; most prominent is the
combination of visual and auditory information (Calvert et al. 1998)

these could be the regions for crossmodal identification !


Audio-visual speech integration

one step further – conjunction: is coherence between visual AND auditory information crucial?

what happens in the multisensory brain regions when listeners receive conflicting signals (McGurk Effect) ? (seeing someone saying 'ga' while listening to sound 'ba' leads to perception of 'da')

(King and Calvert 2001)



Encoding of sign language

example for sign language: butterfly

positron emission tomography (PET) scans in congenitally deaf people
(Nishimura et al 1999)

  • yellow: sign language >> upper regions of temporal lobe (encodes hearing, spoken language)
  • blue: vision (video of meaningless hand movements) >> early visual cortex
  • green: audition (cochlear implant) >> primary auditory cortex

additional evidence: PET brain scans of 11 profoundly deaf people and 10 hearing people (Petitto & Zatorre 2000)


what is the cortical location for language error handling ?
  • signed sentences mostly encoded in the left hemisphere (like the parsing of spoken language in hearing people)
  • identical brain structures for similar tasks in the left inferior frontal cortex in deaf and hearing people
  • meaningless grammatical hand movements >> greater blood flow in the planum temporale (like spoken equivalent in hearing people)


reading emotions

observers are asked to rate the sadness of a series of (real) faces with different expressions between happy and sad

sadness ratings follow a ‘psychometric curve’: from low to high for happy to sad faces

combining the face display with a sad or happy voice shifts the psychometric curve to higher or lower sadness ratings

emotional voices influence the categorization of facial expression !!
(De Gelder & Bertelson 2003)
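
to make the term 'psychometric curve' concrete, the sketch below models the sadness rating as a logistic function of the face morph, with the emotional voice simply shifting the curve sideways; the function name and all parameters are hypothetical, not the published data:

```python
# Illustrative psychometric function for the face-voice experiment described above.
# All numbers are invented; the real data are in De Gelder & Bertelson (2003).
import math

def sadness_rating(face_morph, voice_shift=0.0, slope=10.0, midpoint=0.5):
    """Probability of rating a face as 'sad'.
    face_morph: 0 = clearly happy face, 1 = clearly sad face.
    voice_shift: positive for a sad voice, negative for a happy voice
                 (shifts the curve so that ambiguous faces are judged differently)."""
    x = face_morph - (midpoint - voice_shift)
    return 1.0 / (1.0 + math.exp(-slope * x))

for morph in [0.0, 0.25, 0.5, 0.75, 1.0]:
    neutral = sadness_rating(morph)
    sad_voice = sadness_rating(morph, voice_shift=0.1)
    print(f"face {morph:.2f}: neutral voice {neutral:.2f}, sad voice {sad_voice:.2f}")
```

the largest effect of the voice appears for the ambiguous faces in the middle of the morph range, where the visual information alone is least decisive.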

for some more interesting observations about facial expressions click here ...


transfer across sensory modalities

ventriloquism: speaking or uttering sounds so that they seem to come from the speaker’s dummy or some other source than the speaker

for some nice moves, click here for 'read my lips'


synaesthesia: a mixing of senses causing a person to experience such things as colored hearing, gustatory sights, and auditory smells…

(one in every 25,000 people !)

(Ramachandran & Hubbard 2003)


sensory substitution: replacing a (lost/missing) sense with some other sense!

the vOICe system translates arbitrary video images into sounds - this means that you can see with your ears, whenever you want to
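
the general principle behind such image-to-sound substitution can be sketched as a simple mapping - scan the image column by column, map vertical position to pitch and brightness to loudness. The toy code below illustrates this idea only; it is not the actual vOICe algorithm, and the function name and parameters are assumptions:

```python
# Toy image-to-sound mapping in the spirit of visual-to-auditory substitution:
# scan a greyscale image column by column (left to right = time),
# map row position to pitch and pixel brightness to loudness.
# Simplified illustration only, not the actual vOICe algorithm.

def sonify(image, base_freq=200.0, octaves=3.0):
    """image: list of rows (top row first), each a list of brightness values 0..1.
    Returns, for each column, a list of (frequency_Hz, amplitude) pairs."""
    n_rows = len(image)
    columns = []
    for col in range(len(image[0])):
        tones = []
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness > 0:
                # higher rows -> higher pitch, spread over a few octaves
                freq = base_freq * 2 ** (octaves * (n_rows - 1 - row) / (n_rows - 1))
                tones.append((round(freq, 1), brightness))
        columns.append(tones)
    return columns

# a tiny 3x3 'image' with a bright diagonal: pitch falls as the scan moves right
print(sonify([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]]))
```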


an interesting case study (on YouTube)... (see also Stoffregen & Pittenger 1995)
Human echolocation : see a California teenager who is blind overcome his disability by experiencing life with all his other senses...


Theories of Perception


Gestalt psychology

Gestalt theory has its focus on the principles of perceptual organisation: ‘the whole is more than the sum of the parts’
‘laws’ describing perceptual phenomena <<you may want to ask yourself: are these explanations?>>
(Wertheimer, Koffka, Kohler)

  • Prägnanz: good shape (visual, acoustic)
  • similarity
  • proximity
  • good continuation
  • familiarity

    (illusion from Kanizsa 1976)

constructivist approach

emphasizing top-down processes in perception: the mind tries to make the best sense of ambiguous data
(Neisser, Gregory)

  • active & constructive process
  • iterative comparison of sensory input with internal (stored) knowledge
  • hypotheses & expectations generate specific errors (illusions)
  • physiological basis ?
  • revival: Bayesian estimation (see the sketch below)

    ('Dalmatian' from Marr 1982)
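
the Bayesian idea can be written as a one-line calculation: the percept corresponds to the hypothesis with the highest posterior probability, i.e. prior expectation weighted by how well the hypothesis explains the sensory data. A minimal sketch, with invented numbers for the Dalmatian picture:

```python
# Minimal Bayesian 'perception as hypothesis testing' sketch.
# Priors and likelihoods below are invented for illustration only.

priors = {"dalmatian": 0.3, "random blobs": 0.7}        # stored knowledge / expectation
likelihoods = {"dalmatian": 0.8, "random blobs": 0.1}   # how well each hypothesis explains the image

evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

print(posteriors)                           # roughly {'dalmatian': 0.77, 'random blobs': 0.23}
print(max(posteriors, key=posteriors.get))  # the perceived interpretation
```

even with a modest prior, a hypothesis that explains the data much better can dominate the percept - one way of describing why the Dalmatian 'pops out' once it has been seen.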

direct perception

emphasizing bottom-up processing, exploiting the richness of the information content in sensory data
direct use of sensory input for behavioural control without the need for high-level representations
(Gibson)

  • comprehensive capture of information in optic array
  • unambiguous information about spatial layout (flowfield; see the sketch below)
  • affordances: meaning in behavioural context
  • resonance: process to extract information (filter tuning?)

    flowfield sketch: pilot approaching runway, from Gibson 1979
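
the flowfield idea can be made concrete with a toy calculation: under pure forward translation, image motion radiates outward from the focus of expansion, which marks the heading direction. The sketch below uses a deliberately simplified geometry (flat scene at constant depth; the function name and numbers are assumptions, not the lecture's own demonstration):

```python
# Toy optic-flow field for pure forward translation (Gibson's flowfield idea):
# image motion radiates outward from the focus of expansion, which marks the heading.
# Simplified geometry: flat image plane, scene at constant depth.

def flow_vector(x, y, heading=(0.0, 0.0), speed=1.0, depth=10.0):
    """Flow at image point (x, y) for an observer translating towards 'heading'."""
    fx, fy = heading
    return (speed * (x - fx) / depth, speed * (y - fy) / depth)

# vectors point away from the focus of expansion (here the image centre)
# and get longer the further a point lies from it
for point in [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (-3.0, -3.0)]:
    print(point, "->", flow_vector(*point))
```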


the information processing approach

in these lectures the focus was on a neuroscientific & computational approach to perception, which describes the first information processing steps as the basis for cognitive psychology

common themes for all 5 major sensory channels are the following key concepts:

how does this approach relate to other concepts of perception?


summary: integration & conceptual framework


some things to remember about revision and exams


Reading:

Specific References:


to download a pdf copy of lecture slides, click here

back to course outline
last update 8-03-2015
Johannes M. Zanker