I am the Group Leader for the Speech Communication Neuroscience Group at the ICN. You can contact me by phone on 020 76791144, or by email as Sophie Scott at ucl ac uk - you can work out where the dots should go; I am trying to avoid junk email as far as possible.
If you want the notes from the Neurology Course at City University: CNS notes are here, functional imaging techniques are here, aphasia is here, TBI and dementia are here, right hemisphere functions and recovery from brain damage are here, children's problems are here, and the notes on neurological signs and symptoms are here. This file contains overview notes for the whole course (except for the lecture on children).
I used to have blond hair. I go running with Team Brooke-Taylor - see us attempt the Totton 10k.
My CV is here. Go on, spoil yourself. Alternatively, check out your Wu Name.
The Onion rules. As does the Periodic Table of Rejected Elements and the Alanis Morissette Lyrics Generator.
Most pop stars' sites seem to be written by crazed fans. Not so the site of Momus.
I study the neural basis of human speech processing. I am particularly interested in how this relates to subsystems in human auditory cortex, analogies with communication by non-human primates, and the role of sensori-motor interfaces in speech perception. See and hear an example of some of our stimuli here. I'm also studying the role of the right temporal lobe in speech perception - whether it is preferentially interested in sequences with more prosodic/intonational cues, or speaker identity cues, or both. I'm collaborating with Dr. Richard Wise at the MRC Clinical Sciences Centre, Hammersmith Hospital, and Prof. Stuart Rosen at the UCL Dept. of Phonetics. Charvy Narain is working on the project and is based here and at FMRIB.
I am also involved in a rehabilitation study of reading after stroke; this involves both PET and fMRI (at the MR Unit, Institute of Psychiatry). It enabled me to check whether my brother has a brain - see it here.
Functional imaging can be very hard work.
I worked with Phil Barnard at the then MRC APU (now the CBU) on modelling and specifying aspects of executive function, in particular how representations and processes are integrated. We have extended these analyses to clinical situations and to the role of these processes in cognition and emotion.
Along with Andy Calder at the MRC CBU, I have investigated the role of emotion in speech and non-speech sounds. We have worked with people who experience problems in perceiving such vocal expressions after brain injury. We are also now investigating Paul Ekman's suggestion that there are more than six basic expressions of emotion, and that these can be conveyed by the voice.
For an acoustic signal to have a rhythm, or a temporal structure, it must be made up of auditory events. These events have a perceptual moment of occurrence - a perceptual centre. That was my PhD research, that was. If you want to know more about this, or any other aspects of my work, do mail me.
Here are some examples of non-verbal emotional stimuli of Achievement, Amusement, Anger, Contentment, Disgust, Fear, Pleasure, Relief, Sadness and Surprise.