P31 | instance of | scholarly article | Q13442814
P50 | author | Marc Sato | Q115599483 |
P2093 | author name string | Coriandre Vilain | |
P2093 | author name string | Avril Treille | |
P2860 | cites work | Neural correlates of multisensory integration of ecologically valid audiovisual events | Q81345936 |
P2860 | cites work | Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions | Q87297473
P2860 | cites work | EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis | Q29547481
P2860 | cites work | Hearing lips and seeing voices | Q29619186
P2860 | cites work | Speech through ears and eyes: interfacing the senses with the supramodal brain | Q30452048
P2860 | cites work | The natural statistics of audiovisual speech | Q30488184
P2860 | cites work | Bimodal speech: early suppressive visual effects in human auditory cortex | Q30498576
P2860 | cites work | Visual speech speeds up the neural processing of auditory speech | Q30499596
P2860 | cites work | The N1 wave of the human electric and magnetic response to sound: a review and an analysis of the component structure | Q34185761
P2860 | cites work | Listening with eye and hand: cross-modal contributions to speech perception | Q34762575
P2860 | cites work | Cortical oscillations and sensory predictions | Q38017454
P2860 | cites work | Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration | Q38453403
P2860 | cites work | Effects of phonetic context on audio-visual intelligibility of French | Q40570808
P2860 | cites work | Tactile enhancement of auditory and visual speech perception in untrained perceivers | Q43134903
P2860 | cites work | Perception of visible speech: influence of spatial quantization | Q43684160
P2860 | cites work | Temporal window of integration in auditory-visual speech perception | Q44437600
P2860 | cites work | Analytic study of the Tadoma method: background and preliminary results | Q44796227
P2860 | cites work | Auditory-tactile speech perception in congenitally blind and sighted adults | Q47260428
P2860 | cites work | Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception | Q48124103
P2860 | cites work | Evidence of a visual-to-auditory cross-modal sensory gating phenomenon as reflected by the human P50 event-related brain potential modulation | Q48334765
P2860 | cites work | Sequential audiovisual interactions during speech perception: a whole-head MEG study | Q48386751
P2860 | cites work | Dual neural routing of visual facilitation in speech processing | Q48422204
P2860 | cites work | Auditory event-related potentials (ERPs) in audiovisual speech perception | Q48500065
P2860 | cites work | Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli | Q48533817
P2860 | cites work | Seeing speech: visual information from lip movements modifies activity in the human auditory cortex | Q48707959
P2860 | cites work | Electrophysiological evidence for speech-specific audiovisual integration | Q48832369
P2860 | cites work | Does audiovisual speech offer a fountain of youth for old ears? An event-related brain potential study of age differences in audiovisual speech perception | Q50434499
P2860 | cites work | Hearing lips in a second language: visual articulatory information enables the perception of second language sounds | Q50468567
P2860 | cites work | Seeing to hear better: evidence for early audio-visual interactions in speech identification | Q52089490
P2860 | cites work | Visual contribution to speech intelligibility in noise | Q56225354
P2860 | cites work | The Perception-for-Action-Control Theory (PACT): a perceptuo-motor theory of speech perception | Q57461602
P2860 | cites work | Evoked dipole source potentials of the human auditory cortex | Q68970928
P4510 | describes a project that uses | Praat | Q378530 |
P407 | language of work or name | English | Q1860 |
P921 | main subject | electrophysiology | Q1154774 |
P304 | page(s) | 420 | |
P577 | publication date | 2014-05-13 | |
P1433 | published in | Frontiers in Psychology | Q2794477 |
P1476 | title | The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception | |
P478 | volume | 5 |