The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception

scientific article

The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception is a scientific article published in Frontiers in Psychology in 2014.
P31 instance of: scholarly article (Q13442814)

External links
P356 DOI: 10.3389/FPSYG.2014.00420
P932 PMC publication ID: 4026678
P698 PubMed publication ID: 24860533
P5875 ResearchGate publication ID: 262608472

P50 author: Marc Sato (Q115599483)
P2093 author name string: Coriandre Vilain; Avril Treille
P2860 cites work:
Neural correlates of multisensory integration of ecologically valid audiovisual events (Q81345936)
Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions (Q87297473)
EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis (Q29547481)
Hearing lips and seeing voices (Q29619186)
Speech through ears and eyes: interfacing the senses with the supramodal brain (Q30452048)
The natural statistics of audiovisual speech (Q30488184)
Bimodal speech: early suppressive visual effects in human auditory cortex (Q30498576)
Visual speech speeds up the neural processing of auditory speech (Q30499596)
The N1 wave of the human electric and magnetic response to sound: a review and an analysis of the component structure (Q34185761)
Listening with eye and hand: Cross-modal contributions to speech perception (Q34762575)
Cortical oscillations and sensory predictions (Q38017454)
Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration (Q38453403)
Effects of phonetic context on audio-visual intelligibility of French (Q40570808)
Tactile enhancement of auditory and visual speech perception in untrained perceivers (Q43134903)
Perception of visible speech: influence of spatial quantization (Q43684160)
Temporal window of integration in auditory-visual speech perception (Q44437600)
Analytic study of the Tadoma method: background and preliminary results (Q44796227)
Auditory-tactile speech perception in congenitally blind and sighted adults (Q47260428)
Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception (Q48124103)
Evidence of a visual-to-auditory cross-modal sensory gating phenomenon as reflected by the human P50 event-related brain potential modulation (Q48334765)
Sequential audiovisual interactions during speech perception: a whole-head MEG study (Q48386751)
Dual neural routing of visual facilitation in speech processing (Q48422204)
Auditory event-related potentials (ERPs) in audiovisual speech perception (Q48500065)
Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli (Q48533817)
Seeing speech: visual information from lip movements modifies activity in the human auditory cortex (Q48707959)
Electrophysiological evidence for speech-specific audiovisual integration (Q48832369)
Does audiovisual speech offer a fountain of youth for old ears? An event-related brain potential study of age differences in audiovisual speech perception (Q50434499)
Hearing lips in a second language: visual articulatory information enables the perception of second language sounds (Q50468567)
Seeing to hear better: evidence for early audio-visual interactions in speech identification (Q52089490)
Visual Contribution to Speech Intelligibility in Noise (Q56225354)
The Perception-for-Action-Control Theory (PACT): A perceptuo-motor theory of speech perception (Q57461602)
Evoked dipole source potentials of the human auditory cortex (Q68970928)
P4510 describes a project that uses: Praat (Q378530)
P407 language of work or name: English (Q1860)
P921 main subject: electrophysiology (Q1154774)
P304 page(s): 420
P577 publication date: 2014-05-13
P1433 published in: Frontiers in Psychology (Q2794477)
P1476 title: The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception
P478 volume: 5