Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources

scientific article published on 8 March 2016

instance of (P31): scholarly article (Q13442814)

External links

DOI (P356): 10.3389/FNINT.2016.00013
PMC publication ID (P932): 4781873
PubMed publication ID (P698): 27013994
ResearchGate publication ID (P5875): 297662608

author (P50): Peter König (Q41048973)
author name string (P2093): Basil Wahn
cites work (P2860):
The combination of vision and touch depends on spatial proximity (Q24644193)
Pip and pop: nonspatial auditory signals improve spatial visual search (Q51948584)
Multisensory cues capture spatial attention regardless of perceptual load (Q51968681)
Audiovisual integration of speech falters under high attention demands (Q51992161)
Beyond sensory substitution--learning the sixth sense (Q52032883)
Revisiting within-modality and cross-modality attentional blinks: effects of target-distractor similarity (Q52055724)
A crossmodal attentional blink between vision and touch (Q52108492)
Dividing attention between the color and the shape of objects (Q52189805)
The ventriloquist effect does not depend on the direction of automatic visual attention (Q52932774)
Attention to touch weakens audiovisual speech integration (Q56269634)
Separate attentional resources for vision and audition (Q60213629)
Modality-specific auditory and visual temporal processing deficits (Q63982560)
Restricted attentional capacity between sensory modalities (Q74672346)
Cross-modality attentional blinks without preparatory task-set switching (Q78465361)
Attentional capacity for processing concurrent stimuli is larger across sensory modalities than within a modality (Q79319230)
Spatial constraints on visual-tactile cross-modal distractor congruency effects (Q80813502)
Visual-haptic cue weighting is independent of modality-specific attention (Q80816454)
Selective attention and multisensory integration: multiple phases of effects on the evoked brain activity (Q81065006)
Humans integrate visual and haptic information in a statistically optimal fashion (Q29614805)
The experience of new sensorimotor contingencies by sensory augmentation (Q30431274)
Effect of attentional load on audiovisual speech perception: evidence from ERPs (Q30434036)
Merging the senses into a robust percept (Q30462778)
Distracted and confused?: selective attention under load (Q30462783)
Sensory augmentation for the blind (Q30469717)
Vision and audition do not share attentional resources in sustained tasks (Q30500948)
Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search (Q30525382)
Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration (Q30525384)
Attention modulates visual-tactile interaction in spatial pattern matching (Q34155969)
Interactions among converging sensory inputs in the superior colliculus (Q34272216)
Where's the action? The pragmatic turn in cognitive science (Q34340729)
Restricted attentional capacity within but not between sensory modalities (Q34429883)
Orienting of spatial attention and the interplay between the senses (Q34987843)
Frontoparietal cortical networks for directing attention and the eye to visual locations: identical, independent, or overlapping neural systems? (Q36050507)
Covert orienting of attention and overt eye movements activate identical brain regions (Q36646375)
Disintegration of multisensory signals from the real hand reduces default limb self-attribution: an fMRI study (Q37093865)
Multisensory integration: current issues from the perspective of the single neuron (Q37114556)
A taxonomy of external and internal attention (Q37540055)
Crossmodal spatial attention (Q37730982)
The neural basis of attentional control in visual search (Q38220305)
The absence of an auditory-visual attentional blink is not due to echoic memory (Q38393660)
Modality-specific selective attention attenuates multisensory integration (Q38395716)
Predictive coding and multisensory integration: an attentional account of the multisensory mind (Q38413875)
Top-down attention regulates the neural expression of audiovisual integration (Q40789113)
Good vibrations: tactile feedback in support of attention allocation and human-automation coordination in event-driven domains (Q41731278)
How many objects can you track? Evidence for a resource-limited attentive tracking mechanism (Q46903745)
Pupil size signals mental effort deployed during multiple object tracking and predicts brain activity in the dorsal attention network and the locus coeruleus (Q47176334)
Cross-modal congruency and visual capture in a visual elevation-discrimination task (Q48162361)
Oscillatory signatures of crossmodal congruence effects: An EEG investigation employing a visuotactile pattern matching paradigm (Q48245023)
Two attentional deficits in serial target search: the visual attentional blink and an amodal task-switch deficit (Q48413944)
Parallel and serial processes in visual search (Q48417770)
Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? (Q48535299)
Within-modality and cross-modality attentional blinks in a simple discrimination task (Q48582307)
Vision and Haptics Share Spatial Attentional Resources and Visuotactile Integration Is Not Affected by High Attentional Load (Q48656098)
Behavioral evidence for task-dependent "what" versus "where" processing within and across modalities (Q51893305)
Processing of multisensory spatial congruency can be dissociated from working memory and visuo-spatial attention (Q51899198)
Spatial selection and target identification are separable processes in visual search (Q51911092)
describes a project that uses (P4510): ggplot2 (Q326489)
main subject (P921): attention (Q6501338)
page(s) (P304): 13
publication date (P577): 2016-03-08
published in (P1433): Frontiers in Integrative Neuroscience (Q15817251)
title (P1476): Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources
volume (P478): 10

Reverse relations

cites work (P2860)
Assessing the Role of the 'Unity Assumption' on Multisensory Integration: A Review (Q37730953)
Audiovisual Integration During Joint Action: No Effects for Motion Discrimination and Temporal Order Judgment Tasks (Q89964757)
Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources (Q37632275)
Bayesian Alternation during Tactile Augmentation (Q30819362)
Can Limitations of Visuospatial Attention Be Circumvented? A Review (Q44412092)
Hemispheric asymmetry: Looking for a novel signature of the modulation of spatial attention in multisensory processing (Q38944137)
Humans treat unreliable filled-in percepts as more real than veridical ones (Q33694121)
Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent? (Q34555998)
Joint Action: Mental Representations, Shared Information and General Mechanisms for Coordinating with Others (Q39094268)
Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception (Q36222091)
Pupil Sizes Scale with Attentional Load and Task Experience in a Multiple Object Tracking Task (Q36225160)
The Attentional Dependence of Emotion Cognition Is Variable with the Competing Task (Q37413416)
Two Trackers Are Better than One: Information about the Co-actor's Actions and Performance Scores Contribute to the Collective Benefit in a Joint Visuospatial Task (Q33627926)