Can Limitations of Visuospatial Attention Be Circumvented? A Review

scientific article published on 27 October 2017

Can Limitations of Visuospatial Attention Be Circumvented? A Review is an instance of (P31) scholarly article (Q13442814).

External links
DOI (P356): 10.3389/FPSYG.2017.01896
PMC publication ID (P932): 5665179
PubMed publication ID (P698): 29163278

author name string (P2093): Peter König; Basil Wahn
cites work (P2860):
Coordinating spatial referencing using shared gaze (Q82242582)
Two heads are better than one: both complementary and synchronous strategies facilitate joint action (Q85636303)
The pupillary light response reveals the focus of covert visual attention (Q21559547)
Multisensory integration: flexible use of general operations (Q27000400)
What failure in collective decision-making tells us about metacognition (Q27026278)
Dual-task interference in simple tasks: data and theory (Q28241540)
Contribution of striate inputs to the visuospatial functions of parieto-preoccipital cortex in monkeys (Q28279013)
Optimally Interacting Minds (Q29468525)
Humans integrate visual and haptic information in a statistically optimal fashion (Q29614805)
Capacity limits of information processing in the brain (Q30048404)
The interactions of multisensory integration with endogenous and exogenous attention (Q30364695)
Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses (Q30395553)
Load-induced inattentional deafness (Q30395679)
Concurrent brain responses to separate auditory and visual targets (Q30404373)
Multisensory stimulation in stroke rehabilitation (Q30415493)
The experience of new sensorimotor contingencies by sensory augmentation (Q30431274)
Distracted and confused?: selective attention under load (Q30462783)
Multiple resources and performance prediction (Q30465103)
Visual perceptual load induces inattentional deafness (Q30474086)
Task-modulated "what" and "where" pathways in human auditory cortex (Q30478108)
Isolation of a central bottleneck of information processing with time-resolved fMRI (Q30483329)
Vision and audition do not share attentional resources in sustained tasks (Q30500948)
Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search (Q30525382)
Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration (Q30525384)
Visual Distractors Disrupt Audiovisual Integration Regardless of Stimulus Complexity (Q30835564)
Two Trackers Are Better than One: Information about the Co-actor's Actions and Performance Scores Contribute to the Collective Benefit in a Joint Visuospatial Task (Q33627926)
Using fMRI to distinguish components of the multiple object tracking task (Q33642150)
EEG correlates of attentional load during multiple object tracking (Q33983343)
Segregation of form, color, movement, and depth: anatomy, physiology, and perception (Q34172282)
Pupil Diameter and Load on Memory (Q34241926)
Interactions among converging sensory inputs in the superior colliculus (Q34272216)
Control of object-based attention in human cortex (Q34323349)
Restricted attentional capacity within but not between sensory modalities (Q34429883)
The Mind-Writing Pupil: A Human-Computer Interface Based on Decoding of Covert Attention through Pupillometry (Q34512514)
Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent? (Q34555998)
Top-down control and early multisensory processes: chicken vs. egg (Q35140767)
Together, slowly but surely: the role of social interaction and feedback on the build-up of benefit in collective decision-making (Q35710277)
Pupil dilation deconvolution reveals the dynamics of attention at high temporal resolution (Q36001199)
Tracking multiple targets with multifocal attention (Q36161730)
Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception (Q36222091)
Pupil Sizes Scale with Attentional Load and Task Experience in a Multiple Object Tracking Task (Q36225160)
Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources (Q36661055)
Multisensory integration: current issues from the perspective of the single neuron (Q37114556)
Two is better than one: physical interactions improve motor performance in humans (Q37507013)
A taxonomy of external and internal attention (Q37540055)
Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources (Q37632275)
Crossmodal spatial attention (Q37730982)
Synchronised and complementary coordination mechanisms in an asymmetric joint aiming task (Q37762816)
Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation (Q38166110)
Telephone conversation impairs sustained visual attention via a central bottleneck (Q38385718)
The absence of an auditory-visual attentional blink is not due to echoic memory (Q38393660)
What vs. where in touch: an fMRI study (Q38416245)
The COGs (context, object, and goals) in multisensory processing (Q38756674)
Hemispheric asymmetry: Looking for a novel signature of the modulation of spatial attention in multisensory processing (Q38944137)
Joint Action: Mental Representations, Shared Information and General Mechanisms for Coordinating with Others (Q39094268)
A multisensory perspective on object memory (Q39235821)
Action coordination in groups and individuals: Learning anticipatory control (Q40559257)
Good vibrations: tactile feedback in support of attention allocation and human-automation coordination in event-driven domains (Q41731278)
Swapping or dropping? Electrophysiological measures of difficulty during multiple object tracking (Q41843095)
Brain areas specific for attentional load in a motion-tracking task (Q43849947)
Early visual and auditory processing rely on modality-specific attentional resources (Q45339714)
Why practice reduces dual-task interference (Q46505358)
How many objects can you track? Evidence for a resource-limited attentive tracking mechanism (Q46903745)
Pupil size signals mental effort deployed during multiple object tracking and predicts brain activity in the dorsal attention network and the locus coeruleus (Q47176334)
Multiple object tracking: anticipatory attention doesn't "bounce" (Q48267211)
Two attentional deficits in serial target search: the visual attentional blink and an amodal task-switch deficit (Q48413944)
Multi-modal distraction: insights from children's limited attention (Q48416338)
Within-modality and cross-modality attentional blinks in a simple discrimination task (Q48582307)
Space-based and object-based visual attention: shared and specific neural domains (Q48587048)
Vision and Haptics Share Spatial Attentional Resources and Visuotactile Integration Is Not Affected by High Attentional Load (Q48656098)
Brain activation during spatial updating and attentive tracking of moving targets (Q48742402)
Distinct pathways involved in sound recognition and localization: a human fMRI study (Q48793878)
Is motion extrapolation employed in multiple object tracking? Tracking as a low-level, non-predictive function (Q48939441)
Tracking multiple independent targets: evidence for a parallel tracking mechanism (Q48951758)
Task-evoked pupillary responses, processing load, and the structure of processing resources (Q48972766)
Brain mechanisms of serial and parallel processing during dual-task performance (Q49046194)
Pupil dilation reveals top-down attentional load during spatial monitoring (Q49105907)
Multielement visual tracking: attention and perceptual organization (Q49161539)
The role of perceptual learning on modality-specific visual attentional effects (Q50474170)
Seeing or hearing? Perceptual independence, modality confusions, and crossmodal congruity effects with focused and divided attention (Q50483297)
When two heads are better than one: Interactive versus independent benefits of collaborative cognition (Q51771422)
Multisensory enhancement of attentional capture in visual search (Q51861264)
Behavioral evidence for task-dependent "what" versus "where" processing within and across modalities (Q51893305)
Beyond sensory substitution--learning the sixth sense (Q52032883)
Revisiting within-modality and cross-modality attentional blinks: effects of target-distractor similarity (Q52055724)
Spatial attention and object-based attention: a comparison within a single task (Q52098194)
A crossmodal attentional blink between vision and touch (Q52108492)
Divided attention between simultaneous auditory and visual signals (Q52189247)
Others' Actions Reduce Crossmodal Integration in Peripersonal Space (Q58501926)
Separate attentional resources for vision and audition (Q60213629)
Manipulating inattentional blindness within and across sensory modalities (Q63982555)
Modality-specific auditory and visual temporal processing deficits (Q63982560)
Speech shadowing while driving: on the difficulty of splitting attention between eye and ear (Q73382445)
Processing of irrelevant visual motion during performance of an auditory attention task (Q74409636)
Restricted attentional capacity between sensory modalities (Q74672346)
Cross-modality attentional blinks without preparatory task-set switching (Q78465361)
Attentional capacity for processing concurrent stimuli is larger across sensory modalities than within a modality (Q79319230)
Coordinating cognition: the costs and benefits of shared gaze during collaborative search (Q80581551)
Visual-haptic cue weighting is independent of modality-specific attention (Q80816454)
Joint action: bodies and minds moving together (Q82235267)
language of work or name (P407): English (Q1860)
page(s) (P304): 1896
publication date (P577): 2017-10-27
published in (P1433): Frontiers in Psychology (Q2794477)
title (P1476): Can Limitations of Visuospatial Attention Be Circumvented? A Review
volume (P478): 8

Reverse relations

How does navigation system behavior influence human behavior? (Q64241345) cites this article via cites work (P2860).
