Project Description
VISTA: Visual Inference for Spatio-Temporal Attention
An open-source community initiative for generating naturalistic, annotated video datasets to advance gaze tracking, cognitive assessment, and behavioral neuroscience.
Overview
VISTA is a framework and dataset initiative aimed at fostering more ecologically valid research in attention and cognition. We provide:
- AI-generated naturalistic videos simulating everyday scenes
- Human-curated scripts and interactions ensuring realism and task relevance
- Rich metadata annotations conforming to HED and FHIR standards
- Tools for researchers to create, customize, and annotate stimuli for cognitive and gaze-tracking experiments
This project supports explainable AI models and personalized neurocognitive assessment pipelines.
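As a concrete illustration of the clinical-interoperability goal, a gaze-derived metric could be packaged as a FHIR Observation resource. The sketch below uses only the standard library; the metric name, units, and subject reference are illustrative assumptions, not VISTA's actual schema.

```python
import json

def gaze_metric_to_fhir(subject_ref: str, metric_name: str,
                        value: float, unit: str) -> dict:
    """Wrap a single scalar gaze metric in a minimal FHIR Observation.

    Only the fields needed for a self-contained example are included;
    a production resource would also carry coded terminology bindings.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": metric_name},          # human-readable metric label
        "subject": {"reference": subject_ref},  # e.g. "Patient/<id>"
        "valueQuantity": {"value": value, "unit": unit},
    }

obs = gaze_metric_to_fhir("Patient/example-001",
                          "Mean fixation duration", 243.5, "ms")
print(json.dumps(obs, indent=2))
```

Serializing to JSON as shown is how FHIR resources are typically exchanged over REST APIs, which is what enables the assessment-pipeline integration described above.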
Key Features
- Naturalistic Video Stimuli: Real-world inspired visual scenarios designed for ecological validity
- Spatio-Temporal Attention Labels: AOIs, moving targets, and time-locked cognitive events
- Scripted & Versioned Content: YAML-based scene scripting
- HED + FHIR Metadata: Research and clinical interoperability
- Custom Stimuli Toolkit (coming soon): Interface and CLI to generate and annotate scenes
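To make the scripting and annotation features above concrete, here is a minimal sketch of how a scene script with time-locked areas of interest (AOIs) might be modeled in code. The schema, field names, and HED tag strings are hypothetical illustrations, not the actual VISTA scene-script format.

```python
from dataclasses import dataclass, field

@dataclass
class AOI:
    """An area of interest with a task-relevance window and HED-style tags."""
    label: str
    hed_tags: str               # e.g. "Sensory-event, Visual-presentation"
    onset_s: float              # when the AOI becomes task-relevant
    offset_s: float             # when it stops being relevant
    bbox: tuple                 # (x, y, w, h) in normalized frame coordinates

@dataclass
class Scene:
    """A scripted scene: the kind of unit a YAML scene file might describe."""
    scene_id: str
    duration_s: float
    aois: list = field(default_factory=list)

    def active_aois(self, t: float) -> list:
        """Labels of AOIs that are task-relevant at time t (seconds)."""
        return [a.label for a in self.aois if a.onset_s <= t < a.offset_s]

scene = Scene("kitchen_01", 30.0, [
    AOI("kettle", "Sensory-event, Visual-presentation",
        2.0, 10.0, (0.10, 0.20, 0.15, 0.20)),
    AOI("doorway", "Agent-action, Move",
        8.0, 20.0, (0.70, 0.10, 0.20, 0.60)),
])

print(scene.active_aois(9.0))  # both AOI windows overlap at t = 9 s
```

A YAML scene file would map naturally onto these dataclasses, which is what makes the scripts both versionable in git and machine-checkable before rendering.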
Applications
- Benchmarking gaze tracking algorithms
- Modeling attention and cognitive load
- Remote neuropsychological assessment
- Autism and neurodevelopmental screening
- Aging and cognitive resilience tracking
- Tracking cognitive symptoms ("brain fog") in multiple sclerosis
Citation
VISTA: Visual Inference for Spatio-Temporal Attention (2025).
An open-source dataset for naturalistic gaze tracking and cognitive testing.
ubc.neurocognition.ai
Research Classification
- Medical biotechnology diagnostics (including biosensors)
Research Interests
- Cognitive Neuropsychiatry
Research Methodology
- Neurocognitive assessment
- Gaze tracking
Faculty
Faculty of Medicine