
Perceptual Realities

Optimizing XR through Perceptually-Informed Experiences

ERC-funded • Universidad de Zaragoza

What is PROXIE?

Extended Reality (XR)—including Virtual, Augmented, and Mixed Reality—offers unique opportunities to study and optimize human perception in immersive environments. Building on strong prior work on visual, auditory, and multisensory perception, PROXIE investigates perception in ecologically valid XR scenarios, where users perform complex, realistic tasks.

Our approach

We combine psychophysical experiments and computational modeling to develop adaptive perceptual models for XR. Experiments systematically vary task type (free exploration, memory recall, search), environment complexity, and multisensory conditions (vision, audition, proprioception). During these experiments, we collect:

  • Eye-tracking — gaze allocation and scanpaths.
  • Physiological signals — electrodermal activity, ECG, heart rate variability.
  • Behavioral responses — motion traces, task performance.
  • Subjective measures — presence, workload, engagement.

These multimodal datasets will feed into machine learning models (transformers, RNNs, probabilistic formulations) that predict attention, multisensory synchronization thresholds, and visual quality perception under varying task demands.
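As a toy illustration of this modeling direction (not the project's actual pipeline: the feature names, simulated values, and the use of logistic regression as a stand-in for the transformer/RNN/probabilistic models mentioned above are all hypothetical), a learned predictor of attentional load from multimodal features might be sketched as:

```python
import math
import random

random.seed(0)

# Hypothetical training set: each sample holds three multimodal features
# (gaze dispersion, heart-rate variability, electrodermal activity), roughly
# normalised to [0, 1], plus a binary label for "high attentional load".
def make_sample():
    load = random.random() < 0.5
    gaze = random.gauss(0.3 if load else 0.7, 0.1)  # gaze narrows under load
    hrv  = random.gauss(0.4 if load else 0.6, 0.1)  # HRV drops under load
    eda  = random.gauss(0.7 if load else 0.4, 0.1)  # EDA rises under load
    return [gaze, hrv, eda], 1.0 if load else 0.0

data = [make_sample() for _ in range(500)]

def predict(w, b, x):
    """Logistic model: probability of high attentional load."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Train a logistic-regression predictor by stochastic gradient descent.
w, b = [0.0, 0.0, 0.0], 0.0
for _ in range(300):
    for x, y in data:
        g = predict(w, b, x) - y          # gradient of the log loss
        w = [wi - 0.1 * g * xi for wi, xi in zip(w, x)]
        b -= 0.1 * g

# Training-set accuracy of the fitted predictor.
correct = sum((predict(w, b, x) > 0.5) == (y > 0.5) for x, y in data)
acc = correct / len(data)
```

The point of the sketch is the data shape, not the model class: the same per-trial feature vectors would feed richer sequence models that additionally exploit temporal structure in gaze and physiology.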

Research focus

  • 👁️ Attention — Saliency and gaze dynamics across tasks and environments, integrating physiological markers of cognitive load.
  • 🔊 Multisensory mappings — Psychometric quantification of detection thresholds for audio–visual and proprioceptive misalignments, enabling perceptually grounded manipulations (e.g., redirected walking, hand motion adjustments).
  • 🎨 Visual perception — Adapting state-of-the-art quality metrics (e.g., ColorVideoVDP, FovVideoVDP) and developing task-dependent appearance metrics for materials (gloss, translucency).
  • 🧪 Applications & ecological validity — Validation in XR use cases: storytelling, training/education, telepresence, and collaborative design.
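To make the psychometric side concrete, here is a minimal sketch of threshold estimation for audio–visual misalignment (all numbers are invented for illustration; real experiments would use maximum-likelihood fitting over per-trial responses rather than this least-squares grid search):

```python
import math

# Hypothetical data: audio-visual offsets (ms) and the proportion of trials
# on which participants reported noticing the misalignment.
offsets  = [0, 40, 80, 120, 160, 200]
p_detect = [0.02, 0.10, 0.35, 0.70, 0.92, 0.99]

def logistic(x, alpha, beta):
    """Logistic psychometric function: alpha = 50% point, beta = spread."""
    return 1.0 / (1.0 + math.exp(-(x - alpha) / beta))

def fit(xs, ps, alphas, betas):
    """Least-squares grid search over (alpha, beta)."""
    best = None
    for a in alphas:
        for b in betas:
            err = sum((logistic(x, a, b) - p) ** 2 for x, p in zip(xs, ps))
            if best is None or err < best[0]:
                best = (err, a, b)
    return best[1], best[2]

alpha, beta = fit(offsets, p_detect,
                  alphas=range(0, 201, 2), betas=range(5, 61))

# A common detection criterion is the 75% point; inverting the logistic,
# logistic(x) = 0.75 gives x = alpha + beta * ln 3.
threshold_75 = alpha + beta * math.log(3)
```

Thresholds estimated this way are what makes manipulations like redirected walking perceptually grounded: adjustments stay below the fitted detection point, so users do not notice them.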

Impact

The project will produce:
  • Large multimodal datasets for XR perception research.
  • Adaptive perceptual models that account for task, environment, and multisensory context.
  • Insights transferable to XR system design, from rendering to interaction techniques.

Team

The project is led by Ana Serrano. You will join the Graphics & Imaging Lab at Universidad de Zaragoza (Spain).

Find out more about how to apply for a position:

👉 ana-serrano.github.io/PROXIE