CMU • Augmented Perception Lab
Multimodal XR Cues for Pain Perception
Investigating how body-aligned visuals, audio, and ambient cues in XR modulate perceived pain during controlled thermal stimulation. Deliverables include Unity prototypes, pilot study results, and analysis for potential ML-assisted adaptation.
Focus
Perception • Comfort • Adaptation
Methods
Pilot • Interviews • QST-aligned protocols
Stack
Unity/XR • Telemetry • Python
Research Questions
- How do body-aligned visual cues (motion, color) shift reported pain vs. baseline and distraction VR?
- Do additional sensory cues (audio/ambient) further attenuate or amplify perceived pain?
- Can simple ML signals help predict high/low pain states for adaptive cue delivery?
Approach
- Iterative XR scenes in Unity (baseline → distraction → augmentation), instrumented for timing & ratings (VAS/NRS).
- Small pilot sessions with standardized thermal stimulation and structured tasks.
- Lightweight telemetry for interaction events (see the logging sketch below); optional physiological proxies where appropriate.
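As a minimal sketch of what that telemetry layer could look like (the SessionLogger name, CSV schema, and file location are placeholders, not the study's actual instrumentation), the snippet below appends timestamped interaction events such as cue onsets and rating submissions to a per-session CSV file:

// Unity (C#) sketch: timestamped event log (hypothetical SessionLogger; schema is illustrative)
using System.IO;
using UnityEngine;
public class SessionLogger : MonoBehaviour {
    private StreamWriter writer;
    void Awake() {
        string path = Path.Combine(Application.persistentDataPath, "session_events.csv");
        writer = new StreamWriter(path, append: true);
        writer.WriteLine("time_s,event,value");                  // e.g. cue_on, nrs_rating
    }
    public void Log(string eventName, float value) {
        writer.WriteLine($"{Time.time:F3},{eventName},{value}"); // seconds since application start
        writer.Flush();                                          // flush so pilot data survives interruptions
    }
    void OnDestroy() { writer?.Close(); }
}

Other scene scripts would call something like FindObjectOfType<SessionLogger>().Log("cue_on", 1f) at cue onset and again when a VAS/NRS rating is submitted, so exposure timing and ratings share one clock.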
Prototyping Notes
// Unity (C#): body-aligned cue ramp (attach to the arm glow object)
using UnityEngine;
public class BodyAlignedCueRamp : MonoBehaviour {
    public Renderer armGlow;                                   // emissive mesh overlaid on the arm
    public Color coolColor = Color.cyan, warmColor = Color.red;
    public float rampSpeed = 0.5f;                             // speed of the 0→1→0 ramp
    void Update() {
        float t = Mathf.PingPong(Time.time * rampSpeed, 1f);   // ping-pong between 0 and 1
        Color c = Color.Lerp(coolColor, warmColor, t);         // cool→warm color ramp
        armGlow.material.SetColor("_EmissionColor", c);        // body-aligned emissive cue
    }
}
Scenes progress from visual-only to multimodal (visual + audio/ambient). Comfort features include vignette-on-turn and adjustable locomotion.
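One plausible wiring for the vignette-on-turn comfort feature is sketched below; the vignetteRenderer overlay, the _Intensity shader property, and the 30°/s threshold are assumptions for illustration, not taken from the project. It fades a vignette in whenever head yaw rate exceeds the threshold:

// Unity (C#) sketch: vignette-on-turn (assumed _Intensity property on an overlay vignette shader)
using UnityEngine;
public class VignetteOnTurn : MonoBehaviour {
    public Transform head;                 // XR camera transform
    public Renderer vignetteRenderer;      // overlay quad with a vignette shader
    public float yawThreshold = 30f;       // deg/s above which the vignette fades in
    public float fadeSpeed = 4f;
    private float lastYaw, intensity;
    void Update() {
        float yaw = head.eulerAngles.y;
        float yawRate = Mathf.Abs(Mathf.DeltaAngle(lastYaw, yaw)) / Time.deltaTime;
        lastYaw = yaw;
        float target = yawRate > yawThreshold ? 1f : 0f;               // full vignette during fast turns
        intensity = Mathf.MoveTowards(intensity, target, fadeSpeed * Time.deltaTime);
        vignetteRenderer.material.SetFloat("_Intensity", intensity);
    }
}

Ramping the intensity with MoveTowards avoids an abrupt vignette onset; the threshold and fade speed would be tuned for participant comfort.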
Analysis & Measures
- Compare reported pain across conditions (baseline, distraction, augmentation).
- Timing of cue exposure relative to changes in ratings.
- Exploratory ML: simple classifiers distinguishing high/low pain states from session features (see the sketch after this list).
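As a rough illustration of what such an exploratory classifier might look like (in practice this analysis would likely live on the Python side of the stack; the feature layout and toy data here are purely illustrative), the sketch below fits a tiny logistic regression over per-window session features to separate high- from low-rated windows:

// C# sketch: logistic regression over per-window session features (illustrative only)
using System;
class HighLowClassifier {
    // features per window, e.g. [cue intensity, interaction rate, time since cue onset]
    static double Sigmoid(double z) => 1.0 / (1.0 + Math.Exp(-z));
    static double[] Train(double[][] X, int[] y, int epochs = 500, double lr = 0.1) {
        var w = new double[X[0].Length + 1];                       // feature weights + bias in last slot
        for (int e = 0; e < epochs; e++)
            for (int i = 0; i < X.Length; i++) {
                double z = w[^1];
                for (int j = 0; j < X[i].Length; j++) z += w[j] * X[i][j];
                double err = Sigmoid(z) - y[i];                    // gradient of the log loss
                for (int j = 0; j < X[i].Length; j++) w[j] -= lr * err * X[i][j];
                w[^1] -= lr * err;
            }
        return w;
    }
    static void Main() {
        var X = new[] { new[] { 0.9, 0.2, 5.0 }, new[] { 0.1, 0.8, 1.0 } }; // toy feature windows
        var y = new[] { 1, 0 };                                             // 1 = high-pain report
        var w = Train(X, y);
        Console.WriteLine($"bias = {w[^1]:F2}");
    }
}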
Artifacts
Prototype Image Placeholder