Rabiat Sadiq

CMU • Augmented Perception Lab

Multimodal XR Cues for Pain Perception

Investigating how body-aligned visuals, audio, and ambient cues in XR modulate perceived pain during controlled thermal stimulation. Deliverables include Unity prototypes, pilot study results, and analysis for potential ML-assisted adaptation.

Focus

Perception • Comfort • Adaptation

Methods

Pilot • Interviews • QST-aligned protocols

Stack

Unity/XR • Telemetry • Python

Research Questions

Approach

Prototyping Notes

// Unity (C#): body-aligned cue ramp
using UnityEngine;

public class BodyCueRamp : MonoBehaviour {
  [SerializeField] float rampSpeed = 0.5f;                    // oscillation rate
  [SerializeField] Color coolColor = Color.cyan, warmColor = Color.red;
  [SerializeField] Renderer armGlow;                          // emissive overlay on the arm

  void Update(){
    float t = Mathf.PingPong(Time.time * rampSpeed, 1f);      // 0..1..0 triangle wave
    var color = Color.Lerp(coolColor, warmColor, t);          // color ramp
    armGlow.material.SetColor("_EmissionColor", color);       // body-aligned emissive cue
  }
}

Scenes progress from visual-only to multimodal (visual + audio/ambient). Comfort features include vignette-on-turn and adjustable locomotion.
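The vignette-on-turn comfort feature can be sketched roughly as below. This is an illustrative assumption, not the study build: the class name, the "_Aperture" shader property, and the threshold values are all hypothetical.

// Unity (C#): vignette-on-turn sketch (assumed names and values)
using UnityEngine;

public class VignetteOnTurn : MonoBehaviour {
  [SerializeField] Transform head;            // XR camera transform
  [SerializeField] Material vignetteMat;      // shader exposing a "_Aperture" float (assumed)
  [SerializeField] float turnThreshold = 30f; // deg/s before vignetting kicks in
  [SerializeField] float maxAperture = 0.4f;  // narrowest opening during fast turns
  float lastYaw;

  void LateUpdate(){
    float yaw = head.eulerAngles.y;
    float yawSpeed = Mathf.Abs(Mathf.DeltaAngle(lastYaw, yaw)) / Time.deltaTime;
    lastYaw = yaw;
    // Map angular speed above the threshold to vignette strength, then ease toward it.
    float target = Mathf.InverseLerp(turnThreshold, turnThreshold * 4f, yawSpeed);
    float aperture = Mathf.Lerp(1f, maxAperture, target);
    float current = vignetteMat.GetFloat("_Aperture");
    vignetteMat.SetFloat("_Aperture",
        Mathf.MoveTowards(current, aperture, Time.deltaTime * 2f));
  }
}

Easing via MoveTowards keeps the aperture from snapping, which matters for comfort: an abrupt field-of-view change can itself be a discomfort trigger.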

Analysis & Measures

Artifacts

[Image placeholder] Prototype v0: body-aligned glow and color ramp.
[Image placeholder] Prototype v1: ambient shift + in-VR rating UI.

Proposal (PDF)
