Rabiat Sadiq

Hey there! 👋🏾

I'm Rabiat — I Build
XR × ML Magic

I am a creative technologist and interdisciplinary engineer working across AI/ML, XR, and software systems. I build end-to-end pipelines and interactive prototypes, move fast from idea to working demo, and enjoy turning messy inputs (video, sensors, user interaction) into tools that are useful, meaningful, and reliable. I learn quickly, adapt when things get hard, and build whatever is needed to get the job done. MHCI @ CMU.

Applied ML • XR / Unity • HCI Research • Computer Vision • Data Pipelines

Featured Projects

A selection of my most impactful work


Magic Mitts

1st Place, UTSA Tech Symposium • Affordable haptic VR glove with electromagnetic braking. 18% latency reduction, 24% comfort improvement.

XR • Hardware • Unity

Spotify vs AI Research

UX research exploring how listeners perceive AI-generated music and build trust through transparent labeling.

HCI Research • UX

Assuage — Health ML

Distress prediction from HealthKit biometrics using logistic regression. 82% test accuracy with CoreML on-device inference.

ML • iOS • Health
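
The Assuage modeling step is simple enough to sketch. Below is a minimal, hypothetical Python example of the same pattern: train a logistic-regression distress classifier on tabular biometric features, then convert it with coremltools for on-device Core ML inference. The feature names, CSV file, and label column are placeholders, not the actual Assuage data or code.

```python
# Hypothetical sketch: logistic regression on HealthKit-style biometrics,
# exported to Core ML for on-device inference. Feature names, the CSV file,
# and the label column are placeholders, not the real Assuage pipeline.
import pandas as pd
import coremltools as ct
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FEATURES = ["heart_rate", "hrv_sdnn", "resting_heart_rate", "sleep_hours"]  # placeholders

df = pd.read_csv("biometrics.csv")  # placeholder export of HealthKit samples
X_train, X_test, y_train, y_test = train_test_split(
    df[FEATURES], df["distress"], test_size=0.2, random_state=42
)

# Standardize the features, then fit the classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))

# Convert the fitted scikit-learn pipeline to a Core ML model for iOS.
mlmodel = ct.converters.sklearn.convert(model, FEATURES, "distress")
mlmodel.save("DistressClassifier.mlmodel")
```

On the iOS side, the saved .mlmodel would be added to the Xcode project and called through the Swift class Xcode generates for it.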

VR Music Visualizer

Audio-reactive 3D environments with gesture input. Quest 2 app built with Unity and Meta XR SDK.

XR • Unity • VR
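
For context on what "audio-reactive" involves, here is a small illustrative Python sketch of the analysis that typically drives such a scene: FFT a block of samples, bucket the spectrum into bands, and smooth the band energies before mapping them to scale or color. The real app does this in Unity/C#; the block size, band count, and smoothing factor below are arbitrary.

```python
# Illustrative sketch of the audio analysis behind an audio-reactive scene:
# FFT a block of samples, bucket the spectrum into bands, and smooth the
# band energies so they can drive object scale/color without flickering.
# Block size, band count, and smoothing factor are arbitrary choices here.
import numpy as np

BLOCK_SIZE = 1024
NUM_BANDS = 8
SMOOTHING = 0.8          # exponential smoothing toward the newest value

_smoothed = np.zeros(NUM_BANDS)

def band_energies(samples: np.ndarray) -> np.ndarray:
    """Return smoothed per-band energies in [0, 1] for one audio block."""
    global _smoothed
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    bands = np.array_split(spectrum, NUM_BANDS)
    energies = np.array([b.mean() for b in bands])
    energies = energies / (energies.max() + 1e-9)      # normalize per block
    _smoothed = SMOOTHING * _smoothed + (1 - SMOOTHING) * energies
    return _smoothed

# Example: fake audio block; map the low band to an object's scale.
block = np.random.randn(BLOCK_SIZE)
scale = 1.0 + band_energies(block)[0]
```

In Unity the equivalent spectrum typically comes from AudioSource.GetSpectrumData, with the smoothed band values mapped onto transform scale or material emission each frame.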

XR Pain Perception

CMU research: Multimodal XR prototypes studying how visual, auditory, and ambient cues influence pain perception.

XR Research • HCI

Experience

My journey in tech and research

  1. Project Manager & Engineer — Amazon Music Capstone

    Spring 2026 – Present

    CMU MHCI capstone project partnering with Amazon Music to explore and enhance the music discovery experience.

    UX Research • ML • Product
  2. AI Research Engineer — Xbox

    Spring 2026 – Present

    Xbox × CMU Measuring Social: a research collaboration exploring AI-driven approaches to measuring and understanding social interactions in gaming environments.

    AI Research • Gaming
  3. Research Assistant — CMU Augmented Perception Lab

    Sep 2025 – Present

    Building multimodal XR prototypes to study pain perception. Unity/C# development with structured logging for future ML-driven personalization.

    XR Research • HCI
  4. Applied ML Intern — PlayStation (SIE)

    Summer 2025

    Built an end-to-end pipeline for extracting gameplay scores from streaming video: parallel cloud ingestion with Ray, FFmpeg clipping, OpenCV frame processing, and Databricks workflows (PySpark, MLflow), making processing 60%+ faster (see the pipeline sketch after this list).

    ML • Data • Industry
  5. Co-founder / AI & Full-Stack Engineer — Applied STEM

    2024

    Built an AI-powered technical interview platform with a React/TypeScript canvas and a FastAPI backend, integrating LLMs with circuit topology to drive an adaptive technical interviewer.

    ML • Hardware • Product
  6. AI/VR Research Assistant — UTSA

    Aug 2023 – Dec 2024

    Research on AI-empowered VR content analysis addressing harassment and safety issues. Built Unity/C# prototypes and explored supervised learning/LLM methods for behavior classification.

    Research • XR • ML
  7. Apple NACME AIML Intensive — USC

    Jun – Aug 2024

    8-week intensive covering EDA, feature engineering, model training/evaluation, deployment. Completed 35 projects including Health ML (Assuage), image classification, and CoreML deployment.

    ML • iOS • Health
  8. Google Software Product Sprint

    Summer 2022

    Co-built Talky Talky, an audio-responsive web app supporting non-verbal children. Integrated Google Cloud Text-to-Speech APIs.

    Product • Web
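
To make the PlayStation pipeline shape concrete (item 4 above), here is a minimal, hypothetical Python sketch of the ingest-clip-process pattern: Ray fans out one task per video, FFmpeg cuts a clip, and OpenCV samples frames for a downstream score-extraction step. The URLs, timestamps, and the extract_score stub are placeholders, not SIE code.

```python
# Hypothetical sketch of a parallel video-processing pipeline:
# Ray distributes one task per video; each task clips with FFmpeg and
# samples frames with OpenCV for a downstream score-extraction model.
# URLs, timestamps, and the extract_score stub are placeholders.
import os
import subprocess
import tempfile

import cv2
import ray

ray.init()

def extract_score(frame):
    """Placeholder for the actual scoreboard-detection / OCR model."""
    return None

@ray.remote
def process_video(url, start="00:00:30", duration="00:00:10"):
    clip_path = os.path.join(tempfile.mkdtemp(), "clip.mp4")
    # Cut a short clip without re-encoding (fast, keyframe-aligned).
    subprocess.run(
        ["ffmpeg", "-y", "-ss", start, "-i", url, "-t", duration, "-c", "copy", clip_path],
        check=True,
    )
    cap = cv2.VideoCapture(clip_path)
    scores, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % 30 == 0:          # sample roughly one frame per second
            scores.append(extract_score(frame))
        frame_idx += 1
    cap.release()
    return url, scores

urls = ["https://example.com/stream1.mp4", "https://example.com/stream2.mp4"]
results = ray.get([process_video.remote(u) for u in urls])
```

In a real deployment the per-video results would land in Databricks tables and the scoring model would be tracked with MLflow; the sketch only shows the fan-out, clip, and sample shape of the pipeline.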

Education

Academic background

Carnegie Mellon University

Master of Human-Computer Interaction

Current

HCI • XR • ML

University of Texas at San Antonio

B.S. Computer Engineering

Minor in Computer Science

Software • Hardware • AI

University of Southern California

Apple NACME AIML Intensive

8-week bootcamp • 2024

35 Projects • Deep Learning

Alief Early College High School

Skills & Technologies

Technologies I work with

Machine Learning / CV

Python • PyTorch • scikit-learn • OpenCV • TensorFlow

Data Pipelines

Ray • Databricks • PySpark • MLflow • FFmpeg

XR / Unity

Unity (C#) • Meta XR SDK • VR Prototyping • AR

Mobile / Cloud

Swift (iOS) • Core ML • HealthKit • AWS • GCP

Web / Full-Stack

React • TypeScript • FastAPI • JavaScript

Hardware / Embedded

Microcontrollers • Sensor & Signal Processing • ESP32 • Sensors

Research Methods

User-Centered Research • UX Evaluation • HCI Research • Experimental Design • Data Collection • Qualitative Analysis

Awards & Recognition

Achievements and honors

1st Place — UTSA Tech Symposium

Magic Mitts haptic VR glove project won top prize out of 89 teams. Recognized for innovation in affordable haptics and human-centered design.

Apple NACME AIML Intensive

Selected for an intensive 8-week program at USC. Completed 35 projects covering ML fundamentals, deep learning, and iOS deployment.

Scholarships

Michael and Susan Dell Scholar, Horatio Alger Association Texas Scholarship, Apple Scholar Program, UTSA ECE scholarships.