Hey there! 👋🏾
I am a creative technologist and interdisciplinary engineer working across AI/ML, XR, and software systems. I build end-to-end pipelines and interactive prototypes, move fast from idea to working demo, and enjoy turning messy inputs (video, sensors, user interaction) into tools that are useful, meaningful, and actually work. I learn quickly, adapt when things get hard, and build whatever is needed to get the job done. MHCI @ CMU.
A selection of my most impactful work
Magic Mitts: 1st Place, UTSA Tech Symposium • Affordable haptic VR glove with electromagnetic braking. 18% latency reduction, 24% comfort improvement.
Modular pipeline converting unstructured video, audio, and sensor data into structured, timestamped events. Production-ready processing with Ray, FFmpeg, and MLflow (a sketch of this pattern follows this list).
UX research exploring how listeners perceive AI-generated music and build trust through transparent labeling.
Distress prediction from HealthKit biometrics using logistic regression. 82% test accuracy with on-device CoreML inference (see the modeling sketch after this list).
Audio-reactive 3D environments with gesture input. Quest 2 app built with Unity and Meta XR SDK.
CMU research: Multimodal XR prototypes studying how visual, auditory, and ambient cues influence pain perception.
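The video-to-events pipeline card above is built around Ray and FFmpeg. A minimal sketch of that fan-out pattern, assuming an illustrative list of clip boundaries; the function name, file paths, and labels here are hypothetical, not the production code:

```python
import subprocess
import ray

ray.init()  # connect to the local or cluster Ray runtime

@ray.remote
def clip_to_event(source_path: str, start_s: float, duration_s: float, label: str) -> dict:
    """Cut one segment with FFmpeg and return it as a timestamped event record."""
    out_path = f"{source_path.rsplit('.', 1)[0]}_{int(start_s)}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-ss", str(start_s), "-i", source_path,
         "-t", str(duration_s), "-c", "copy", out_path],
        check=True,
    )
    return {"source": source_path, "start_s": start_s,
            "duration_s": duration_s, "label": label, "clip": out_path}

# Hypothetical work list; in practice this would come from cloud storage listings.
segments = [("match_01.mp4", 12.0, 8.0, "round_start"),
            ("match_01.mp4", 95.5, 6.0, "score_change")]

# Fan the clipping jobs out across the cluster and gather the event records.
events = ray.get([clip_to_event.remote(*seg) for seg in segments])
print(events)
```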
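The distress-prediction card above names logistic regression over HealthKit biometrics. A toy sketch of that modeling step on synthetic heart-rate and HRV features; the real feature extraction and the CoreML conversion are omitted:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for HealthKit-derived features: heart rate (bpm) and HRV (ms).
rng = np.random.default_rng(0)
heart_rate = rng.normal(75, 12, 500)
hrv_ms = rng.normal(55, 15, 500)
X = np.column_stack([heart_rate, hrv_ms])
# Toy labeling rule: elevated heart rate with low HRV is treated as "distress".
y = ((heart_rate > 85) & (hrv_ms < 45)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
# A trained model like this could then be converted with coremltools for on-device inference.
```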
My journey in tech and research
Spring 2026 – Present
CMU MHCI capstone project partnering with Amazon Music to explore and enhance the music discovery experience.
Spring 2026 – Present
Xbox x CMU Measuring Social: Research collaboration with Xbox exploring AI-driven approaches to measure and understand social interactions in gaming environments.
Sep 2025 – Present
Building multimodal XR prototypes to study pain perception. Unity/C# development with structured logging for future ML-driven personalization.
Summer 2025
Built an end-to-end pipeline for extracting gameplay scores from streaming video: parallel cloud ingestion with Ray, FFmpeg clipping, OpenCV frame processing, and Databricks workflows (PySpark, MLflow). 60%+ faster processing.
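A rough sketch of the OpenCV stage described above: sample frames from a clip at a fixed interval and crop a scoreboard region for downstream recognition. The clip name and region coordinates are placeholders, not values from the actual pipeline.

```python
import cv2

def sample_score_crops(clip_path: str, every_n_seconds: float = 2.0,
                       region=(50, 20, 200, 60)):
    """Yield (timestamp_s, cropped_frame) pairs for a hypothetical scoreboard region."""
    x, y, w, h = region
    cap = cv2.VideoCapture(clip_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = int(fps * every_n_seconds)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % step == 0:
            yield frame_idx / fps, frame[y:y + h, x:x + w]
        frame_idx += 1
    cap.release()

for ts, crop in sample_score_crops("gameplay_clip.mp4"):
    # In the real pipeline, crops would feed score recognition and results would log to MLflow.
    print(f"t={ts:.1f}s crop shape={crop.shape}")
```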
2024
Built an AI-powered technical interview platform with a React/TypeScript circuit canvas and a FastAPI backend. Integrated LLMs with the canvas's circuit topology to drive an adaptive technical interviewer.
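A minimal sketch of how a FastAPI backend like this one might expose an adaptive follow-up endpoint; the route, request/response models, and the stubbed next_question helper are hypothetical stand-ins for the platform's actual API and LLM call.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class AnswerIn(BaseModel):
    session_id: str
    circuit_json: dict   # serialized circuit topology from the React/TypeScript canvas
    answer_text: str

class QuestionOut(BaseModel):
    question: str

def next_question(circuit_json: dict, answer_text: str) -> str:
    # Placeholder for the LLM call that conditions on the circuit topology.
    return "Walk me through how current divides at the node you just added."

@app.post("/interview/next", response_model=QuestionOut)
def interview_next(payload: AnswerIn) -> QuestionOut:
    """Return an adaptive follow-up question for the candidate's latest answer."""
    return QuestionOut(question=next_question(payload.circuit_json, payload.answer_text))
```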
Aug 2023 – Dec 2024
Research on AI-empowered VR content analysis addressing harassment and safety issues. Built Unity/C# prototypes and explored supervised learning/LLM methods for behavior classification.
Jun – Aug 2024
An 8-week intensive covering exploratory data analysis, feature engineering, model training/evaluation, and deployment. Completed 35 projects including Health ML (Assuage), image classification, and CoreML deployment.
Summer 2022
Co-built Talky Talky, an audio-responsive web app supporting non-verbal children. Integrated Google Cloud Text-to-Speech APIs.
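For reference, a minimal synthesis call with the Google Cloud Text-to-Speech client library looks roughly like this; the phrase, voice selection, and output path are illustrative:

```python
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()
response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(text="I would like a snack, please."),
    voice=texttospeech.VoiceSelectionParams(language_code="en-US"),
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.MP3),
)

# Write the synthesized speech so the web client can play it back.
with open("phrase.mp3", "wb") as out_file:
    out_file.write(response.audio_content)
```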
Academic background
Master of Human-Computer Interaction, Carnegie Mellon University
Current
B.S. Computer Engineering, The University of Texas at San Antonio
Minor in Computer Science
Apple NACME AIML Intensive
8-week bootcamp • 2024
Technologies I work with
Unity/C#, Python (Ray, OpenCV, PySpark, MLflow, FastAPI), React/TypeScript, Swift with HealthKit and CoreML, FFmpeg, and the Meta XR SDK.
Achievements and honors
Magic Mitts haptic VR glove project won top prize out of 89 teams. Recognized for innovation in affordable haptics and human-centered design.
Selected for intensive 8-week program at USC. Completed 35 projects covering ML fundamentals, deep learning, and iOS deployment.
Michael and Susan Dell Scholar, Horatio Alger Association Texas Scholarship, Apple Scholar Program, UTSA ECE scholarships.