DMA: Dancing with Music Accompaniment
A real-time mixed-reality system that synchronizes live music with intelligent, expressive virtual dance.

GPT-driven Dance Gesture Generation
DMA leverages large generative models to translate musical cues into expressive dance gestures, producing synchronized, stylistically consistent movement for virtual performers (a minimal pipeline sketch follows the list below).
- Converts audio, MIDI, or score features into coherent dance gestures
- Models rhythm, intensity, and phrasing for nuanced movement
- Supports real-time generation for live music performance
- Optimized for Unity-based avatar rigs and animation pipelines
- Built for research on music–movement interaction and choreography
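
The sketch below illustrates the first stage of this idea: extracting rhythm and intensity cues from audio and mapping them to gesture cues per beat. It assumes librosa for feature extraction, and `generate_gesture` is a hypothetical stand-in for DMA's generative backend rather than its actual API.

```python
# A minimal sketch of the music-to-gesture stage, assuming librosa for
# feature extraction. generate_gesture is a hypothetical stand-in for
# DMA's generative backend, not its actual API.
import librosa
import numpy as np

def extract_music_features(audio_path: str) -> dict:
    """Extract the rhythm and intensity cues that condition gesture generation."""
    y, sr = librosa.load(audio_path, sr=22050)
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    return {
        "tempo_bpm": float(np.atleast_1d(tempo)[0]),
        "beat_times": librosa.frames_to_time(beat_frames, sr=sr),
        "intensity": float(np.mean(onset_env)),  # coarse energy proxy for phrasing
    }

def generate_gesture(features: dict) -> list[str]:
    """Hypothetical placeholder: the real system would condition a generative
    model on these features and decode pose/gesture tokens per beat."""
    style = "energetic" if features["intensity"] > 2.0 else "flowing"
    return [f"{style}_step" for _ in features["beat_times"]]

if __name__ == "__main__":
    feats = extract_music_features("performance.wav")
    print(f"{feats['tempo_bpm']:.1f} BPM ->", generate_gesture(feats)[:8])
```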

Real-Time Control & Synchronization
DMA provides a robust real-time control layer that aligns virtual dance movement with live music performance, supporting adaptive synchronization in interactive settings; a sketch of the idea follows the list below.
- Integrates score-following and tempo-tracking modules
- Streams control messages over UDP to XR devices and Unity applications
- Adaptive motion retiming to match the performer's expressive timing
- Low-latency design suitable for stage and mixed-reality shows
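
As a rough illustration of this control layer, the sketch below streams gesture cues to a Unity listener over UDP and retimes them against a live tempo estimate. The address, port, and JSON message schema are assumptions, not DMA's actual wire format.

```python
# A minimal sketch of the sync layer, assuming a Unity-side UDP listener on
# 127.0.0.1:9000. The JSON message schema and port are assumptions, not
# DMA's actual wire format.
import json
import socket
import time

UNITY_ADDR = ("127.0.0.1", 9000)  # assumed Unity UDP listener

def retime(t_authored: float, live_bpm: float, ref_bpm: float = 120.0) -> float:
    """Scale a timestamp authored at ref_bpm so the choreography follows
    the performer's detected tempo (faster playing -> earlier cues)."""
    return t_authored * (ref_bpm / live_bpm)

def stream_gestures(gestures: list[tuple[str, float]], live_bpm: float) -> None:
    """Send each (gesture_name, authored_time) cue over UDP at its retimed moment."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.monotonic()
    for name, t_authored in gestures:
        target = retime(t_authored, live_bpm)
        delay = target - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)  # wait until the retimed cue point
        msg = {"gesture": name, "t": target, "bpm": live_bpm}
        sock.sendto(json.dumps(msg).encode("utf-8"), UNITY_ADDR)

if __name__ == "__main__":
    stream_gestures([("flowing_step", 0.0), ("energetic_step", 0.5)], live_bpm=132.0)
```

Scaling authored timestamps by `ref_bpm / live_bpm` is the simplest possible retiming rule; a stage-ready system would smooth the tempo estimates coming from the score follower to avoid visible jitter.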

Immersive XR Experience
Built for modern XR platforms, DMA renders responsive, performance-ready virtual dancers that blend seamlessly with real-world stages and mixed-reality environments.
- Optimized for Meta Quest 3 and the Unity XR Interaction Toolkit
- High-fidelity avatar animation with lighting & environment support
- Supports MR overlays for co-performance with real musicians
- Configurable pipelines for concerts, installations, and research demos (see the sample configuration below)
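
To make the configurable-pipeline point concrete, here is a hypothetical configuration sketch; every field name and default is illustrative rather than DMA's actual schema.

```python
# An illustrative pipeline configuration; every field name and default here
# is hypothetical, not DMA's actual schema.
from dataclasses import dataclass

@dataclass
class DMAPipelineConfig:
    scenario: str = "concert"        # "concert", "installation", or "demo"
    headset: str = "quest3"          # target XR device
    mr_passthrough: bool = True      # overlay virtual dancers on the real stage
    avatar_count: int = 1
    control_port: int = 9000         # UDP port for the real-time sync layer
    tempo_source: str = "live"       # "live" tracking vs. fixed "score" tempo

# A research demo might run fully virtual with a fixed score tempo:
demo = DMAPipelineConfig(scenario="demo", mr_passthrough=False, tempo_source="score")
```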