Shanghai, China, Fall 2018, Research and Art Project
This research project seeks to answer two questions:
Can a computer be taught to recognize "expert motion" in a wide variety of disciplines, such as dance, calligraphy, and musical performance? If so, what are the applications of this data and how might it be used to expand access and availability?
How can a Mixed Reality headset such as the Magic Leap be used to train, and give feedback to, students of the disciplines above?
Shenzhen, China, Fall 2017, Global CRE8 Summit
An immersive outer-space VR theater experience
Using m3diate’s networked virtual reality technology, audience members can see, hear, touch, and interact with friends in the room and with participants anywhere in the world in real time as they traverse the wonders of the cosmos with a live motion-captured guide.
//Mediums:
theater
live music
//Technologies:
networked VR
real-time projection mapping
optical motion capture
spatial reactive audio
New York, New York, Spring 2017, Tribeca Film Festival
HamletVR is an interactive virtual reality theater experience. Audience members at the Tribeca Film Festival and around the world took part in the world's first live VR theater production: the motion and voices of live actors were streamed into a networked virtual world, where participants could watch and explore Shakespeare from an entirely new point of immersion.
//Mediums:
theater
//Technologies:
networked VR
networked optical motion capture
spatial audio
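Streaming a live actor's motion into a shared virtual world comes down to serializing joint positions into small datagrams that every client can decode. The sketch below shows one way this could look over UDP; the packet layout and field names are assumptions for illustration, not HamletVR's actual protocol.

```python
import struct

# Hypothetical packet layout (big-endian):
#   uint16 actor_id, uint16 joint_count,
#   then per joint: 3 x float32 (x, y, z) in meters.

def pack_pose(actor_id, joints):
    """Serialize one actor's joint positions into a UDP-sized packet."""
    header = struct.pack(">HH", actor_id, len(joints))
    body = b"".join(struct.pack(">fff", x, y, z) for (x, y, z) in joints)
    return header + body

def unpack_pose(packet):
    """Decode a packet back into (actor_id, [(x, y, z), ...])."""
    actor_id, n = struct.unpack_from(">HH", packet, 0)
    joints = [struct.unpack_from(">fff", packet, 4 + 12 * i) for i in range(n)]
    return actor_id, joints

# Sending side (sketch):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(pack_pose(1, skeleton_joints), ("viewer.example", 7000))
```

A fixed binary layout like this keeps per-frame packets small enough to send at motion-capture frame rates without fragmentation.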
Shanghai, China, Fall 2017, International Computer Music Conference
Project Neuron was a collaboration with friends at the Shanghai Conservatory of Music. We began with the conference theme, hearing the self, and imagined visualizing the entire neurological process involved in hearing a sound. We also wanted to explore the fact that everyone hears themselves differently than others hear them. The result was a five-minute live dance and motion-capture performance in which each dancer could influence both the visuals and the music.
//Mediums:
dance
music
motion graphics
//Technologies:
optical motion tracking
projection
OSC and UDP networked audio interfaces
game engines
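Driving audio software from motion data, as listed above, typically means encoding control values as OSC messages sent over UDP. The sketch below builds a raw OSC message with the standard library; the `/dancer/1/pos` address and port are hypothetical examples, not the addresses used in Project Neuron.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a 4-byte boundary.
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))
    # Type tag string: a comma followed by one 'f' per float argument.
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for v in floats:
        msg += struct.pack(">f", v)  # OSC numbers are big-endian
    return msg

# Sending side (sketch): one dancer's position to an audio host.
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/dancer/1/pos", 1.0, 0.5, -2.0), ("127.0.0.1", 9000))
```

Because OSC rides on plain UDP, the same packets can fan out to a game engine for visuals and an audio interface for sound simultaneously.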
Berlin, Germany, Summer 2016, Tech Open Air
Bongos is a virtual reality audio production experiment in which the user controls and experiences an Ableton Live mix where the sound of each track is spatialized according to the position of objects in the space relative to the listener.
//Mediums:
music
//Technologies:
virtual reality
motion controllers
spatialized HRTF audio
OSC and UDP networking