Virtual Touch, Virtually Dancing - Goldsmiths Mocap Streamer Showcase Performance
Supported by the AHRC, within their ‘Ideas that address Covid19’ fund, and by the Goldsmiths ‘Future of Media’ research theme, this immersive showcase features three remote performers dancing together in real time via the mocap streamer, within a colourful virtual landscape, with avatars made of geometric shapes, lights and particles.
The second half of this showcase includes Q&As with Dan Strutt, Neal Coghlan and Clemence Debaig about the creative and technical development of the tool and the assets created for the showcase.
Item Type | Performance |
---|---|
Additional Information | Please note that the slight jerkiness and sync issues were due to our YouTube live stream, not the data stream (which is perfect). Supported by the AHRC, within their ‘Ideas that address Covid19’ fund, and by the Goldsmiths ‘Future of Media’ research theme. Principal Investigator: Dr Dan Strutt. Creative direction and virtual environments by Neal Coghlan (Studio Aszyk) and Clemence Debaig. Software development by Oliver Winks at Paper Plane Software. Research assistant: Friendred Peng. Visuals crafted in Unity, with motion capture using the Perception Neuron V2 by Noitom Technology Ltd., with technical support from our project partner Target3D. Related URLs: https://www.mocapstreamer.live/virtual-touch-virtually-dancing, https://www.gold.ac.uk/calendar/?id=13593 |
Keywords | digital dance, motion capture, streaming |
Departments, Centres and Research Units | Media and Communications |
Date Deposited | 12 Aug 2022 08:57 |
Last Modified | 12 Aug 2022 08:57 |