flute_prosthesis (two studio recordings)
Two comparative studio recordings of an 'on-the-fly' musical collaboration between human and machine. The musician's improvisation is encoded as statistical behaviours that the computer assimilates through real-time training. The system maps its judgements onto a library of sonic materials and stochastic behaviours (the only 'composed' element of the performance). Recurring aspects of the player's performance can then be recognised and 'appraised' by the computer, which adapts to changes in the improvisation as it learns more about the player's behaviour. The machine thus expresses its recognition of, and creative response to, the player by developing and modifying its own musical output, just as another player might; both 'musicians' adapt to each other as the performance develops. This version uses a more efficient, feature-space model for machine learning. Coding is in Max/MSP.
The metaphor of the prosthesis, rather than of conversation, has currency in debates about user–computer interaction; in this performance there is a mutually prosthetic relationship between the collaborators, in both sound material and quasi-intentional behaviour.
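The description above outlines the system's architecture: player features are assimilated by real-time training, recurring behaviours are recognised, and each recognition is mapped to a stochastic choice from a composed library of materials. The actual implementation is in Max/MSP; the following Python sketch is purely illustrative of that loop, using a simple online-clustering scheme as a stand-in for the feature-space model (all class and function names here are hypothetical, not taken from the work itself).

```python
import math
import random

class FeatureSpaceListener:
    """Illustrative online learner: assimilates feature vectors from the
    player (e.g. pitch, loudness, onset density) and clusters them
    incrementally, so that recurring behaviours can be recognised as the
    improvisation unfolds."""

    def __init__(self, n_behaviours=4, learning_rate=0.1):
        self.centroids = []              # learned behaviour prototypes
        self.n_behaviours = n_behaviours
        self.learning_rate = learning_rate

    def observe(self, features):
        """Assimilate one feature vector; return the index of the
        behaviour it was recognised as."""
        if len(self.centroids) < self.n_behaviours:
            # still 'meeting' the player: adopt early observations directly
            self.centroids.append(list(features))
            return len(self.centroids) - 1
        # recognise: find the nearest learned behaviour
        idx = min(range(len(self.centroids)),
                  key=lambda i: self._dist(self.centroids[i], features))
        # adapt: nudge that prototype toward the new observation,
        # so the model tracks changes in the improvisation over time
        c = self.centroids[idx]
        for j in range(len(c)):
            c[j] += self.learning_rate * (features[j] - c[j])
        return idx

    @staticmethod
    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def respond(behaviour_idx, palette):
    """Map a recognised behaviour to a stochastic choice from a composed
    library of sonic materials (the palette)."""
    return random.choice(palette[behaviour_idx])
```

In use, each analysed frame of the flute signal would pass through `observe`, and the returned behaviour index would drive `respond`; because the prototypes keep adapting, the same gesture can elicit evolving responses as the performance develops, which is the mutual-adaptation idea described above.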
| Item Type | Audio |
|---|---|
| Keywords | interactive performance systems, music, improvisation, artificial intelligence |
| Subjects | Mathematical and Computer Sciences > Artificial Intelligence; Mathematical and Computer Sciences > Machine Learning; Creative Arts and Design > Interactive and Electronic Design; Creative Arts and Design > Music; Creative Arts and Design > Types of Music |
| Departments, Centres and Research Units | Music; Music > Unit for Sound Practice Research |
| Date Deposited | 15 Mar 2013 09:33 |
| Last Modified | 06 Apr 2013 15:33 |
- Audio file: take1.mp3
- Subject: Presentation
- Licence: Available under Creative Commons: Attribution-NonCommercial-No Derivative Works 3.0