piano_prosthesis (live performance)

audio_file: pno_prosthBMIC.mp3 (Published Version)
Available under Creative Commons: Attribution-NonCommercial-No Derivative Works 3.0
This is an ‘on-the-fly’ musical collaboration between human and machine. The pianist’s improvisation is encoded as statistical behaviours that the computer assimilates by training, in real time, a multilayer perceptron neural network (a model of human learning). The computer’s learning is not merely accrued data; it is expressed in turn as music: the network maps its judgements onto a library of sonic materials and stochastic behaviours (the only ‘composed’ element of the performance). Recurring aspects of the player’s performance can then be recognised and ‘appraised’ by the network, which adapts to changes in the improvisation as it learns more about the player’s behaviour. The machine thus expresses its recognition of, and creative response to, the player by developing and modifying its own musical output, just as another player might. Both ‘musicians’ adapt to each other as the performance develops. The pianist in this performance is Michael Young.
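The record does not include the system’s code; the following is a minimal sketch of the idea described above, assuming the improvisation is reduced to short statistical feature vectors and that the network’s output is read as a weighting over a library of sonic materials. All names, dimensions, and the training targets are illustrative assumptions, not the author’s implementation.

```python
# Hedged sketch: an MLP trained on-the-fly on feature vectors summarising the
# pianist's recent playing, whose softmax output is read as a weighting over a
# hypothetical library of sonic materials. Dimensions and names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 8       # e.g. note density, mean pitch, dynamic level, register spread ...
N_MATERIALS = 5      # size of the (composed) library of sonic materials
N_HIDDEN = 16
LEARNING_RATE = 0.05

# One hidden layer with tanh units, softmax output over the material library.
W1 = rng.normal(0, 0.1, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_MATERIALS))
b2 = np.zeros(N_MATERIALS)

def forward(x):
    """Return hidden activations and a probability weighting over materials."""
    h = np.tanh(x @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max())
    return h, p / p.sum()

def train_step(x, target):
    """One online gradient step: nudge the network towards `target`, a desired
    distribution over sonic materials for the observed playing behaviour."""
    global W1, b1, W2, b2
    h, p = forward(x)
    d_logits = p - target                   # softmax + cross-entropy gradient
    dW2, db2 = np.outer(h, d_logits), d_logits
    d_h = (W2 @ d_logits) * (1 - h ** 2)    # backpropagate through tanh
    dW1, db1 = np.outer(x, d_h), d_h
    W2 -= LEARNING_RATE * dW2; b2 -= LEARNING_RATE * db2
    W1 -= LEARNING_RATE * dW1; b1 -= LEARNING_RATE * db1

# Toy loop standing in for the real-time performance: each "frame" would be a
# statistical summary of recent playing; the output selects sonic material.
for frame in range(200):
    features = rng.random(N_FEATURES)                   # placeholder for real analysis
    target = np.eye(N_MATERIALS)[frame % N_MATERIALS]   # placeholder association
    train_step(features, target)
    _, weights = forward(features)
    material = int(np.argmax(weights))                  # index into the sonic library
```

In such a scheme the network adapts continuously during the performance, so recurring playing behaviours come to be associated with particular materials, while changes in the improvisation gradually shift the mapping.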
| Item Type | Performance |
|---|---|
| Keywords | live performance, interactive systems, artificial intelligence, music, improvisation |
| Subjects | Mathematical and Computer Sciences > Artificial Intelligence; Mathematical and Computer Sciences > Machine Learning; Creative Arts and Design > Interactive and Electronic Design; Creative Arts and Design > Music; Creative Arts and Design > Types of Music |
| Departments, Centres and Research Units | Music; Music > Unit for Sound Practice Research |
| Date Deposited | 15 Mar 2013 09:35 |
| Last Modified | 06 Apr 2013 15:33 |