Interactive Machine Learning for More Expressive Game Interactions
Videogame systems incorporate varied sensors to increase the range of player interactions and improve player experience. However, implementing robust recognisers for player actions with sensors presents significant challenges to developers. Further, sensor-based controls offer little player customisation compared to traditional input interfaces (gamepads, keyboards and joysticks). Past research on motion-driven music systems has successfully used interactive machine learning (IML) techniques to facilitate the development and customisation of sensor-based interfaces, both by developers and end users. However, existing standalone software tools for IML are not ideal for use in game development and distribution. In order to support more effective and flexible use of sensors by game developers and players, we developed an integrated IML solution for Unity3D in the form of a visual node system supporting classification, regression and time series analysis of sensor data.
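The abstract describes interactive machine learning (IML) for sensor-based game input: the player records a few labelled sensor examples, and the system classifies new readings into game actions. A minimal sketch of this idea, using a simple 1-nearest-neighbour classifier in Python — all names and the classifier choice are illustrative assumptions, not the poster's actual Unity3D node system:

```python
import math

def distance(a, b):
    """Euclidean distance between two sensor feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureClassifier:
    """Hypothetical IML-style classifier: players demonstrate examples,
    then new sensor readings are mapped to the nearest example's label."""

    def __init__(self):
        self.examples = []  # list of (sensor_vector, action_label)

    def record(self, sensor_vector, label):
        # Player demonstrates an action; store the labelled example.
        self.examples.append((list(sensor_vector), label))

    def classify(self, sensor_vector):
        # Return the action label of the closest recorded example.
        if not self.examples:
            return None
        nearest = min(self.examples, key=lambda e: distance(e[0], sensor_vector))
        return nearest[1]

# Example: a 2-axis tilt sensor customised by the player to two actions.
clf = GestureClassifier()
clf.record([0.9, 0.1], "jump")
clf.record([0.1, 0.9], "crouch")
print(clf.classify([0.8, 0.2]))  # -> jump
```

The appeal of the IML workflow for players is visible even in this toy: customising the control scheme is a matter of recording new demonstrations rather than reprogramming a recogniser.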
| Item Type | Conference or Workshop Item (Poster) |
|---|---|
| Additional Information | This research was supported by the Artist + Machine Intelligence. "© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works." |
| Departments, Centres and Research Units | Computing; Computing > Embodied AudioVisual Interaction Group (EAVI) |
| Date Deposited | 29 Jul 2019 13:20 |
| Last Modified | 10 Jun 2021 05:55 |
Explore Further

- http://ieee-cog.org/ (Organisation: IEEE Conference on Games)