Musicians spend a great deal of time practising their instrument and, as a result, develop a unique set of microgestures that defines their personal sound: their acoustic signature. The fluency they have built up over years on their instrument transfers to the digital domain when a virtual instrument can recognise and react to the player's already-learned gestures. Current interface developments such as the ROLI Seaboard and TouchKeys point to a clear need for digital instruments that better convey expressiveness; these interfaces achieve this goal by modifying the keyboard itself to accommodate physical ways of transforming microgestures into control parameters. Drawing on the artistic insight of pianists and composers, together with gesture-detection sensors and machine learning, this research, positioned between science and art, aims to achieve a high level of expressivity without physically altering the keyboard instrument.