Richard Deffer – YMYO
Within my Bachelor thesis I developed a wireless-sensing wearable interface, along with a Max for Live Toolbox, for sound interaction and playback control inside Ableton Live. The system lets the user interact with sound through muscle force, limb movement and pre-trained gestures. The body interface senses muscle activity as well as limb movement and is designed to be worn on the lower forearm. The Max for Live Toolbox integrates the body-sensing interface into Ableton Live and is heavily based on the MuBu library. The Toolbox provides Max for Live Devices for gesture training and recognition, parameter mapping, sound synthesis and playback control of Ableton Live.
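The train-then-recognize workflow of the gesture Devices can be sketched in a few lines. This is a deliberately simplified nearest-centroid scheme over short windows of sensor samples; MuBu itself offers far richer models (GMMs, HMMs, gesture following), so the function names and the feature choice here are purely illustrative assumptions.

```python
# Hedged sketch: template-based gesture classification over sensor windows.
# Each gesture example is a short window of (ax, ay, az) samples, reduced
# to a mean feature vector; recognition picks the nearest trained centroid.
# This is NOT MuBu's algorithm, only an illustration of the workflow.
from math import dist
from statistics import mean

def features(window):
    """Collapse a window of (ax, ay, az) samples into one mean vector."""
    return tuple(mean(axis) for axis in zip(*window))

def train(examples):
    """examples: {label: [window, ...]} -> {label: centroid}."""
    return {label: features([s for w in windows for s in w])
            for label, windows in examples.items()}

def recognize(model, window):
    """Return the trained label whose centroid is closest to the window."""
    f = features(window)
    return min(model, key=lambda label: dist(model[label], f))
```

In use, a performer would record a few windows per gesture (the "training" Device), build the model once, and then stream live windows through `recognize` to drive mappings.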

The written paper documents the whole project scope, ranging from related work, through concept and product design, to software development, testing and application of the prototype. In addition, a basic introduction to the field of human-computer interaction, different concepts of machine learning, and posture / gesture recognition is given. Moreover, the potential of embodied-interaction technology for artistic, educational and also therapeutic purposes is discussed with Professor Atau Tanaka of Goldsmiths, University of London.

The body interface is based on the BITalino R-IoT microcontroller and the MyoWare muscle sensor. It was inspired by state-of-the-art technologies such as the Myo armband and the Mi.Mu Gloves. The prototype is presented as a proof of concept, showing how different human biosignals can be integrated into a live performance and the creative audio-production workflow. After a first exploration phase with a small group of people (musicians, producers, artists and subjects from other departments), a showcase of the sound wearable – a sound and dance performance – was recorded.
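Boards of this kind typically stream their sensor data as OSC packets over WiFi, which the Max for Live side then unpacks. As a sketch of what arrives on the wire, the following minimal stdlib-only parser decodes an OSC message (address pattern, type tags, big-endian arguments). The address `/0/raw` and the three-float payload are assumptions for illustration; a real setup would follow the board's documentation or use a library such as python-osc.

```python
# Hedged sketch: decode a single OSC message as streamed by a WiFi sensor
# board. OSC strings are ASCII, null-terminated, padded to 4 bytes;
# numeric arguments are big-endian. Only 'f' and 'i' tags are handled.
import struct

def _padded_string(data, offset):
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # Skip the terminator and round the offset up to a 4-byte boundary.
    return s, (end + 4) & ~3

def parse_osc(packet):
    """Return (address, args) for one OSC message in `packet`."""
    address, offset = _padded_string(packet, 0)
    tags, offset = _padded_string(packet, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", packet, offset)[0])
            offset += 4
        elif tag == "i":
            args.append(struct.unpack_from(">i", packet, offset)[0])
            offset += 4
    return address, args
```

Each incoming packet would be parsed this way and its values routed to the mapping and recognition Devices.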

In conclusion, this technological instrument enables a new way of physical interaction and musical expression with sound and, beyond that, frees the user from a fixed workspace – without a loss of control over the workstation itself. The sound wearable is a suitable instrument for live and dance performances and can also be used as a controller within the audio-production workflow, e.g. for complex automation applications. Improvements and further software development are outlined (i.e. posture / gesture