Interested in purchasing a Module kit for some home studio DIY? Kits are available through the following link: Nanotopian.
Or sign up for one of our upcoming workshops at InterAccess or co:Lab!
Collaborate sonically with the non-human organisms around you. Next time you go on a picnic, bring along your Bio-sonification Module and listen to the sounds of the forest and how it responds to your presence. Then place the electrodes onto your friends, or hook up your houseplant and see how it really feels!
Students receive step-by-step guidance in assembling their own Bio-Sonification Module kit, along with help in understanding how the device works.
During the last half of the workshop, participants place electrodes onto plants, fungi, themselves, or each other and listen to their bio-data through Ableton Live, Animoog, Model 15, or similar digital music apps and analog synths, creating a Bio-Sonification Symphony!
Various plants, soil, and mycelium will be available during the workshop for participants to place their electrodes onto, along with iConnectivity MIDI-to-USB/Lightning connectors so participants can try out different instruments and software: Ableton, Animoog on iPad, Model 15 (on iPhone & iPad), Moog Mother-32s, the Make Noise 0-Coast, etc. Other possibilities are shown and discussed, for instance, Ableton Max/MSP MIDI video disruption!
Each participant receives their very own Bio-Sonification Module kit; a Module container file they can 3D print offsite, plus a lo-tech version we will put together during the workshop; and a set of electrodes with either electrode pads or electrode clips (participants can choose during the workshop). The Bio-Sonification Modules were created by Tosca Teran with the assistance of Manuel Domke and Sam Cusumano!
Prerequisite: basic soldering knowledge is very important. Here is a link to MightyOhm's soldering comic: fullsoldercomic_en. Some Ableton Live knowledge is helpful, but not necessary!
“Bio-sonification” basically means using technology to turn the bio-rhythms of natural objects into sound.
Biodata Sonification is a process that translates complex real-time sensor data into musical notes and controls, exploring the auditory sensory modality to provide insight into invisible phenomena.
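To give a feel for that translation step, here is a minimal Python sketch of one way sensor data can become notes. This is purely illustrative and not the Module's actual firmware: the sample values, change threshold, and pentatonic scale are all invented for the example.

```python
# Illustrative biodata-sonification sketch: quantize changes in a stream of
# sensor readings (e.g. galvanic conductance samples) to MIDI note numbers.
# All constants here are assumptions for the demo, not the Module's design.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def reading_to_note(delta, base_note=60, scale=PENTATONIC):
    """Map the magnitude of a sensor change to a note in the scale."""
    step = min(abs(delta) // 5, len(scale) * 2 - 1)  # clamp to two octaves
    octave, degree = divmod(step, len(scale))
    return base_note + 12 * octave + scale[degree]

def sonify(readings, threshold=3):
    """Emit a note only when consecutive readings change by more than threshold."""
    notes = []
    for prev, cur in zip(readings, readings[1:]):
        delta = cur - prev
        if abs(delta) > threshold:
            notes.append(reading_to_note(delta))
    return notes

# A fabricated stream of conductance samples: small jitter produces silence,
# larger fluctuations (a hand touching a leaf, say) produce notes.
samples = [512, 514, 530, 529, 505, 505, 560]
print(sonify(samples))  # → [67, 69, 81]
```

The note numbers would then be sent as MIDI note-on messages to a synth or to software like Ableton Live; the threshold is what keeps a resting organism quiet and makes responses to touch or presence audible.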