Fragments #43-44 was born out of a collaboration between Numediart and myself, around mid-2012.
At the time, Numediart was working on a system to classify data (sounds, images…) quickly and accurately, while I was looking for new instruments to create sounds and compose music (as always).
For years I had been working with a mix of acoustic / “classic” instruments (guitar, bass guitar…), digital devices (sampler, FX generator, computer…) and alternative “lutheries” (turntable, tape deck, reel-to-reel…).
At some point Numediart had a system that allowed its user to “visualize” sounds (a point and a waveform) and trigger them with the body (movement in a mapped space) while facing a Kinect camera.
Video shot by Laura Colmenares Guerra
I immediately felt that there was “something” there: an intriguing and interesting physical way to navigate through sounds. After several work sessions, I proposed some possible extensions and developments based on this idea, to transform this “virtual sound jukebox” into a complex music instrument and installation. The creation of the prototype version was an intense marathon and exchange of ideas between Christian Frisson (Numediart engineer) and myself, based on the sound data classification engine developed by Stéphane Dupont.
Transcultures (Interdisciplinary Center for Sound and Electronic Cultures, Mons) quickly showed interest, and the prototype (Masht’Cycle) was booked for the 2013 edition of City Sonic (Trans-disciplinary Sound Art, Digital Art and Electronic Music Festival in Mons).
After a few weeks of work, I felt that something was missing: strong visuals to replace the audio engine’s interface. François Zajéga (who was working for Numediart at the time) was the obvious choice; I really admired his visual work, his unique blend of pure digital landscapes and shapes with an intense organic feeling.
MashtCycle, with this incredible added value (the visuals), was shown successfully at the City Sonic Festival. At the time, I was able to trigger samples, make loops, and transform sounds via a series of effects (oops, I don’t remember which ones!) responding to a gesture grammar: a precise gesture had a variation or activation effect on the sounds.
The hypothesis and prototype were validated… Time to bring the system to the next level!
François Zajéga, visuals and data mapping – frankiezafe.org
Yacine Sebti, audio engine – imal.org/yacine/
Loïc Reboursière, sound analysis – http://www.tcts.fpms.ac.be/homepage.php?Firstname=Loic&Lastname=REBOURSIERE
Fabien Grisard (Numediart), gesture recognition – http://www.tcts.fpms.ac.be/homepage.php?Firstname=Fabien&Lastname=GRISARD
Christian Frisson (Numediart), early prototype manager – http://tcts.fpms.ac.be/~frisson/
Stéphane Dupont (Numediart), supervisor – http://tcts.fpms.ac.be/~dupont/miscelaneous.html
Special thanks to Thierry Dutoit (Numediart) – http://tcts.fpms.ac.be/~dutoit/
Producers: Gauthier Keyaerts, Numediart, Commission des Arts Numériques de la Fédération Wallonie-Bruxelles and Transcultures.