Martin Daigle (McGill) & Pauline Patie (UdeM)
March 15th, 2023
Abstract:
Our project sought to explore the capabilities of cutting-edge machine learning (ML) techniques for real-time sound analysis in the context of a new composition. Led by Martin Daigle, Pauline Patie, and Emmanuel Lacopo, the project used software such as Rodrigo Constanzo’s SP-Tools to provide real-time instrumental augmentation for percussion and guitar. The same real-time analysis was also used to control interactive visual projections created by Pauline.
Our main technical tool for generating and performing this piece was SP-Tools, which is built on the FluCoMa toolkit and designed to exploit large banks of sound samples, with a focus on drum augmentation. These tools, combined with drum triggers, microphones, a direct input from the guitars, and a marimba, were used to train the computer and generate a corpus of samples for each instrument, differentiated and organized by their audio descriptors (loudness, pitch, spectral shape, MFCCs, Mel bands).
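As a rough illustration of what "organized by their descriptors" can mean, the sketch below computes a toy descriptor vector for a single audio frame. The specific descriptors here (RMS loudness, a zero-crossing pitch proxy, and a spectral centroid) and all names are our own simplifications, not SP-Tools' or FluCoMa's actual analyses, which are considerably more robust:

```python
import math

def descriptors(samples, sr=48000):
    """Toy descriptor vector (loudness, pitch proxy, spectral centroid)
    for one audio frame. Illustrative only -- SP-Tools/FluCoMa use richer
    analyses such as MFCCs and Mel bands."""
    n = len(samples)
    # Loudness: root-mean-square amplitude of the frame.
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Crude pitch proxy: zero-crossing rate converted to Hz.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    zcr_hz = crossings * sr / (2 * n)
    # Spectral centroid from a naive DFT magnitude spectrum.
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    total = sum(mags) or 1.0
    centroid = sum(k * sr / n * m for k, m in enumerate(mags)) / total
    return (rms, zcr_hz, centroid)

# Build a tiny one-entry corpus from a synthetic 440 Hz sine frame.
frame = [math.sin(2 * math.pi * 440 * i / 48000) for i in range(256)]
corpus = {"marimba_hit_01": descriptors(frame)}
```

In a real corpus each sample would store one such vector per analysis frame, so that incoming audio can be matched against it by distance in descriptor space.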
Within a fraction of a millisecond, the active sound input was compared against the corpus, the closest matching sound was identified, and it was played back immediately at the same velocity. One of the first milestones was to create a corpus of sounds from all instruments involved and to provide examples, in the form of a performance video, of all possible combinations (e.g. the drum kit playing the guitar corpus, the guitar playing the marimba corpus, etc.). Moreover, the corpus was tested across a variety of musical genres, including metal, jazz, and hip-hop, to observe the range of changes to the descriptors.
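The matching step described above amounts to a nearest-neighbour lookup in descriptor space. The following is a minimal sketch under assumed, hypothetical descriptor vectors and sample names; it is not SP-Tools' implementation, which performs this with optimized structures such as KD-trees:

```python
import math

def nearest(corpus, query):
    """Return the name of the corpus entry whose descriptor vector is
    closest (Euclidean distance) to the live input's descriptors."""
    return min(corpus, key=lambda name: math.dist(corpus[name], query))

# Hypothetical per-sample descriptor vectors: (loudness, pitch Hz, centroid Hz).
corpus = {
    "kick_01":    (0.9, 60.0, 150.0),
    "snare_03":   (0.7, 200.0, 1800.0),
    "marimba_c4": (0.5, 261.6, 900.0),
}

live = (0.6, 250.0, 1000.0)    # descriptors measured from the incoming onset
match = nearest(corpus, live)  # -> "marimba_c4"
velocity = live[0]             # play the match back at the input's loudness
```

In practice the descriptor dimensions would be normalized first, since raw loudness, pitch, and spectral values live on very different scales and would otherwise dominate the distance unevenly.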
The main goal of this project was to generate a new piece that features and promotes the use of ML technology, so that we could perform this repertoire and demonstrate the power of these tools both online and in concert settings.