REACH: Raising co-crEAtivity in Cyber-Human musicianship
ERC Advanced Grant 2019
PI: Gérard Assayag
STMS Lab, Ircam, CNRS, Sorbonne Université
Digital cultures are increasingly interweaving human creativity with the autonomous computational capabilities of surrounding environments, shaping joint human-machine action into new forms of shared reality involving "symbiotic interactions". In the artistic, cultural, and educational fields, co-creativity between humans and machines will bring about the emergence of distributed information structures, creating new performative situations with mixed artificial and human agents. This will disrupt known cultural orders and significantly impact human development. Thanks to the computation of semantic structures from physical and human signals, combined with generative learning of symbolic representations, we are beginning to comprehend the dynamics of cooperation (or conflict) inherent in cyber-human bundles.

To this end, the REACH project aims to understand, model, and develop musical co-creativity between humans and machines through improvised interactions, allowing musicians of any level of training to develop their skills and expand their individual and social creative potential. Indeed, improvisation is at the very heart of human interaction, and music is a fertile ground for developing models and tools of creativity that can be generalized to other activities: in music, the constraints on cooperative behavior are among the strongest, as individual contributions must come together into highly integrated courses of action. REACH will study shared musicianship occurring at the intersection of the physical, human, and digital spheres as an archetype of distributed (natural/artificial) intelligence, and will produce models and tools as vehicles to better understand and foster human creativity in a context where it becomes ever more intertwined with computation.
Gérard Assayag, an IRCAM senior researcher (Directeur de recherche), founded and currently heads the Music Representation team (RepMus) in the STMS Lab. STMS (Sciences and Technologies of Music and Sound) is a joint research unit operated by IRCAM, the CNRS (Centre National de la Recherche Scientifique), and Sorbonne University in Paris, France. IRCAM (Institut de recherche et de coordination acoustique/musique), founded by the renowned conductor and composer Pierre Boulez, is the world's largest institution dedicated to both music creation and scientific research on sound and music. Assayag was Head of the STMS Lab from 2011 to 2017 and, as such, was involved in national and international research policies in computational and human music sciences, overseeing a staff of 125 (researchers, engineers, technicians, administrators, and PhD students).
Assayag is a co-founder of the Collegium Musicae, the Sorbonne University Institute for Music Sciences; he also participated in the launch of the Sorbonne Center for Artificial Intelligence (SCAI), where he contributed substantially to defining the Digital Humanities axis programme. He co-founded the French Society for Computer Music (AFIM) as well as the international learned Society for Mathematics and Computation in Music (SMCM) and its associated, first-of-its-kind, peer-reviewed Journal of Mathematics and Music (JMM), now a pillar of the music computing research community.
Through theoretical publications and widely used technologies (OpenMusic, OMax), Assayag has defined the concept of symbolic interaction to account for rich and versatile musical dialogue between machines and humans, traversing several levels and scales of information, from the acoustic signal to higher symbolic and cognitive structures. His conceptions are now evolving toward general co-creativity in cyber-human musicianship, aiming to reshape the next generation of human-machine artistic interaction.