McGill University and Eidos-Montreal have joined forces in a collaborative research project to record and implement spatial music in video games. The project, entitled Implementation of multiple synchronous Ambisonic recordings of music into a game engine, is funded through the Partnerships Program of the Office of Innovation and Partnerships, and supported by Quebec’s Ministère de l’Éducation et de l’Enseignement supérieur as a way to encourage collaborations between researchers and Quebec-based companies.
Led by Professors Wieslaw Woszczyk and Richard King at the Schulich School of Music, and Rob Bridgett at Eidos-Montreal, the project will demonstrate how spatial music can be implemented and experienced in interactive entertainment media. Key outputs expected from the collaboration will be a suite of tools, techniques and workflow for recording and playback support of multiple Ambisonic sound fields in a 3D game engine. The partners will also compose and integrate newly recorded musical content into a game environment and develop a game prototype displaying a highly interactive spatial presence of music.
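For readers unfamiliar with the format: a first-order Ambisonic (B-format) signal captures a full sphere of sound in four channels that can be rotated to track a player's viewpoint, which is what makes it attractive for game engines. The sketch below is an illustrative Python example, not code from the project, using the common AmbiX convention (ACN channel order W, Y, Z, X with SN3D normalization); it shows the basic per-sample encode and yaw-rotation steps an engine would perform.

```python
import math

def encode_fo_ambisonic(sample, azimuth, elevation):
    """Encode a mono sample into a first-order B-format frame
    (AmbiX/ACN order: W, Y, Z, X; SN3D normalization, angles in radians)."""
    w = sample
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    x = sample * math.cos(azimuth) * math.cos(elevation)
    return [w, y, z, x]

def rotate_yaw(bformat, yaw):
    """Rotate a first-order B-format frame about the vertical axis,
    e.g. to follow the listener's head or camera yaw in a game."""
    w, y, z, x = bformat
    cos_a, sin_a = math.cos(yaw), math.sin(yaw)
    # W (omni) and Z (height) are unchanged by a yaw rotation;
    # X and Y rotate together in the horizontal plane.
    x_r = x * cos_a - y * sin_a
    y_r = x * sin_a + y * cos_a
    return [w, y_r, z, x_r]
```

Rotating an encoded source by a yaw angle is equivalent to re-encoding it at the shifted azimuth, which is why a single rotation matrix applied to the B-format channels can reorient an entire recorded sound field at once.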
In celebration of this new project, we asked Professors Woszczyk and King, as well as their collaborators, a few questions via email.
Q: Why is this partnership with Eidos-Montreal so important?
WW: This is our first collaboration with a video game developer, a partnership long overdue given the huge importance of this industry. The ongoing convergence of media and computing technologies has created a user base that is discerning about virtual reality and experienced in interactive arts. Music has a great role to play in video games because it supports the narrative and emotional engagement of a player who is an active listener, able to change how music unfolds. Also, three-dimensional sound, our field of expertise, is strongly supported by game creators, game console manufacturers, and audio equipment makers. So, having new and sophisticated tools, we get an opportunity to study how music could be composed, performed, recorded, and rendered to fully reflect the immersive nature of sound and music in 3D media.
RK: Eidos is a major force in the industry, and a local operator here in Montreal. Their research interests are completely aligned with ours, in that we are all looking to improve interactive immersive musical experiences. Teaming up with Eidos, we have the opportunity to synchronize our music recordings with interactive video, in the context of a video game environment.
Q: How are Schulich’s new sound recording facilities valuable to your research?
WW: With the support of the university, government, and private individuals we have built an outstanding complex of audio research laboratories located in The Elizabeth Wirth Music Building of the Schulich School of Music. They reflect the state of the art in 3D sound capture, production and research, and are unprecedented in any academic environment. We are very proud of this achievement as it enables us to train top-notch practitioners and researchers ready to contribute to the industry.
RK: As our new labs come online, we have the ability to better investigate immersive capture including elevation, as the new rooms with their high ceilings are acoustically optimized for 3D recording and reproduction. Being members of CIRMMT (Centre for Interdisciplinary Research in Music Media and Technology), we also benefit from the unique research infrastructure developed with funding from CFI (Canada Foundation for Innovation), the enormous laboratory space known as the Multimedia Room.
Q: How new is the concept of immersive audio and how do you expect it will develop in the next 10 years?
WW: Our own participation in immersive audio dates back 30 years, with experiments in surround sound and, more recently, research on 22.2 channel sound in collaboration with the NHK STRL (the inventor of this 3D format). This includes the creation of immersive acoustic environments for music performance and recording. Immersive audio is the attribute of sound we perceive binaurally around us; it is native to our everyday auditory experience. Research and development of 3D audio technologies seeks to deliver better, more accurate, and simpler methods of recreating this natural phenomenon mediated by recording and reproduction.
RK: Well, it is safe to say that immersive audio began in the 1940s with the Disney film Fantasia! Of course, what we now call “3D sound” has only been an established commercial product since around 2005. More and more, industry and academic researchers are working to make the experience more accessible, even through regular headphone listening! The most important development we will see over the next few years will be a vast improvement in audio quality – especially for interactive experiences where the listener is moving through a virtual space, with or without picture.
The next two questions are presented to Rob Bridgett, Eidos-Montreal’s Senior Audio Director and co-leader of this research project.
Q: How do you see the impact of spatial music on video games in general?
RB: Audiences of video games are already fairly well used to 3D spaces and to navigating those spaces through both visual and sound cues, much more so than cinema audiences. The one area of sound that has not yet fully evolved beyond the single fixed listener position of theatre and cinema is music, and I think it is here that we can begin to create new experiences for an audience that is ready to accept, and be comfortable with, something as fundamentally different as being able to navigate inside a piece of music in 3D. The upcoming spatial audio for headphones from both Sony and Microsoft on their next-generation consoles also gives me confidence that the spatial music we are now able to record can find a mass market with a low barrier to entry (the cost of a console and headphones). I think the audience is there, as is the technology (which is always improving). What we still lack is the content, and the storytelling techniques and uses for this kind of 3D spatial musical experience. I am particularly interested in developing and employing non-diegetic 3D space in music, and applying that to narratives, stories and experiences in games. I see particular promise for these techniques in more abstract moments, such as dreams or flashbacks, which are now becoming playable rather than just linear cinematics.
Q: Do you see spatial music that is developed for games, going beyond games and into other areas?
RB: Yes. I especially see that game scores are becoming more and more popular as a genre, or category, of music in itself. Spatial music platforms, like Amazon Music and Tidal, are now emerging. Having native spatial music scores ready to be deployed on these services is a very exciting prospect. The emotions and the connections that players make between the music and their experiences in the game could become accessible outside of the game world, in a music-only context. I also think audiences could start to become more used to spatial music, and to movement within the non-diegetic space, which could in turn influence the use and scope of music in both theatre and cinema mixes.
Let’s turn our final questions to two other members of the McGill research team involved in this outstanding collaboration: Florian Grond, Research Associate, and Jack Kelly, PhD Candidate. Dr. Florian Grond is an interdisciplinary researcher working in multimodal participatory design in the context of disability, the arts, and assistive technology, with several years of experience in 3D sound recording and reproduction. Jack Kelly holds a Master of Music degree in sound recording from the Schulich School of Music and is currently completing a PhD dissertation in sound recording.
Q: This project is very interdisciplinary. What types of sectors does it bring together?
FG: Sound recording and mixing play a role in many fields, which are only partially connected in terms of tools, technology and aesthetics. Mixing classical music productions has different artistic goals than creating sound effects for video games, or recording with special microphone arrays for soundscape research. At the moment, we are observing these fields converge, a trend catalyzed by the current need to improve virtual collaborations and create remote yet immersive experiences. Novel technologies are further accelerating this convergence, and, in this context, we are excited to be able to work with the six-degrees-of-freedom capture and reproduction system of industry trendsetter ZYLIA. Their novel technology supports the artistic goals of this project.
Q: How will this collaboration benefit McGill’s student researchers?
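To illustrate what six-degrees-of-freedom playback of multiple synchronous recordings can involve, the sketch below blends time-aligned first-order Ambisonic frames from several recording positions using inverse-distance weights driven by the listener's position. This is a minimal illustrative example under an assumed weighting scheme, not the project's or ZYLIA's actual rendering algorithm.

```python
import math

def interpolate_fields(fields, positions, listener_pos, eps=1e-6):
    """Blend several time-synchronized first-order B-format frames
    (one per recording position) with inverse-distance weights, so the
    mix follows the listener as they move through the captured scene.
    `fields`: list of [W, Y, Z, X] frames; `positions`: matching (x, y, z) tuples."""
    # Weight each recorded field by the inverse of its distance to the listener;
    # eps avoids division by zero when the listener stands on a capture point.
    weights = [1.0 / (math.dist(p, listener_pos) + eps) for p in positions]
    total = sum(weights)
    mixed = [0.0, 0.0, 0.0, 0.0]
    for frame, w in zip(fields, weights):
        gain = w / total
        for i in range(4):
            mixed[i] += gain * frame[i]
    return mixed
```

As the listener approaches one recording position, its field dominates the mix; between positions, the fields crossfade smoothly, which is one simple way to give a navigable sense of "walking through" a recorded musical space.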
JK: This project is a unique learning opportunity for student researchers in the sound recording department. Immersive music production is uncharted territory in many ways. Doing it well requires a total re-imagining of the medium, and lots of trial-and-error. The tools and techniques that will be developed as a result of this collaboration will give students the ability to experiment with cutting-edge immersive recording and mixing technology. Given the push towards spatial audio and interactivity across all media sectors, this kind of ‘real world’ experience will be a huge asset on the job market.