University of Birmingham unveils platform for immersive virtual music collaborations

The JAMS platform allows musicians to take part in virtual concerts and practice sessions, and enhances music teaching by recreating the immersive experience of playing together in person.

The software creates a responsive avatar that plays in perfect synchrony with the music partner. (photo credit: ARME project)

The University of Birmingham announced the launch of the Joint Active Music Sessions (JAMS) platform, a virtual music collaboration tool designed to revolutionize the way musicians interact and perform together online.

The JAMS platform allows musicians to take part in virtual concerts and practice sessions, and enhances music teaching by recreating the immersive experience of playing together in person. Using avatars created by individual musicians and shared with fellow performers, JAMS delivers an interactive virtual environment through VR headsets, bringing the musician's world to life.

Dr. Massimiliano (Max) Di Luca from the University of Birmingham explained: "A musician records themselves and sends the video to another musician. The software creates a responsive avatar that plays in perfect synchrony with the music partner. All you need is an iPhone and a VR headset to bring musicians together for performance, practice, or teaching."

On the platform, musicians can interact to learn, connect, perform, develop new music, and create virtual concerts that reach larger audiences. It has the distinct flavor of a platform developed with and for musicians, whether they are established performers or just beginning to learn.

The avatars on the JAMS platform capture unspoken moments that are key in musical performance. They allow practice partners or performers to watch the tip of a violinist's bow or make eye contact at critical points in the piece. This real-time adaptability and dynamic responsiveness deliver a unique, personalized experience.

By keeping faces at eye level and providing an immersive backdrop with realistic rendering of other musicians and cues used in real-life settings, the JAMS platform adds to the feeling of connectedness among musicians. Importantly, there is no "latency" in the JAMS user experience.

Dr. Di Luca elaborated: "Latency is the delay between a sound's production and when it reaches the listener. Performers can start to feel the effects of latency as low as 10 milliseconds, throwing them 'off-beat', breaking their concentration, or distracting them from the technical aspects of playing."
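
To illustrate the scale of the problem, the short Python sketch below expresses network delays as a fraction of the gap between beats at common tempos. It is illustrative only: the 10-millisecond threshold comes from Dr. Di Luca's remark above, while the tempos and delay values are hypothetical.

    PERCEPTIBLE_LATENCY_MS = 10.0     # threshold quoted by Dr. Di Luca

    def beat_interval_ms(bpm: float) -> float:
        """Milliseconds between successive beats at a given tempo."""
        return 60_000.0 / bpm

    for bpm in (60, 120, 180):
        interval = beat_interval_ms(bpm)
        for latency_ms in (5, 10, 30, 100):
            share = 100.0 * latency_ms / interval   # delay as % of one beat
            flag = "  <- perceptible" if latency_ms >= PERCEPTIBLE_LATENCY_MS else ""
            print(f"{bpm:>3} bpm | {latency_ms:>3} ms delay = {share:5.1f}% of a beat{flag}")

Even a modest delay consumes a substantial slice of a beat at faster tempos, which is why pre-recorded, locally rendered avatars sidestep the problem that live audio streaming cannot.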

The JAMS platform is underpinned by an algorithm created during the Augmented Reality Music Ensemble (ARME) project, which captures the dynamic timing adjustments performers make to one another. The ARME project brought together researchers from six disciplines: psychology, computer science, engineering, music, sport science, and maths. Their combined input realized the vision of a computational model that precisely reproduces a musician's body movements, delivering an avatar that meets the needs of co-performers.
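
The article does not publish the algorithm itself. In sensorimotor-synchronization research, however, dynamic timing adjustments of this kind are commonly modelled with linear phase correction, in which each player shifts their next note onset by a fraction of the asynchrony they just heard. The sketch below is a minimal example of that general family of models, not the ARME implementation; the correction gains, starting offset, and noise level are all hypothetical.

    import random

    def duet(n_beats: int = 16, interval: float = 500.0,
             alpha_a: float = 0.25, alpha_b: float = 0.25,
             noise: float = 5.0) -> list[float]:
        """Simulate two players who each correct a fraction (alpha) of the
        asynchrony heard on the previous beat; returns asynchronies in ms."""
        t_a, t_b = 0.0, 20.0          # player B starts 20 ms behind player A
        asynchronies = []
        for _ in range(n_beats):
            gap = t_a - t_b           # negative: A sounded before B
            asynchronies.append(gap)
            # Each player schedules the next onset, pulling toward the partner.
            t_a += interval - alpha_a * gap + random.gauss(0.0, noise)
            t_b += interval + alpha_b * gap + random.gauss(0.0, noise)
        return asynchronies

    for beat, gap in enumerate(duet()):
        print(f"beat {beat:2d}: asynchrony {gap:+6.1f} ms")

With positive correction gains, the initial 20-millisecond offset decays toward zero within a few beats, mirroring how co-performers gradually lock into a shared tempo.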

"We're aiming to bring the magic of playing music in person to the virtual world. You can adapt the avatar that other people play with, or learn to play better through practice with a maestro," said Dr. Di Luca.

The article was written with the assistance of a news analysis system.