Real-Time AI Can Generate Lines for Live Instrumental Music

Researchers in the Natural Language Processing Lab at the University of Waterloo have developed a real-time artificial intelligence (AI) system that can generate lines for live instrumental music. The new system, called LyricJam, went live in June 2021 and has been tested by more than 1,500 users since then.

The team will present their research at the International Conference on Computational Creativity in September. 

The lab is led by Olga Vechtomova, an Engineering professor cross-appointed in Computer Science at the university. Vechtomova has been developing AI applications for years, and the lab's work initially led to a system that learns the musical expression of artists and generates lyrics in their style.

Vechtomova and Waterloo graduate students Gaurav Sahu and Dhruv Kumar also developed technology that draws on different musical components, such as chord progressions, tempo, and instrumentation, to synthesize lyrics that reflect the mood and emotions expressed by the live music.
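
To give a rough sense of the kind of musical signal such a system could condition on, the sketch below extracts tempo, chord-related chroma, and timbre features from an audio clip using the open-source librosa library. This is a minimal illustration under assumptions, not the team's actual code; the feature choices and the returned dictionary are placeholders.

```python
# A minimal sketch, not the LyricJam implementation: extract musical
# components (tempo, chord-related chroma, timbre) that a lyric
# generator could condition on. librosa is a real audio library.
import librosa
import numpy as np

def musical_features(path: str) -> dict:
    y, sr = librosa.load(path, sr=22050)                 # raw audio clip
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)       # rough tempo estimate
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)     # pitch-class energy, reflects chords
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)   # timbre / instrumentation cues
    return {
        "tempo": np.asarray(tempo).reshape(-1)[0],       # beats per minute
        "chroma_mean": chroma.mean(axis=1),              # average harmonic colour of the clip
        "mfcc_mean": mfcc.mean(axis=1),                  # average timbre of the clip
    }
```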

The Neural Network

The system continuously receives raw audio clips as a musician or band performs instrumental music. The neural network then processes that audio and generates new lines, which the artists can use to develop their song lyrics.
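
One way to picture this real-time loop is sketched below: audio is captured in short clips, converted to a spectrogram, and passed to a trained model that emits a candidate line. This is an assumption-laden illustration, not the lab's code; the `LyricModel` object and its `generate_line` method are hypothetical placeholders.

```python
# Minimal sketch of a real-time clip-to-line loop. sounddevice and
# librosa are real libraries; the model and its generate_line method
# are hypothetical placeholders, not LyricJam's API.
import sounddevice as sd
import librosa
import numpy as np

SR = 22050          # sample rate
CLIP_SECONDS = 10   # length of each audio clip fed to the model

def capture_clip() -> np.ndarray:
    """Record a short clip from the default input device."""
    audio = sd.rec(int(CLIP_SECONDS * SR), samplerate=SR, channels=1)
    sd.wait()                       # block until the recording finishes
    return audio.flatten()

def run(model) -> None:
    """Continuously turn live audio into suggested lyric lines."""
    while True:
        y = capture_clip()
        # Mel spectrogram as a compact representation of the clip's mood and texture
        mel = librosa.feature.melspectrogram(y=y, sr=SR)
        line = model.generate_line(np.log1p(mel))   # hypothetical model call
        print(line)                                  # surface the line to the artist
```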

“The purpose of the system is not to write a song for the artist,” Vechtomova says. “Instead, we want to help artists realize their own creativity. The system generates poetic lines with new metaphors and expressions, potentially leading the artists in creative directions that they haven't explored before.”

The newly developed neural network learns which lyrical themes and words are associated with different aspects of the music, and, most impressively, it does this for each individual audio clip.

The team carried out a user study where musicians played live music while using the system.

“One unexpected finding was that participants felt encouraged by the generated lines to improvise,” Vechtomova said. “For example, the lines inspired artists to structure chords a bit differently and take their improvisation in a different direction than originally intended. Some musicians also used the lines to check if their improvisation had the desired emotional effect.”

Partnering Up With AI

Another major aspect of this research was its demonstration of collaboration and co-creativity between humans and AI. According to the participants, the system acted as an uncritical musical partner, which enabled them to play unfettered. They also said they felt encouraged to keep playing their instruments even when they were not working on lyrics.

The new LyricJam system is the latest example of how artificial intelligence is making its way into creative work. Discussions of the connection between humans and AI often focus on areas like healthcare; with advancements like this, we are getting closer to connecting with these machines in a creative way as well.

The LyricJam system can be found here.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.