LOERIC: When computers become performers

AI models that generate music are nothing new. But how can an AI agent perform music live with another human?
LOERIC is a music performance system for Irish traditional dance music created by Marco Amerotti, designed to be highly customizable and reactive in real time to a co-musician. It reads a tune’s score and then performs it using dynamics, ornamentation, and timing characteristic of Irish traditional music practice, while continuously adapting to the performance of a human musician with varying degrees of independence.
The idea for this project first arose at the MUSAiC Festival in Stockholm in 2022. Professor Steve Benford (University of Nottingham) accompanied, on guitar, pre-recorded renditions of tunes generated by Professor Bob Sturmʼs (KTH Royal Institute of Technology) FolkRNN model. While the human sounded musical and expressive, the computer playback felt flat and uninteresting. Marco Amerotti, at the time a Masterʼs student in Stockholm, started experimenting with ways of capturing the humanʼs performance to steer the computer-generated playback. A few months later, under the supervision of Sturm, Benford, and Professor Craig Vear (University of Nottingham), LOERIC was born.
LOERIC uses a set of explicit musical rules, derived from experts and from practice, to make performance decisions while playing a tune. Every note is augmented in real time to reflect what a human musician would do, then output via MIDI. This makes the system immediately compatible with all sorts of audio software and allows for a variety of sounds and synthesizers when playing.
If a human interacts with it (through MIDI control changes, audio input, control pedals, etc.), LOERIC can be set up to mimic (or invert) the musicianʼs behaviour: for example, it can play loudly when you play loudly, back off when you take the lead, or add ornaments and speed up as the performance becomes more intense.
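To make the mimic/invert idea concrete, here is a minimal sketch of how such a response mapping could work. This is an illustration only, not LOERIC's actual code: the function names, the 0.0–1.0 intensity scale, and the mode labels are all assumptions for the example.

```python
# Hypothetical mimic/invert mapping: NOT LOERIC's actual API.
# "Intensity" here is an assumed 0.0-1.0 summary of the human's
# playing (e.g. derived from note velocity or audio loudness).

def respond(human_intensity: float, mode: str = "mimic") -> float:
    """Map the human's intensity to the agent's intensity.

    In "mimic" mode the agent follows the human (loud -> loud);
    in "invert" mode it does the opposite, backing off when the
    human takes the lead.
    """
    intensity = max(0.0, min(1.0, human_intensity))  # clamp to range
    return intensity if mode == "mimic" else 1.0 - intensity

def to_velocity(intensity: float) -> int:
    """Scale an intensity into the MIDI velocity range (0-127)."""
    return round(intensity * 127)
```

In a real system a mapping like this would be applied continuously, note by note, alongside decisions about ornamentation and timing.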
LOERIC is never a complete system: it is constantly “learning” (even though no machine learning is involved at this stage) from the musicians it interacts with, incorporating their feedback, integrating new features, and refining its configuration. For example, the first version of the system had no swing, only very basic ornamentation, and was monophonic and instrument-agnostic. The current version features a variety of programmatically defined ornaments, swing, droning, instrument-specific ornamentation, arbitrary rhythmic patterns, and much more.
LOERIC has also travelled to Ireland. We took it to the Irish World Academy of Music and Dance in Limerick, where it played with 15 expert musicians and learned to play specific tune types (e.g., highlands and slides).
The latest development of the system is The Virtual Session. We modified LOERIC to play not only with a human but also with multiple versions of itself, each with its own configuration and style, simulating an Irish music session (i.e., a gathering of musicians, usually in a pub, having drinks and playing tunes together).
But as stated before, this is not the end of the LOERIC journey. We are looking for more professional musicians to collaborate with on mixed human-AI performances, and we plan to create augmented instruments that feature, or are partly controlled by, LOERIC, to implement new features, and to learn new styles.
Through LOERIC, we can explore the field of performance modelling for traditional music and practice, and look at the interactions between AI and traditional communities in a responsible way, identifying frictions and conflicts as well as contact points and opportunities for engagement.
Written by Marco Amerotti
LOERIC research falls within the Intelligent Instruments project