Universal Music Group and Nvidia Collab on AI Music
Universal Music Group and Nvidia announced on 6 January they will be combining their assets 'to pioneer responsible AI for music discovery, creation, and engagement'.
In practice, this means UMG will let Nvidia scan its entire music catalog, millions of songs, to train its Music Flamingo AI model to capture harmony, structure, timbre, lyrics, and cultural context. Flamingo uses chain-of-thought reasoning to enable nuanced interpretation of musical elements, from chord progressions to emotional arcs.
They will then build new AI-powered tools which 'will empower established artists to engage with their fans in deeper, more interactive ways, while also providing emerging artists with unprecedented opportunities to be discovered and connect with new audiences worldwide.' eh?
They are justifying using artists' work to train an AI model by creating a 'dedicated artist incubator'. Who is in this new network of people testing these AI-powered tools? That's not clear, but this incubator will have input where it matters most: the future of music creation, and the industry itself.
They are wrapping this whole experiment up as responsible innovation, positioning it as the music industry's answer to the AI slop already flooding the market, a place 'where technology and creativity unite to unlock new possibilities for artists and fans everywhere.'
Ultimately, though, what this collaboration shows is that UMG, with help from Nvidia, has recognised it needs to work out how music will be created and consumed in the near future before it loses control of its own market, and its shareholder value.