"Are you for real?" Using AI to detect AI in songs 🗣️
Progress on tools for detecting AI-generated or edited music. (audio; 5:02)
In Pillar 2 of our framework for ethical AI for music, we call out that traceability is key. Knowing which source tracks went into an AI-generated song is a big part of traceability and fair use. Doing this well will be technically challenging. Identifying whether a song was AI-generated at all is a good starting point.
AI-generated and -edited music is flooding our online markets. MBW reported an Ircam Amplify estimate that in 2023 alone, 170 million AI-created tracks were uploaded to streaming platforms (e.g., Amazon, Apple, Spotify). How can we tell whether a song was built with artificial intelligence, or only with human intelligence?
Machine learning (ML) is commonly used for this kind of binary classification (yes = AI-generated, no = not AI-generated). This post is a quick look at some tools that use AI/ML to detect AI in songs, and at some concerns. (If you’re not interested in the detailed history, click here to jump ahead to the Bottom Line.)
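To make the classification framing concrete, here is a minimal sketch in Python (using scikit-learn) of a binary track classifier. The “audio features” and labels below are synthetic stand-ins for real audio embeddings and ground-truth labels; none of the vendors discussed here have published their model details, so this only illustrates the general approach, not any specific product.

```python
# Minimal sketch: classify tracks as AI-generated (1) vs. human-made (0).
# The features are random numbers standing in for real audio embeddings,
# and the labels are synthetic; this is an illustration, not a real detector.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_tracks = 2000
X = rng.normal(size=(n_tracks, 16))    # pretend each track has 16 audio features
y = rng.integers(0, 2, size=n_tracks)  # 1 = AI-generated, 0 = human-made
X[y == 1] += 0.5                       # give the two classes a slight offset

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```

A production detector would swap the random features for learned audio representations and report its false-positive and false-negative rates separately, which is where the concerns in the Bottom Line come in.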
Some History
On July 12, Major Labl announced “a ground-breaking partnership with French tech firm Ircam Amplify”, which they say “positions us as the first company to block AI music from distribution.”
Although their blog has been active since Dec. 2020, Major Labl Artist Club appears to be a fairly new startup. Their announcement with Ircam Amplify positions them as a “start-up owned by independent artists”. They’re currently “raising and accepting minimum investments of £1,000”. Over 3000 artists have signed up for their waitlist so far. The positioning statement on their home page says: “Major Labl Artist Club is a disruptive, one-stop solution for independent artists, indie labels and managers. We provide everything you need to run your music career online, in one place. Make music your business and be your own major label.” The announcement also states that Major Labl is “part of the SEIS Scheme and Microsoft for Start-ups”.
Their partner, Ircam Amplify, announced their new “AI-Generated Detector” technology on May 6, 2024 (MBW ref). The company says it can “scan up to 5,000 music tracks in under a minute with an accuracy rate of 98.5%”.
Ircam Amplify is not the only company actively working on this technology. Believe launched “AI Radar” in November 2023, claiming 98% accuracy (ref). Believe was founded in 2005 and is headquartered in Paris; the company is publicly traded, with ~1,651 employees in more than 50 countries.
A few months earlier, MBW reported that TikTok had announced it was developing an automatic tool to help label AI-generated content on its platform. The tool is meant to detect whether AI was used to create or edit a track, and it will also help TikTok verify whether users are complying with its new community rules on AI use. (As we reported on July 1, TikTok parent ByteDance has multiple initiatives for developing software that lets users generate music with AI.)
Even earlier in 2023, online music streaming platform Deezer was working on “radar” technology to detect AI-generated music, aiming to base artist compensation on whether AI was used in a track. Universal Music Group has been working with Deezer, as well as with Tidal and others. Deezer was founded in 2006 and is also headquartered in Paris, with ~600 employees in France, Germany, the UK, Brazil and the US. (US-based Tidal was acquired by Block in 2021.)
Bottom Line
Multiple firms are making technical progress toward identifying AI-generated music with high accuracy. However, with millions of tracks being uploaded to streaming services, even 98.5% accuracy means that tens of thousands of tracks will be misclassified.
Some small percentage of AI-generated tracks will slip through undetected (false negatives). Without detection tools, 100% are slipping through today. So these tools offer a huge improvement.
Of more concern, thousands of non-AI tracks will be mistakenly flagged as AI (false positives). Being wrongly flagged as AI may mean that the content is blocked, or that it is compensated at a lower rate. The musicians who created those non-AI tracks will be hurt unless a reasonable appeal process for false positives accompanies the tools.
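To put rough numbers on this, here is a back-of-the-envelope sketch. The daily upload volume, the share of AI-generated uploads, and the simplifying assumption that the 1.5% error rate applies equally to both classes are all illustrative assumptions, not figures from any of the vendors above.

```python
# Back-of-the-envelope sketch of what a 98.5% headline accuracy means at scale.
# All inputs below are illustrative assumptions, not vendor figures.
uploads_per_day = 100_000  # hypothetical daily uploads to one platform
ai_share = 0.10            # hypothetical fraction of uploads that are AI-generated
error_rate = 1 - 0.985     # assume the 1.5% error applies to both classes

ai_tracks = uploads_per_day * ai_share
human_tracks = uploads_per_day - ai_tracks

false_negatives = ai_tracks * error_rate     # AI tracks that slip through undetected
false_positives = human_tracks * error_rate  # human tracks wrongly flagged as AI

print(f"AI tracks missed per day:         {false_negatives:,.0f}")   # 150
print(f"Human tracks wrongly flagged/day: {false_positives:,.0f}")   # 1,350
```

Even under these assumptions, the wrongly flagged human tracks outnumber the missed AI tracks, simply because human uploads dominate the stream; that is why an appeal process matters.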
Hopefully the people working on these AI-based detection tools are preparing to quickly handle false positives while they keep working to reduce the false negatives.
References
Major Labl Artist Club website, LinkedIn company page
Ircam Amplify website, LinkedIn company page
Believe Global website, LinkedIn company page, Pitchbook profile
“UMG-Tidal Collaboration Extends to Deezer and Video-Sharing Services: ‘The Artist-Centric Approach Is Not Limited Solely to Streaming’”, by Dylan Smith / Digital Music News, 2023-03-03.
“Deezer Moves to Detect AI Music, Plans To Adopt ‘A Remuneration Model That Distinguishes Between Different Types of Music Creation’”, by Dylan Smith / Digital Music News, 2023-06-06.
“TikTok says it’s working on automatic AI content detection, as it rolls out new ‘AI-generated’ label for creators”, by Daniel Tencer / Music Business Worldwide, 2023-09-19.
“AI of the Storm: Artificial Intelligence and the Music Industry in 2023”, by Eamonn Forde / Synctank, 2023-12-13.
“Major Labl Fights Back to Protect Artists From AI Music Theft”, by Mark Knight (founder), 2024-07-12.