Deezer reports that AI-generated tracks now make up 44% of new uploads to its platform, and says most plays of those tracks are fraudulent and are not paid out.
In short: Deezer says nearly half of the new music uploaded to its service is made by AI, and many of the streams appear to be fake.
Deezer, a music streaming service, says AI-generated tracks now represent 44% of all new music uploaded to its platform. That equals about 75,000 new AI-made tracks per day, according to the company.
Deezer says it built its own system to detect AI-generated audio, and it is one of the few services that labels this kind of content. Deezer also says it does not recommend AI-flagged tracks in its suggestions or in curated playlists, so most listeners are unlikely to run into them by accident.
Even so, Deezer reports that many of the streams linked to AI tracks are not from real listeners. The company says the main reason for uploading so much AI music is fraud, meaning people try to generate payouts by inflating play counts. Deezer says it only pays royalties when a real person listens, so it is “demonetizing” about 85% of streams on AI-flagged tracks, which means it does not pay money for those plays.
Deezer also shared survey results suggesting how hard it is to spot AI music by ear. In its test, listeners heard three songs, two of them AI-generated, and Deezer says 97% could not reliably tell which was which.
AI music tools are getting cheaper and easier to use, which could mean even more AI tracks uploaded across streaming services. Another open issue is labeling and enforcement, especially since AI "watermarks" (hidden markers embedded in the audio, like a faint stamp) can be removed.
Source: Ars Technica