Apple Music’s new “Transparency Tags” aim to flag AI-generated content – but labels have to self-report


Apple Music is rolling out what it calls “Transparency Tags,” a system for flagging AI-generated content on its platform. Before celebrating the dawn of radical honesty in streaming, though, there’s a catch: the system appears to rely largely on record labels choosing to actually use it.

On Wednesday (4 March), Apple sent a newsletter to industry partners announcing that AI disclosures would now be a “delivery requirement” for content submitted to the service.

The tags cover four categories: Artwork, Track, Composition, and Music Video, each intended to indicate when AI contributed a “material portion” of the work.

“Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI,” the newsletter states. “We believe labels and distributors must take an active role in reporting when the content they deliver is created using AI.”

“These new tagging requirements provide a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone.”

Exactly how that transparency will be enforced remains unclear.

Apple’s technical specification describes the tags as “optional” – at least for now – and the system does not appear to include any visible enforcement mechanism or verification process. “If omitted, none is assumed,” the notes state.

[Image: Apple Music notes on AI Transparency Tags. Credit: Apple]

In practice, that likely means labels can tag AI-generated elements – a drum loop, a lyric line, album artwork – if they choose to disclose them. If they don’t, nothing changes.

Given the sheer scale of AI-generated uploads, that limitation could prove significant. Last September, Spotify introduced similar AI disclosure labels, alongside a policy allowing the removal of tracks with unauthorised AI-generated voice clones.

Other platforms have taken a more proactive approach. Deezer, for one, implemented an automated AI-detection system more than a year ago. The company says it now receives over 60,000 AI-generated songs every day, and its detection tools have identified more than 13.4 million AI-created tracks on the service.

Nevertheless, Apple’s Transparency Tags represent a step toward clearer disclosure – though relying on self-reporting alone is unlikely to slow the flood of AI-generated music.

The post Apple Music’s new “Transparency Tags” aim to flag AI-generated content – but labels have to self-report appeared first on MusicTech.
