Your Monday Music Briefing
Weekly industry news for self-label artists who choose ownership
Welcome to Your Monday Music Briefing, a weekly roundup of the stories from last week that matter most for self-label artists.
Sony Group Develops Tech to Track Original Music in AI-Generated Songs
Sony Group has reportedly developed technology capable of identifying copyrighted music embedded in AI-generated tracks. The system works two ways: when AI developers cooperate, Sony connects directly to their base model systems to extract training data. When they don’t cooperate, the system compares AI-generated output against existing catalogs to estimate which original works were used. The research comes from Sony AI; a paper on the system has been accepted at an international conference, and the company envisions the technology as the foundation of a revenue-sharing framework that would compensate creators based on how much their work contributed to an AI-generated song.
Sono Hikari Take: As much as I’d like to call this a clear win for human creators, the more important question is who actually benefits from a system like this. Sony is one of the world’s largest record labels. When a company that size builds AI detection technology, it’s reasonable to assume it was designed around their catalog and their artists first.
There’s also a deeper problem with the kind of attribution this enables, the idea that a track is “30% Beatles, 10% Queen.” Who wasn’t influenced by The Beatles? That kind of framework may work for direct sampling, but influence doesn’t work that way. Artists shape each other in overlapping, non-linear ways. Think about James Taylor, Carole King, and Joni Mitchell shaping each other’s sounds simultaneously in the early 70s. How granular does this system actually get, and who decides where influence ends and infringement begins?
Each major label will likely build its own version of this technology, and when that happens, the systems will reflect who funded them. Expect competing tools, competing standards, and competing claims over which artist the AI was actually influenced by.
For Self-Label Artists: Your music can end up in AI training datasets whether you opt in or not. Start thinking about your catalog as something that has value in this ecosystem, not just on streaming platforms. Register your songs everywhere they can be registered. Keep your metadata clean and current. If compensation frameworks do emerge from technology like this, the artists who can clearly prove ownership and document their catalog will be the ones positioned to benefit. You can’t collect what you can’t prove is yours.
This Machine Eats Music: Pushing Back Against AI in Art
The Nashville Scene published a deep look at the growing resistance to AI in music and art, examining how artists in Nashville and beyond are pushing back against the rapid normalization of AI-generated work. The piece explores the human cost of AI replacing session musicians, demo singers, and songwriters, and the philosophical questions around what it means to create art in an era when machines can produce something that sounds convincingly human. It also examines who benefits from this shift and who loses ground.
Sono Hikari Take: Nashville is the most songwriter-centric city in American music, which makes it a particularly sharp lens for this conversation. The artists speaking out in this piece are not technophobes or people afraid of change. They are working professionals watching a livelihood disappear in real time, often without the industry's acknowledgment that anything significant is happening. When AI can generate a demo in seconds for free, the entire infrastructure of how songs get made and paid for changes overnight. For self-label artists, the tools that lower production costs are real and useful. Worth sitting with, though, is the fact that those same tools are devaluing the craft that made the music worth making in the first place.
For Self-Label Artists: Think carefully about where you draw your own lines with AI tools. Using AI to streamline admin work, generate reference tracks, or spark ideas is one conversation. Using AI-generated music in your artist output without disclosure is a different one. Your audience trusts that when they listen to you, there is a human on the other side of that sound. That trust is your most valuable asset, and it is worth protecting.
Spotify Bets on AI Remixes as Pressure to Monetize Grows
On Spotify’s Q4 2025 earnings call, co-CEOs Gustav Söderström and Alex Norström laid out the company’s AI strategy in detail. Spotify reported 751 million monthly active users and paid out $11 billion to the music industry in 2025. Their AI roadmap has two categories: net new music created with generative tools, and “derivatives” of existing songs, including AI-powered remixes and covers. It’s the second category where Spotify appears most excited, describing it as “an untapped opportunity for artists to make money off of their existing IP.” The technology is already built. Licensing is the remaining hurdle. Norström stated directly that Spotify “will not do deals that aren’t good for artists.”
Sono Hikari Take: Spotify’s position as the world’s largest streaming platform gives them real leverage to shape what a “fair” AI deal looks like, and the framing of “artists choosing whether to participate” is a meaningful commitment if it holds. What I’m watching closely is the derivatives feature. Letting fans remix your catalog inside Spotify is a genuinely new revenue model, and for artists who want it, it could be valuable. For artists who don’t want their sound and catalog used to generate variations, the opt-in framing has to be real and clearly enforced. Platforms have a long history of making promises to artists that dissolve quietly when business priorities shift. Spotify’s pressure to monetize AI features will only increase from here.
For Self-Label Artists: Start paying attention to your distributor’s AI licensing settings. Some distributors are already prompting artists to opt in or out of AI training data licensing, sometimes in language buried in terms-of-service updates you may never have read. Know what you’ve agreed to and what you haven’t. When Spotify’s AI derivative features roll out, you will want to understand exactly what participation means for your catalog, your royalties, and your brand before you make a decision.
TikTok Launches Listening Party Feature With Apple Music: How to Host One
TikTok has launched a beta version of a Listening Party feature in partnership with Apple Music, allowing verified artists to host synchronized album listening sessions directly inside TikTok. Fans with active Apple Music subscriptions can hear full tracks in real time while chatting with the artist and each other. Non-subscribers can still join the chat and listen to 30-second previews. Artists can also gamify the experience by setting collective streaming milestones that unlock exclusive content. The feature is currently available to verified artists on TikTok’s iOS app, with the wider rollout still to be announced.
Sono Hikari Take: The discovery-to-listening pipeline has always been broken. A fan hears 15 seconds of a song on TikTok, gets pulled away, and the momentum dies before they make it to a streaming platform. This feature patches that gap by keeping fans inside TikTok while they stream the full track through Apple Music. From a pure artist-to-fan mechanics standpoint, that’s a real improvement. The Listening Party format also creates something meaningful: a shared experience around your music that feels intentional rather than algorithmic. The limitation right now is that it requires both TikTok verification and an Apple Music subscription on the fan side. That’s a smaller audience than it sounds. I’m curious how this evolves and whether Spotify, YouTube Music, or others get added as partners.
For Self-Label Artists: If you are already verified on TikTok and your music is on Apple Music, this feature is worth experimenting with around a new release. The interactive format rewards artists who have something to say about their music beyond the song itself. Behind-the-scenes stories, lyrics breakdowns, and real-time Q&As make the listening party something fans will actually show up for. Plan it the way you would plan a release event, and promote it across all your channels in advance. The metric to watch is whether fans who attend are converting to saves and follows on Apple Music, which you can track through Apple Music for Artists.
Deezer Refreshes Flow With Advanced Algorithm Upgrade
Deezer has introduced Flow Tuner, a major update to its flagship personalized streaming feature. The upgrade gives listeners direct control over their recommendation algorithm by allowing them to activate or deactivate specific genres and subgenres in real time. The changes reflect immediately in the Flow playlist without requiring likes, dislikes, or skips to signal preferences. Notably, Deezer’s updated Flow also excludes AI-generated tracks from recommendations by default, backed by the company’s proprietary AI detection tool. Deezer is now receiving over 60,000 fully AI-generated tracks per day and has been working to license its detection technology to the wider industry.
Sono Hikari Take: Two things stand out here. First, user-controlled algorithms are a meaningful shift. Platforms have spent years telling listeners that the algorithm knows best, and artists have been at the mercy of recommendation systems they cannot influence or understand. Giving listeners more control changes the dynamic and potentially opens up more room for niche sounds and smaller artists who speak to specific tastes. Second, Deezer’s decision to exclude AI-generated tracks from recommendations by default is a real stance. They are receiving 60,000 AI tracks daily. That number is growing. A platform that actively filters for human-made music in its core recommendation feature is making a values statement. Whether it moves the needle commercially remains to be seen, but it is worth knowing.
For Self-Label Artists: Deezer may not be your primary streaming platform, but genre tags and metadata accuracy matter across every DSP you are on. As recommendation algorithms become more sophisticated and more user-directed, the specificity of how you describe your music becomes more important. If a listener is actively curating for a particular mood, subgenre, or sound, your metadata is what determines whether you show up. Audit how your releases are tagged on every platform and make sure the descriptions match what you are actually making.
If You Missed It
5 Hacks to Start Operating Like Your Own Label
One Thing To Carry With You This Week
“The function of art is to do more than tell it like it is. It’s to imagine what is possible.”
— bell hooks