In 2026, amid the explosive growth of generative AI music, streaming giant Spotify formally joined the fight against "AI slop." According to recent industry reports, Spotify is testing a new feature called "Artist Profile Protection," which aims to return control over uploaded content to real artists and prevent floods of AI-generated fake songs from polluting their official profiles.
Streaming platforms have long struggled with misattributed uploads, and the spread of AI tools has made malicious impersonation markedly worse. The feature reportedly works as follows:
Active Review Mechanism: Artists who enable this feature will receive system notifications whenever any track attributed to them is submitted to the platform.
Self-Approval: Only tracks approved by the artist themselves will appear on their personal profile, be counted in playback data, and be included in user recommendation algorithms (such as Release Radar).
Targeted Rollout: The feature is currently prioritized for creators with common names or those frequently hit by AI impersonation; they can enable it through a dedicated dashboard.
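The workflow described above amounts to an opt-in "approve before publish" gate on each submission. Spotify has not disclosed its implementation, so the following Python sketch is purely illustrative; the class names, fields, and auto-publish fallback are all assumptions:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    PENDING = auto()   # submitted, awaiting artist review
    APPROVED = auto()  # artist confirmed; shown on profile, counted in stats
    REJECTED = auto()  # artist declined; never reaches the profile

@dataclass
class Submission:
    track_id: str
    claimed_artist: str
    status: Status = Status.PENDING

class ArtistProfile:
    """Hypothetical model of the opt-in review gate (not Spotify's actual code)."""

    def __init__(self, name: str, protection_enabled: bool = True):
        self.name = name
        self.protection_enabled = protection_enabled
        self.inbox: list[Submission] = []   # pending items the artist is notified about
        self.published: list[str] = []      # tracks visible on the official profile

    def submit(self, track_id: str) -> Submission:
        sub = Submission(track_id, self.name)
        if self.protection_enabled:
            self.inbox.append(sub)          # triggers a review notification
        else:
            sub.status = Status.APPROVED    # legacy path: auto-publish
            self.published.append(track_id)
        return sub

    def review(self, sub: Submission, approve: bool) -> None:
        sub.status = Status.APPROVED if approve else Status.REJECTED
        if approve:
            self.published.append(sub.track_id)
        self.inbox.remove(sub)
```

The key design point is that a submission defaults to PENDING and contributes nothing to the profile or its playback data until the artist explicitly approves it, mirroring the "self-approval" step above.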
Just a week ago, Sony Music revealed that it had requested the removal of more than 135,000 AI-generated songs impersonating its artists. Spotify's move is widely seen as a direct response to pressure from rights holders.
Because distribution is open, some uploaders exploit incorrect metadata or name confusion to attach low-quality AI songs to famous artists' names and ride on their traffic. This not only dilutes an artist's brand value but also seriously degrades the listening experience for fans.
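The name-confusion attack mentioned above often relies on near-identical spellings (accents, casing, stray spaces). One way a platform could flag such collisions is fuzzy matching of normalized names; this sketch is an assumption for illustration, not Spotify's disclosed method, and the 0.85 threshold is arbitrary:

```python
import unicodedata
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    # Fold accents and case so "Beyoncé" and "beyonce " compare equal
    folded = unicodedata.normalize("NFKD", name)
    stripped = "".join(c for c in folded if not unicodedata.combining(c))
    return " ".join(stripped.lower().split())

def confusable(claimed: str, known_artist: str, threshold: float = 0.85) -> bool:
    # Flag a submission whose claimed artist name collides with a known artist
    a, b = normalize(claimed), normalize(known_artist)
    return a == b or SequenceMatcher(None, a, b).ratio() >= threshold
```

A flagged submission would then be routed into the review queue rather than published directly, which is exactly the gap the profile-protection feature is meant to close.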


