Spotify is reportedly flooded with covers from AI bands

Cryptopolitan · 2024/08/26 16:00
By Enacy Mapakame

In this post: AI “bands” are increasingly sneaking music covers onto the streaming service Spotify. The bands cover everything from country classics to other genres spanning decades. And while streaming services pay royalties on all music, including covers, original songwriters receive far less than the cover performers do.

Spotify, the music streaming platform, is reportedly flooded with music covers by AI “bands” sneaked onto the platform’s playlists. The AI covers are usually hidden among popular covers by real artists, allowing them to attract millions of listeners and collect payouts.

Unlike other artists, the acts behind these covers have no digital footprint on social media, which has raised red flags about the originality of their work. Additionally, their bios read as if they were generated by AI tools like OpenAI’s ChatGPT.

Vigilant Spotify listeners raised the alarm

According to a Slate report, a group of Reddit users discovered the suspicious trend and brought it to light. Initially, these were “bands” covering country music classics. However, a closer look at the bands unearthed a wider range of covers across different genres and decades. According to the Redditors, none of these bands have released originals, which raised further suspicion.

“Apparently this has been going on for several years, with ambient music and with electronic music and jazz,” said Culibuildr, a Redditor who posted the original thread and asked to be identified by their handle.

“I think the new thing here is that with AI being this consumer product, anybody can make a thing with vocals now.”

Culibuildr.

Without being alerted to it, Culibuildr said, a listener would find it difficult to tell a track was an AI cover. A lawyer for 11A, a label claiming to represent some of the “artists” behind this music, said he had documents showing that human musicians were involved in producing the covers.

The lawyer, however, did not respond to further questions from Slate and failed to offer contact details for the label. The only traces were an expired domain and a Facebook profile with just 117 followers that last posted in 2021. According to Slate, such a thin digital footprint is another red flag for a label running an online business.

Exploitative tendencies increasing in the streaming business

A Spotify spokesperson suggested that content providers were responsible both for the anomaly and for removing the covers from the platform after they were flagged. Content providers could be anyone on the artists’ side, for instance the “bands” themselves, their management, or the label. The Spotify spokesperson said in a statement:

“Spotify does not have a policy against artists creating content using autotune or AI tools, as long as the content does not violate our other policies, including our deceptive content policy, which prohibits impersonation.”

Entertainment and music lawyer Cole Henderson suggested this could be the work of the third-party intermediaries that many artists use to upload and manage their music on streaming platforms.

“I think whoever actually distributes this might be nervous about the reports on it and they might have taken it down,” he said.

“People are finding better ways to exploit the streaming system, because technically, this isn’t streaming fraud. If they’re paying somebody to perform cover songs and then using covers to pull streams, that’s not illegal, it’s just exploitative.”

Cole Henderson.

According to the lawyer, the party mainly being exploited is the original songwriter, who still receives royalties but far less than they would prefer, and who is therefore unhappy with the craze of AI acts performing their songs.

According to Henderson, a service like Spotify pays a percentage of royalties to the record label, which then divides the money among the parties involved in making the song.

For a cover song, this split also includes a cut for whoever wrote the original. Streaming services, however, usually pay more to the new performer of the song and their label than to the original artists.
