YouTube is using music labels to figure out its artificial intelligence strategy

Is It Okay to Use Artificial Intelligence on YouTube? The Case of AI Drake, TikTok, and the Music Industry

The Financial Times reported that UMG and YouTube are in negotiations to license voices and melodies to train artificial intelligence models.

The platform said in a blog post it would invest in building its rights management system Content ID, update its policies on uploading manipulated content, and deploy generative AI tools to help detect videos that violate its rules.

Lucian Grainge said that “central to our collective vision is taking steps to build a safe, responsible, and profitable ecosystem of music and video.”

An AI-generated song, “Heart On My Sleeve,” went viral on TikTok in April, featuring vocals by what sounded like Drake and The Weeknd. It soon made its way to streaming services. UMG, Drake’s music label, issued a strongly worded statement saying AI-generated songs violate copyright laws. The song was eventually taken down from YouTube.

There is absolutely nothing more important than making sure that his estate and label get paid when people make fake versions of his songs on the internet, right? Even if that means creating a completely new class of extralegal contractual royalties for major music labels just to protect the online dominance of your video platform, while simultaneously insisting that training artificial intelligence search results on books and news websites without paying anyone is permissible fair use? Right? Right?

There is a chance that artificial intelligence chokes out the web, both by flooding user-generated platforms with garbage and by ruining Google’s search results so badly that Google has little choice but to sign lucrative content deals.

The only solution the music industry will accept is toothless artificial intelligence. That means creating a new royalty system for the use of artists’ voices, one that does not exist in current copyright law: UMG wants to get paid if you make a video with a voice that sounds like Drake.

Fair use is an affirmative defense to copyright infringement: you have to admit you made the copy in the first place, and then argue that the use, say for nonprofit purposes, was fair. And the courts can take a long time to evaluate a case.

No one wants to go back to the labels going after individual parents over videos of their kids dancing, which is why YouTube has to keep the music industry happy. And there’s no way for YouTube Shorts to compete with TikTok without expansive music rights; taking those off the table by ending up in court with the labels is a bad idea.

And the problems here aren’t hard to predict: right now, Content ID generally operates within the framework of intellectual property law. If you make something — a piece of music criticism, say — that gets flagged by Content ID as infringing a copyright and you disagree with the flag, YouTube never steps in to resolve it but instead imposes some tedious back-and-forth and then, if that doesn’t work out, politely suggests you head to the courts and deal with it legally. (YouTubers generally do not do this, instead coming up with an ever-escalating series of workarounds to defeat overzealous Content ID flags, but that’s the idea.)

Mohan sandwiched that announcement between saying that YouTube will be expanding its content moderation policies to cover the challenges of AI and announcing a partnership with a group of UMG artists and producers, including the estate of Frank Sinatra. Instead, we were told that the solution to the technology problem was more technology.

“We will continue to invest in the AI-powered technology that helps protect our community of creators, viewers, and artists, from Content ID to the policies and detection and enforcement systems that keep our platform safe behind the scenes.” Sure.

The only thing that is clear about these looming AI copyright cases is that they have the potential to upend the internet as we know it, copyright law itself, and potentially lead to a drastic rethinking of what people can and cannot do with the art they encounter in their lives. The social internet came up in the age of Everything is a Remix; the next decade’s tagline sounds a lot like “Fuck You, Pay Me.”

Bringing Music to the Fore: YouTube, AI Copyright, and the UMG Licensing Deal

In April of this year, when AI Drake was making a lot of noise on the internet, Michael Nash, UMG’s digital strategy boss, explicitly said that the music company would be issuing takedowns of the song based on the Metro Boomin sample.

The upcoming set of lawsuits from a cast of characters including Sarah Silverman will certainly work this angle, because it isn’t at all certain whether copying data to train the models is fair use. (A reminder that human beings are not computers: yes, you can “train” your brain to write like some author by reading all their work, but you haven’t made any copies, which is the entire foundation of copyright law. Stop it.)

Let’s say YouTube extends this new extralegal private right to likenesses and voices to everyone. What happens to Donald Trump impersonators in an election year? How about Joe Biden impressionists? Where will YouTube draw the line between AI Drake and AI Ron DeSantis? After opening the door by removing an AI Frank Sinatra, how will YouTube withstand the pressure to remove every impression of DeSantis from the internet? Is it prepared for that, or is it just concerned about keeping its music rights?

Source: Google and YouTube are trying to have it both ways with AI and copyright

The Search Generative Experience (SGE): How search gets done, and why it will take a lot of time and effort to roll it all out

At this moment in web history, Google is the last remaining source of traffic at scale on the web, which is why so many websites are turning into AI-written SEO honeypots. The situation is bad and getting worse.

The Search Generative Experience, also known as SGE, is a new feature that will enable searches to be answered using artificial intelligence, including lucrative queries about buying things. In fact, almost every SGE demo Google has ever given has ended in a transaction of some kind.

This is a great deal for Google but a horrible deal for publishers, who are staring down the barrel of ever-diminishing Google referrals and decreasing affiliate revenue but lack any ability to say no to search traffic. On the last earnings call, Sundar Pichai bluntly said of SGE that over time, it will just be how search works.

A website can block Google’s crawlers in its robots.txt file, but that means it won’t show up in search at all; there is no way to stay in the index while opting out of everything else.
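For illustration, here is a minimal sketch of what that blunt instrument looks like in practice. This assumes you are willing to vanish from Google Search entirely, since the same `Googlebot` crawler feeds the index:

```
# robots.txt — served at the site root
# Block Google's crawler entirely. Note the tradeoff: this removes
# the site from Google Search results, not just from any AI training.
User-agent: Googlebot
Disallow: /

# Other crawlers may still visit.
User-agent: *
Allow: /
```

The whole problem the article describes is that there is no finer-grained directive here: `Disallow` is all-or-nothing per crawler, so a publisher cannot keep search referrals while refusing the rest.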

This will all take a lot of time! And it behooves Google to slow roll it all while it can. For example, the company is thinking about creating a replacement for robots.txt that allows for more granular content controls but… you know, Google also promised to remove cookies from Chrome in January 2020 and recently pushed that date back yet again to 2024. A lumbering web standards process taking place in the background of an apocalyptic AI fair use legal battle is just fine if no one can turn off your crawler in the meantime!

You know what? The future version of Google is very similar to the current version of YouTube: a cable network where a flood of user content sits next to lucrative deals with TV networks, music labels, and sports leagues. If you squint, it is the exact kind of walled garden upstarts like Google once set out to disrupt.
