YouTube is about to start cracking down on AI clones
How Meta and TikTok handle AI-generated political content
Meta, the owner of Facebook and Instagram, will require advertisers to disclose use of artificial intelligence in their ads about elections, politics and social issues. The company has also barred political advertisers from using Meta’s own generative AI tools to make ads.
TikTok forbids the creation of deepfakes of young people and private figures, and requires artificial intelligence-generated content to be labeled. AI-generated content depicting public figures is allowed in certain situations, but can’t be used in political or commercial endorsements on the short-form video app.
YouTube laid out its own approach in a company blog post today, which goes through the platform’s early thinking about moderating AI-generated content. The basics are fairly simple: YouTube will require creators to label “realistic” AI-generated content when they upload videos, and the company says that disclosure is especially important for topics like elections or ongoing conflicts.
Some videos dealing with sensitive topics, such as elections, public health crises or public officials, will have more prominent AI labels on them.
From there, it gets more complicated — vastly more complicated. YouTube will allow people to request removal of videos that “simulate an identifiable individual, including their face or voice” using the existing privacy request form. So if you get deepfaked, there’s a process to follow that may result in that video coming down — but the company says it will “evaluate a variety of factors when evaluating these requests,” including whether the content is parody or satire and whether the individual is a public official or “well-known individual.”
The proliferation of generative AI technology, which can create lifelike images, video and audio sometimes known as “deepfakes,” has raised concerns over how it could be used to mislead people, for example by depicting events that never happened or by making a real person appear to say or do something they didn’t.
YouTube’s special rules for the music industry and the fair use question
At the same time, YouTube parent company Google is pushing ahead on scraping the entire internet to power its own AI ambitions, resulting in a company that is at once writing special rules for the music industry and telling everyone else that their work will be taken for free. The tension is only going to keep building, and at some point someone is going to ask Google why the music industry is so special.
This special protection for singing and rapping voices won’t be part of YouTube’s automated Content ID system when it rolls out next year; YouTube representative Jack Malon tells us that “music removal requests will be made via a form” that partner labels will have to fill out manually. Malon also says the platform won’t penalize creators who cross these blurred lines and have content removed under either a privacy request or a synthetic vocals request.
There are entire channels dedicated to churning out AI covers by artists living and dead, and under YouTube’s new rules, most would be subject to takedowns by the labels. The only exception YouTube offers in its blog post is if the content is “the subject of news reporting, analysis or critique of the synthetic vocals” — another echo of a standard fair use defense without any specific guidelines yet. YouTube has long been a generally hostile environment for music analysis and critique because of overzealous copyright enforcement, so we’ll have to see if the labels can show any restraint at all — and if YouTube actually pushes back.
It is going to be wildly complicated — there’s no definition of “parody and satire” for deepfake videos yet, but Malon again said there would be guidance and examples when the policy rolls out next year.
For sensitive material, the labels will appear on the videos themselves. Malon told us that the company would give more detailed guidance on how to comply with the requirement, which is set to roll out next year.