The social media world faces a Supreme Court battle
Gonzalez v. Google: How Section 230 Shapes the Fate of Websites, and Why That Could Change
The Court could decide in Gonzalez v. Google to punt the fate of Section 230 to Congress by rejecting the Gonzalez complaint. If it does, future bills to carve out liability for extremist content would likely mirror prior legislation that exposes social media companies to liability for hosting content that, among other things, violates federal criminal law, infringes intellectual property, constitutes child pornography or promotes sex trafficking.
The nine justices heard oral arguments in Gonzalez v. Google on February 21. The outcome of the case could decide the future of social media platforms worldwide.
And more could be coming: the Supreme Court is still mulling whether to hear several additional cases with implications for Section 230, while members of Congress have expressed renewed enthusiasm for rolling back the law’s protections for websites, and President Joe Biden has called for the same in a recent op-ed.
The law’s main provision is that websites can’t be treated as the publishers, speakers or authors of other people’s content. In plain English, that means any legal responsibility attached to publishing a given piece of content rests with the person or entity that created it, not with the platforms on which the content is shared or the users who re-share it.
In recent years, however, critics of Section 230 have increasingly questioned the law’s scope and proposed restrictions on the circumstances in which websites may invoke the legal shield.
A 2020 executive order from then-President Donald Trump sought to have the FCC reinterpret Section 230. The order faced a number of legal and procedural problems, not least of which was the fact that the FCC is not part of the judicial branch, does not regulate social media or content moderation decisions, and is an independent agency that, by law, does not take direction from the White House.
Starting in 2018, prominent conservatives began demanding changes to the law that would expressly hinge Section 230’s liability protections on how companies treat political speech. High-profile Republicans, including Missouri senator Josh Hawley and Texas senator Ted Cruz, frequently misconstrued the section’s language. Cruz, for example, suggested the law shields only websites that treat left- and right-wing political views equally, framing Section 230 as protecting “neutral public forums,” a requirement that appears nowhere in the statute.
The result is bipartisan hostility toward Section 230, even though the two parties cannot agree on what to do with it.
While this week’s oral arguments won’t be the end of the debate over Section 230, the outcome of the cases could lead to hugely significant changes the internet has never before seen — for better or for worse.
Tech critics have called for added legal exposure and accountability, arguing that the massive social media industry is largely shielded from courts and from the normal development of a body of law. In a Supreme Court brief, the Anti-Defamation League argued that it is unusual for a global industry to be insulated from judicial inquiry.
For the tech giants, and even for many of Big Tech’s fiercest competitors, rolling back Section 230 would be a bad thing, because it would undermine what has allowed the internet to flourish. It would potentially put many websites and users into unwitting and abrupt legal jeopardy, they say, and it would dramatically change how some websites operate in order to avoid liability.
The Law and Practice of Recommendations on Social Media
Recommendations are “the very thing” that makes Reddit a vibrant place, the company argued in its own brief. “It is users who upvote and downvote content, and thereby determine which posts gain prominence and which fade into obscurity.”
People would stop using Reddit, and moderators would stop volunteering, the brief argued, under a legal regime that “carries a serious risk of being sued for ‘recommending’ a defamatory or otherwise tortious post that was created by someone else.”
How the court rules could be a gamechanger for American law, society, and social media platforms that are some of the most valuable businesses in the world.
Representing the terrorism victims against Google and Twitter, lawyer Eric Schnapper told the Supreme Court that when Section 230 was enacted, social media companies wanted people to subscribe to their services; today, he argues, the economic model is different.
The money now comes from advertising, he says: the longer users stay online, the more the companies earn, so platforms rely on algorithms that suggest related content to keep people scrolling.
What’s more, he argues, modern social media executives knew the dangers of what they were doing. Government officials met with them, explained how such videos were being used for terrorist recruitment, and urged the companies to take them down.
Those officials, he says, included the head of the FBI, the director of national intelligence and the then-White House chief of staff.
A Law Written in 1996: Should the Courts Update Section 230, or Congress?
Halimah DeLaine Prado, Google’s general counsel, says the company has invested in human review and smart detection technology to make sure there isn’t a place for extremism on its products.
Prado acknowledges that social media companies today are nothing like the social media companies of 1996, when the interactive internet was an infant industry. But she asserts that if the law is to change, that change should come from Congress, not the courts.
“Congress had a really clear choice in its mind,” he says. “Was the internet going to be like the broadcast media that were pretty highly regulated? Or was it going to be like the town square or the printing press?” Congress, he says, chose the town square and the printing press. But that approach, he adds, is now at risk: “The Supreme Court now really is in a moment where it could dramatically limit the diversity of speech that the internet enables.”
There are many “strange bedfellows” among the tech company allies in this week’s cases. The Chamber of Commerce, the American Civil Liberties Union and several other groups have filed briefs urging the court to leave the status quo in place.
But the Biden administration has a narrower position. Columbia law professor Timothy Wu summarizes the administration’s position this way: “It is one thing to be more passively presenting, even organizing information, but when you cross the line into really recommending content, you leave behind the protections of 230.”
In short, providing hyperlinks, grouping related content together and sorting through billions of pieces of data for search engines would remain protected, but actively recommending content that depicts or urges illegal conduct would not.
If the Supreme Court were to adopt that position, it would threaten the economic model of today’s social media companies, which say there is no easy way to distinguish between recommending content and merely aggregating it.
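To see why that line is hard to draw in engineering terms, consider a minimal sketch. The class and function names and the scoring rule below are invented purely for illustration, not drawn from any company’s actual system; the point is that a search engine “sorting” results and a feed “recommending” posts typically reduce to the same operation, scoring items against some input signal and returning the top-ranked ones.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topics: set[str]  # e.g. {"cooking", "travel"}

def score(post: Post, signals: set[str]) -> float:
    """Toy relevance score: the fraction of a post's topics
    that overlap with a set of input signals."""
    if not post.topics:
        return 0.0
    return len(post.topics & signals) / len(post.topics)

def rank(posts: list[Post], signals: set[str], k: int = 10) -> list[Post]:
    """Return the k highest-scoring posts, best first."""
    return sorted(posts, key=lambda p: score(p, signals), reverse=True)[:k]

catalog = [
    Post("a", {"cooking"}),
    Post("b", {"politics", "outrage"}),
    Post("c", {"cooking", "travel"}),
]

# The same function powers two "different" products:
search_results = rank(catalog, signals={"cooking"})       # signals typed by the user
feed = rank(catalog, signals={"politics", "outrage"})     # signals inferred from history
```

In this toy model, the only difference between “search” and “recommendation” is where the signals come from, not what the code does with them, which is why drawing a legal line between the two behaviors is harder than it sounds.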
That would likely mean the companies would have to defend their conduct in court. But filing suit and clearing the hurdle of showing enough evidence to justify a trial are two different things, and the Supreme Court has made that hurdle very difficult to clear. That question sits at the heart of the companion case, Twitter v. Taamneh, which the court heard the following day.
There was a ripple of laughter in the US Supreme Court on February 21 when Justice Elena Kagan remarked: “We’re a court. We really don’t know about these things. These are not, like, the nine greatest experts on the internet.”
The platforms’ bans of Alex Jones, host of the right-wing Infowars website, whom juries later ordered to pay over $1 billion in defamation damages, are often cited as an example of this supposedly “biased” enforcement.
Editor’s Note: The author, a former US ambassador, leads the Coalition for a Safer Web, a non-profit organization that works to develop technologies and policies to make it easier to remove hate speech from social media platforms. The views expressed in this commentary are his own.
The justices’ questions suggested a court leaning toward the companies’ defense, since it was far from clear that hosting the group’s content amounted to the social networks knowingly aiding the terrorist organization.
TikTok, ‘Chasing Life’ and Social Media’s Role in Children’s Health
In other words, the platforms are treated as benign providers of digital space, with limited liability for what the customers who use them decide to post. The idea was that, without such protection, fledgling internet companies would face financial ruin from lawsuits over libelous content posted by their users.
But things have changed since those early days of the internet. Antisemitic extremists and far-right terrorists have used the platforms to encourage attacks in the US, and Americans have paid the price in lost lives.
The crux of the problem is that social media companies earn their revenue from digital advertising. Corporate advertisers pay premiums for amplified ads, digital ads that a platform’s software posts to user accounts and that draw the attention of large numbers of other users, placed against content targeted at users with the same online interests. Because extremist content tends to drive heavy engagement, social media companies can make more money when users engage with it than when users stick to other interests.
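The incentive described here can be captured in a few lines. The sketch below is a deliberate oversimplification with invented names, weights and numbers; it shows only why a ranking objective tied to advertising revenue favors whatever content holds attention longest, whatever that content is.

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    predicted_watch_seconds: float  # a model's guess at how long a user will engage

def ad_revenue_score(item: Item, cpm_dollars: float = 2.0,
                     ads_per_minute: float = 1.0) -> float:
    """Expected ad revenue from showing this item: more predicted
    engagement means more ad slots served, hence more revenue."""
    minutes = item.predicted_watch_seconds / 60.0
    return minutes * ads_per_minute * (cpm_dollars / 1000.0)

def build_feed(items: list[Item], k: int = 20) -> list[Item]:
    """Rank purely by expected revenue: attention-holding content
    rises to the top regardless of what it contains."""
    return sorted(items, key=ad_revenue_score, reverse=True)[:k]
```

Nothing in `build_feed` inspects what the content actually is; the revenue objective alone does the sorting, which is the dynamic critics say rewards extremist material.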
The Chasing Life podcast explores how ordinary citizens are trying to hold Big Tech accountable for the content on its platforms.
TikTok CEO Shou Chew faced grilling from lawmakers Thursday when he appeared before the House Energy and Commerce Committee. His testimony took place as a number of legislators renewed their call for TikTok to be banned in the US due to its links to China through its parent company, ByteDance. They questioned TikTok’s data collection practices and the impact on children.
While these events play out on the national stage, the tug-of-war over content consumption is also taking place on a smaller scale in homes and schools across the country, affecting the lives and mental health of some users, especially young people.
College student Emma Lembke is taking action against what she sees as the harmful effects of social media on her generation. By her mid-teens, the founder of the Log Off Movement had had enough of social media.
“I remember I heard the buzz of my phone, probably a Snapchat notification, something trying to pull me in, and I instantly had that Pavlovian response to grab for it,” Lembke recalled. “Mid-grab, I caught myself, and I hit my breaking point. I wondered how I had allowed these apps to have so much control over me.”
Lembke was the last among her friends to be allowed on social media. She thought that world must be “mystical and magical and golden,” given her friends’ newfound and intense focus on it.
She remembers watching all of her friends pull away from her while she was mid-conversation. “It felt like a drop. … Each one would spend more and more time sucked into their phones and screens rather than talking with me in person.”
“I first got my social media accounts at the age of 12, in the sixth grade, starting with Instagram and making my way over the years to other apps and platforms like Snapchat,” she said. “But as I, a 12-year-old girl, began to spend more time on these apps, my mental and physical health really suffered.”
“I was scrolling, and through scrolling, constantly quantifying my worth through likes and comments and followers,” she said. “It deepened my social anxiety and my depression.”
The opaque algorithms of these platforms also pushed her down a dark road of unrealistic body standards, she said, leading her into harmful eating habits.
“I think the most nefarious aspect to it all … is that it’s not very overt. It’s not going to say, ‘Get an eating disorder, feel bad about your body, go home and, like, don’t eat anything for a day,’” she said. “It will not be that blunt. What it will do is it will slowly ease you into content that repeatedly reinforces those standards and those practices without overtly saying it.”
As a young person, what are you up against? An entire societal norm that says to get on social media.
What Lembke Is Up Against: Social Media, Big Tech and a Young Person’s Mental Health (on Chasing Life)
Today, Lembke is urging that Big Tech and social media companies be held accountable, testifying before the Senate Judiciary Committee in February about the effect of these platforms on her life.
You can hear more from Lembke, now an advocate for young people’s mental health, on this week’s Chasing Life, where CNN tech reporter Brian Fung also explains Section 230 and what is at stake in the coming Supreme Court ruling.