5th Circuit ruling on Texas social media law has tech companies thinking


At some point in the future, Texans who visit social media sites might be greeted with a pop-up screen saying something like: “The content you are about to see contains graphic violence, white supremacist imagery and other objectionable material. If you don’t want to be exposed, click here.”

The pop-up is among a slew of options companies are weighing in response to a Texas social media law that was upheld by the U.S. Court of Appeals for the 5th Circuit last month. Most of the options being floated would alter tech company services so dramatically that some experts have concluded they would be virtually impossible to execute, say lobbyists who work with the companies.

Proponents of the Texas law, and a similar one in Florida, have said the legislation will prevent tech companies from engaging in censorship by banning them from taking down posts featuring political viewpoints that they disagree with. But the wording of the Texas law effectively bars the companies from moderating or blocking any content that is not already illegal, paving the way, experts say, for terrorist recruiting, white supremacist organizing, posts egging on people with eating disorders, vaccine disinformation, and other harmful material that many websites currently ban.

Though the laws in both states are products of conservative lawmakers, the 5th Circuit’s decision on the Texas law contradicts some long-standing Supreme Court opinions supporting First Amendment protections for corporations — opinions that conservatives at one time hailed. It also stands in contrast to a ruling in May from the U.S. Court of Appeals for the 11th Circuit striking down a similar Florida law. The conflict means the law probably will be considered by the U.S. Supreme Court, where conservative justices have repeatedly supported corporations’ First Amendment rights in cases such as Citizens United, a 2010 ruling that upended long-standing limits on corporate political spending that the court said restricted corporations’ rights to engage in political speech.

Despite their hope that the Supreme Court ultimately will reject the law, Silicon Valley companies are starting to prepare for worst-case scenarios, gaming out responses in planning exercises called “sandboxing,” said Carl Szabo, vice president and general counsel for NetChoice, one of the tech company lobbying groups that has challenged the Texas law. The group’s members include Meta, TikTok, Google, Nextdoor, and dozens of other services.



The strategizing falls into four general areas, the most radical of which includes the possibility of the companies shutting down their services entirely in Texas and potentially any other states where copycat bills have been introduced.

Tech companies could also build the “pop-up screens” that would greet users, letting them know that the material they are about to see could be highly disturbing and giving them the option to opt in to a more moderated environment, said Daphne Keller, director of the Program on Platform Regulation at the Cyber Policy Center at Stanford University.
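For illustration only, here is a minimal TypeScript sketch of the kind of opt-in gate Keller describes, assuming a browser context; the function names and the "feedMode" storage key are hypothetical, not any platform’s actual code.

```typescript
// Hypothetical sketch of a one-time "opt in to moderation" interstitial.
// All names here (FeedMode, resolveFeedMode, the "feedMode" storage key)
// are illustrative assumptions, not a real platform API.

type FeedMode = "moderated" | "unmoderated";

function getSavedMode(): FeedMode | null {
  return window.localStorage.getItem("feedMode") as FeedMode | null;
}

function promptForMode(): FeedMode {
  // A real product would render a styled interstitial dialog; window.confirm
  // stands in for the choice the article describes.
  const optIn = window.confirm(
    "The content you are about to see may contain graphic violence and other " +
      "objectionable material. Click OK to opt in to a moderated experience."
  );
  const mode: FeedMode = optIn ? "moderated" : "unmoderated";
  window.localStorage.setItem("feedMode", mode);
  return mode;
}

export function resolveFeedMode(): FeedMode {
  // Show the warning only on first visit; afterwards reuse the stored choice.
  return getSavedMode() ?? promptForMode();
}
```

Even so, as Szabo notes further down, it is unclear whether a click-through like this would count as genuine consent under the law.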

Companies also have explored the risky proposition of stopping all moderation — essentially complying with the law to a T — and waiting for mass public protest or for people to flee their products. And some have floated the idea of “lobotomizing” the content on their services, making it so fluffy that there are no grounds for removing anything, said Matt Schruers, president of the Computer & Communications Industry Association (CCIA), the other tech industry group fighting the law.

“The unifying factor in all these options is utter confusion,” Schruers said.

Szabo said that technology companies had “actually sat down and tried to figure out how to implement the Texas law,” but that right now most of the possibilities seemed impossible to implement, legally questionable, or likely to cost them tens of millions of customers.

“Some of the greatest technical minds on the planet have come together, but they can’t make it work because what Texas and Florida are essentially doing is asking platforms to square a circle,” he said.

The experts likened the law to forcing Barnes & Noble bookstores to host copies of Adolf Hitler’s Mein Kampf manifesto, or requiring newspapers such as The Washington Post to publish op-eds by self-proclaimed neo-Nazi candidates.

Tech companies built their capacity to remove, demote, and moderate content on their services reluctantly, at first doing the bare minimum to comply with laws in the U.S. that prohibit services from hosting copyrighted material or child pornography, and with laws in Europe that ban pro-Nazi speech. In its early years, Facebook tried to distinguish itself from its then-competitor Myspace by setting a higher bar for appropriateness, banning nudity and speech that called for violence outright, for example, and hiring a small number of moderators to enforce its rules.


But the company soon ran into the complexities of content moderation when it mistakenly took down a famous Vietnam War photo of a nude girl running from napalm bombs dropped by South Vietnamese planes. After protests, the company restored the photo and added an exception for newsworthiness to its policies banning nudity.

In 2017, social media companies in Silicon Valley were hauled in front of Congress to account for revelations that Russian operatives had sowed widespread disinformation on their services in the presidential election the previous year. In response, companies like Facebook and Google-owned YouTube hired tens of thousands of moderators, essentially giving birth to a content moderation industry overnight. With each new rule, the tech companies hired more moderators and built software to screen for potentially problematic content.

The pandemic brought more rules and more takedowns by people and by algorithms, as companies banned coronavirus misinformation, such as posts opposing masks or hawking false cures.

The content moderation boom reached an inflection point after the Jan. 6, 2021, riot at the U.S. Capitol, when tech companies banned former president Donald Trump’s social media accounts. Trump’s banning prompted a conservative backlash, leading to the laws in Florida and Texas.

Concerns that social media sites were too slow to move against misinformation and calls to violence also have prompted liberal legislative responses. A California law passed last month requires platforms to make twice-annual filings with the state’s attorney general spelling out their content moderation policies regarding hate speech, disinformation and extremism.


There are no similar federal laws.

Because the Texas law applies to any tech service with more than 50 million users, experts say it would also cover companies that have nothing to do with political speech, such as Pinterest, Etsy, and Yelp. Those companies are in an even tougher position than the large platforms because they don’t have the financial wherewithal to resist all the challenges they might face under the law, said Alex Feerst, former head of legal for the social-media platform Medium and a consultant for tech companies on content moderation issues.

In theory, he said, the law could prevent a company like Etsy from removing pro-Nazi statements posted as part of a listing for a custom baby crib. It also allows anyone to bring a lawsuit on the grounds that their viewpoint was discriminated against, subjecting medium-sized companies to a wave of litigation that could be crippling.


“It’s a nail-biter for smaller companies because they don’t have the resources that large companies do, but still they could be sued by anyone,” Feerst said.

Keller said that some of the options tech companies are weighing would be a minefield to navigate — technically, legally, and in terms of impact on a company’s business.

The strategy of shutting down service in only one state could be technically challenging and would be massively costly, since Texas is the country’s second most-populous state (Florida is third). It also would be difficult for companies to detect whether a Texas resident is signing in from another state.
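As a rough illustration of that detection problem, here is a hedged TypeScript sketch of state-level geofencing, assuming a generic IP-geolocation lookup; lookupRegion and the other names are invented for this example and do not refer to any real service or platform code.

```typescript
// Illustrative sketch of why blocking or flagging users by state is unreliable.
// `lookupRegion` is a hypothetical stand-in for a commercial geo-IP service.

interface GeoResult {
  country: string;        // e.g. "US"
  region: string | null;  // e.g. "TX"; often null or wrong for mobile and VPN traffic
}

async function lookupRegion(ipAddress: string): Promise<GeoResult> {
  // Placeholder: a real platform would query a geo-IP database here. Returning
  // an unknown region models the common case where the address cannot be
  // resolved to a U.S. state at all.
  return { country: "US", region: null };
}

async function isLikelyTexasUser(
  ipAddress: string,
  profileState?: string
): Promise<boolean> {
  const geo = await lookupRegion(ipAddress);

  // IP geolocation resolves to a carrier's or VPN's exit point, not a person's
  // residence, so a Texan traveling out of state can look like a non-Texan,
  // and vice versa.
  const ipSaysTexas = geo.country === "US" && geo.region === "TX";

  // A self-declared profile location is the other common signal, and it is
  // optional and unverified.
  return ipSaysTexas || profileState === "TX";
}
```

Neither signal answers the question the law actually poses — where the user resides — only where a given request appears to come from.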

The pop-up option might not be legally enforceable because officials in Texas could argue that users aren’t truly giving consent to moderation, Szabo said.

Removing all political material from a social-media service would probably not work because just about anything could be construed as a political viewpoint, Schruers said.

Experts said the assumption that the court would strike down the law also is risky in the wake of the Dobbs decision that overturned the landmark abortion ruling Roe v. Wade. Even a Supreme Court decision that struck down some aspects of the law but allowed other parts to go into effect would send shock waves through Silicon Valley.

Keller said a result that left some parts of the law intact would dramatically alter how technology and media companies do business, perhaps causing them to rewrite all the algorithms that serve content, fire thousands of moderators, and upend their practices for policing speech.

“There’s a very turbulent legal landscape ahead,” she said. “It’s like Dobbs in that everyone feels that the law is up for grabs, that justices will act on their political convictions and would be willing to disregard precedent.”
