
Big Tech’s legal bulldog at the center of Supreme Court




All across the country, lawsuits are piling up challenging state laws that attempt to regulate Big Tech platforms in one way or another. Some of the suits take aim at laws requiring new safeguards for kids online. Others are targeting attempts to govern how social media platforms can moderate political speech. All of the litigation has at least one thing in common: the Big Tech lobbying firm behind it. 

Over the past few years, NetChoice, which was founded in 2001, has emerged as the tech industry’s first line of defense against state-level regulation. Since 2022, NetChoice has filed lawsuits against online child safety laws in California, Ohio, Arkansas, and Utah. It also joined with another tech industry group, CCIA, to challenge Texas and Florida’s social media “censorship” laws, both of which would limit how social media platforms moderate content. On Monday, the lawsuits against Texas and Florida—NetChoice v. Paxton and Moody v. NetChoice, respectively—will head to the Supreme Court, where the justices will determine whether the states can force social media platforms to publish certain political speech or whether doing so would violate the platforms’ own First Amendment rights.

The flurry of legal activity has put a spotlight on NetChoice, which, according to Bloomberg, saw its revenue grow from $3 million in 2020 to $34 million in 2022. Fast Company spoke with the firm’s vice president and general counsel, Carl Szabo, about what’s at stake in next week’s case—and the many others NetChoice is fighting across the country. This interview has been edited for length and clarity.

Looking ahead to next week’s arguments in the Texas and Florida social media cases, what do we know about how the Supreme Court justices view this issue?

We have a Supreme Court decision from just last term, which gives us an idea of where this case is going and how this case is going to shake out. That case is 303 Creative v. Elenis, and in that case, you had a website designer being forced by the state of Colorado to create and host content that they didn’t want to host. [Editor’s note: The designer did not want to make wedding websites for same-sex couples.] You saw every conservative justice—all six of them—say the First Amendment is pretty clear on this: The state of Colorado cannot force a private business to say something it does not want to say. The liberal justices on the court ruled in the dissent, but their basis was that this was a protected class. It was about civil rights.

In the cases of Florida and Texas, it’s all about political speech. [Politicians] are not a protected class. I think it’s pretty easy to see how the liberal justices will also rule in favor of NetChoice and CCIA. Because it’s not just 303 Creative. We have about 200 years of Supreme Court decisions to back up our position. At the end of the day, if the state can force you to carry political speech against your will, the First Amendment has failed. 

If NetChoice loses, what are the immediate implications for tech companies operating in Texas and Florida?

They can’t just turn off a light switch to the states of Florida and Texas. Texas, in particular, has a provision in its law that says [platforms] still have to provide service. That means all of the websites would have to show really horrible, vile content. They essentially have to begin turning off their content moderation systems. I don’t know how you do that and keep the lights on, because users will run away in droves, and advertisers will run away in droves. YouTube had many advertisers leave the platform because ads were inadvertently being shown alongside controversial content. X is another example where you’ve seen advertisers leave because of the way the site is being run. 

Some of the other cases NetChoice has brought in California, Arkansas, Utah, and Ohio have to do with restricting minors’ access to social media platforms, including by verifying users’ ages. What are the core arguments against that kind of legislation?

They are unconstitutional. I have three different federal judges all saying that the laws in California, Arkansas, and Ohio are unconstitutional and violative of the First Amendment. Teenagers have First Amendment protection. This amendment doesn’t stop just because you’re under the age of 18. The First Amendment applies to us all.

These laws ultimately require collection of information to verify your age. Some of the suggestions in states have been: Oh, just hand over your driver’s license. It’s a violation of anonymity on the internet. This is essentially a ban on speech, a ban on access to information. 

What type of legislation would NetChoice support to protect kids online?

Two examples of laws that are in place today, which I wish were in place in my home state of Maryland, are laws from the states of Virginia and Florida, which require digital education as part of the school curriculum.

Another thing that we are supportive of and helping with is closing loopholes on existing law with regard to artificial intelligence and child sexual abuse material. Today our child sexual abuse material laws, for the most part, do not cover artificial intelligence. So we’re working with lawmakers to close those gaps. We are also supportive of and pushing for federal legislation that would give law enforcement more resources to put child abusers behind bars.

At the same time, we are actively and aggressively pushing for a comprehensive federal privacy bill, because that’s something that is good for everyone. I do think comprehensive privacy legislation is going to be a very big one.

Isn’t that what people say every year?

Here’s the reason: [House Energy and Commerce Committee] Chairwoman Cathy McMorris Rodgers has made it one of her main goals, and she’s just announced her retirement. This is her last chance to get it done.

What do you make of efforts by Meta and others to argue that it’s app store owners like Apple that need to provide a way for parents to verify their kids’ ages on their phones?

It’s an interesting idea, but I don’t think it’s a solution. The solution is engaging directly with parents. When it comes to parenting, you have to be engaged. You have to set limits yourself, not the device. That’s what being a parent is.  

Everyone freaked out that video games were going to ruin our kids. Prior to that, it was rap music. Before that, it was television. Before that, it was radio and Elvis Presley. The thing that makes me constantly optimistic is that parents are smart. Parents are capable. And, unfortunately, there are too many people out there saying parents are not able to do it.

I think that’s a little too easy. Most parents I know are not just handing their kids a phone and saying, “Have at it.” We’re talking about what more parents can be doing. We’re talking about what more lawmakers can do to close loopholes that exist. But what more can the companies be doing?

They can always do more. And they always are doing more. Even if you don’t think these companies have an ounce of decency in their souls and believe all they care about is money, it’s important to recognize that they would still try to make their sites safe for everyone, because that’s good for business.




