
Social-Media Firms Would Have to Consider Children’s Health Under Bill Passed by California Legislature



California’s Legislature passed a bill Tuesday that would for the first time in the U.S. require the makers of social-media apps such as Facebook, Instagram and TikTok to consider the physical and mental health of minors when designing their products.

The bill passed in a unanimous, bipartisan vote in the Assembly after doing the same in the state Senate on Monday. Both chambers are dominated by Democrats.

Gov. Gavin Newsom hasn’t indicated whether he would sign or veto the bill. A spokesman for the Democrat declined to comment.

“California is home to the tech innovation space, and we welcome that,” state Assembly member Buffy Wicks, a Democrat and the bill’s primary author, said Tuesday morning at a news conference urging Gov. Gavin Newsom to sign the bill. “But I also want to make sure our children are safe, and right now, they are not safe.”

Social-media companies opposed the bill, arguing that differing state laws regulating their apps would make compliance difficult.

Its passage comes after the failure of a separate measure that would have allowed government lawyers to sue social-media companies when their apps cause harm or addiction in children.

Representatives for companies including Meta Platforms Inc., Snap Inc. and Twitter Inc. lobbied aggressively against that measure.

The bill that passed Tuesday would require social-media companies to study products and features deemed likely to be accessed by minors to assess and mitigate potential harm before releasing them publicly. Those assessments would have to be given to the state attorney general, if requested, though the contents wouldn’t be subject to public disclosure.

It would also require companies to disclose their privacy policies in language children can understand and prohibit profiling of minors and the use of tools that encourage children to share personal information.

In addition, it would prohibit companies from tracking a child’s precise geolocation without notifying the child and would ban them from using children’s personal information in ways deemed detrimental to their health.


Companies found to violate the rules could face injunctions on their products and be fined up to $2,500 per affected child for each violation and up to $7,500 per child if the violation was intentional.

If signed by Mr. Newsom, the bill’s provisions would go into effect in July 2024.

Ms. Wicks said the bill is modeled after a similar law in the U.K. that requires social-media companies to build their products with children in mind. For instance, Alphabet Inc.’s Google has made safe search, which screens out potentially inappropriate content, its default browsing mode in the U.K., while TikTok and Instagram have disabled direct messaging between children and adults they don’t follow.

Representatives for Meta and TechNet, a tech-industry trade group, previously said they preferred a bill that regulated the design of their products, rather than one making them liable for harm to children, like the one that failed earlier this month.

But the industry representatives said they still oppose the measure that passed and are encouraging Mr. Newsom to veto it.

“We support the intent of this bill, and protecting children online remains a priority. But it must be done responsibly and effectively,” said Dylan Hoffman, executive director of California and the Southwest for TechNet, which lobbied extensively against both measures. “While this bill has improved, we remain concerned about its unintended consequences in California and across the country.”

He said TechNet is instead advocating for a federal privacy law that would provide national standards to protect kids online.

Lobbyists representing the tech industry and social-media companies unsuccessfully pushed to reduce the applicable age of the bill to 13 and for it to apply only to products and services “directed at” children rather than “likely to be accessed” by them.

Ms. Wicks has said she considered the bill that failed and the one that passed to be complementary measures that would require social-media companies to give priority to the mental health of teens.

[Video: WSJ’s Ryan Tracy unpacks the takeaways from testimony by Facebook’s global head of safety, after senators on both sides of the aisle accused the company of disregarding internal research showing its Instagram app is harmful for significant numbers of teen girls. Published 10/1/2021. Photo illustration: Diana Chan]

Social-media companies have come under fire over the past year for the impact that platforms can have on young people. Last fall, reporting from The Wall Street Journal and subsequent congressional hearings revealed internal research from then-Facebook that suggested the company knew its algorithms were harming the mental health of teenage girls.

Meta Chief Executive Mark Zuckerberg has said the hearings painted a false picture of the company, while Meta officials have said the internal research was inconclusive.

Nichole Rocha, head of U.S. affairs for the 5Rights Foundation, a group that advocates for children online and supported the bill, said she and other proponents have had dozens of meetings with tech-industry stakeholders, trying to find common ground.

Among the changes won by tech advocates were the establishment of a “right to cure,” which lets companies avoid penalties by fixing violations, and an extension of that cure period to 90 days from the original 45.

“I think we’ve given a lot of concessions,” Ms. Rocha said. “They’re still opposed.”

Also Tuesday, the Legislature approved a bill that would require social-media companies to disclose their policies aimed at online hate, disinformation and extremism and to disclose metrics and data about enforcement of those policies.

Write to Christine Mai-Duc at [email protected] and Meghan Bobrowsky at [email protected]

Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.

