WASHINGTON (Reuters) - The U.S. Senate is expected to pass major online child safety reforms in a vote on Tuesday, although the legislation, which has drawn mixed reactions from the tech industry, faces an uncertain fate in the House of Representatives.
Two bills - the Children and Teens' Online Privacy Protection Act and the Kids Online Safety Act, nicknamed COPPA 2.0 and KOSA - would need to pass in the Republican-controlled House, currently on recess until September, to become law.
The bills cleared a bipartisan procedural vote in the Senate last week, with 86 senators in favor and just one opposed. Democrats control that chamber by a margin of 51-49 seats, while Republicans hold the House by 220-212.
COPPA 2.0 would ban targeted advertising to minors and data collection without their consent, and give parents and kids the option to delete their information from social media platforms.
Top U.S. social media platforms made an estimated $11 billion in advertising revenue from users younger than 18 in 2022, according to a Harvard study published last year.
KOSA would make explicit a "duty of care" that social media companies owe to minors using their products, focusing on how the platforms are designed and how the companies are regulated.
Executives at social media companies Snap Inc. and X said at a congressional hearing in January that they supported KOSA, while Facebook and Instagram owner Meta Platforms CEO Mark Zuckerberg and TikTok Chief Executive Shou Zi Chew said they disagreed with parts of it.
Tech industry groups and the American Civil Liberties Union have criticized the bill, saying that differing interpretations of harmful content could result in minors losing access to content related to vaccines, abortion, or LGBTQ issues.
Senators amended the language of the bill in response to such concerns earlier this year, in part by limiting the enforcement role of state attorneys general.
Josh Golin, executive director at Fairplay for Kids, a group that supports the bills, said KOSA requires companies to mitigate specific risks, such as content that promotes eating disorders.
"Obviously government officials can do things that are not legal, but this does not give government officials any legal basis for censorship," he said.