The Senate has approved two major online safety bills amid years of debate over the impact of social media on teen mental health. The Kids Online Safety Act (KOSA) and the Children and Teens Online Privacy Protection Act, also known as COPPA 2.0, passed the Senate in a 91-3 vote.
The bills now move to the House of Representatives, though it is unclear whether they have enough support there to pass. If enacted, they would be the most significant laws regulating tech companies in years.
KOSA requires social media companies like Meta to offer controls that disable algorithmic feeds and other “addictive” features for children under 16. It also requires companies to provide parental monitoring tools and to shield minors from content promoting eating disorders, self-harm, sexual exploitation, and other harms.
One of the most controversial provisions of the bill creates what is known as a “duty of care.” This means that platforms are required to prevent or mitigate certain harmful effects of their products, such as “addictive” features or algorithms that promote dangerous content. The Federal Trade Commission would be in charge of enforcing the rule.
The bill was first introduced but stalled amid pushback from digital rights and other advocacy groups, who argued the legislation would force platforms to censor content. A revised version, intended to address some of those concerns, was introduced last year, though the ACLU, EFF and other free speech groups still oppose the bill. In a statement last week, the ACLU said KOSA would encourage social media companies “to censor free speech” and “incentivize the elimination of anonymous browsing across broad swaths of the Internet.”
COPPA 2.0, on the other hand, has been less controversial among privacy advocates. It updates the Children’s Online Privacy Protection Act of 1998, overhauling the nearly 30-year-old law to better reflect the modern internet and social media landscape. If passed, it would prohibit companies from targeting advertising to children and from collecting personal data from teens between the ages of 13 and 16 without consent. It would also require companies to offer a “delete button” that removes children’s and teens’ personal information from a platform when “technologically feasible.”
The vote underscores how online safety has become a rare source of bipartisan agreement in the Senate, which has hosted numerous hearings on teen safety issues in recent years. The CEOs of Meta, Snap, Discord, X and TikTok testified at one such hearing earlier this year, during which South Carolina Sen. Lindsey Graham accused the executives of having “blood on their hands” for numerous safety failures.