Congress has resurrected the Kids Online Safety Act (KOSA), a bill that would increase surveillance and restrict access to information in the name of protecting children online. KOSA was introduced in 2022 but failed to gain traction, and today its authors, Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), have reintroduced it with slight modifications. Though some of these changes were made in response to criticisms from over 100 civil society organizations and LGBTQ+ rights groups, the latest version of the bill is still troubling. Today's version of KOSA would still require surveillance of anyone sixteen and under. It would put the tools of censorship in the hands of state attorneys general, and would greatly endanger the rights and safety of young people online. And KOSA's burdens will affect adults, too, who will likely face hurdles to accessing legal content online as a result of the bill.
TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT
KOSA Still Requires Filtering and Blocking of Legal Speech
Online child safety is a complex issue, but KOSA attempts to boil it down to a single solution. The bill holds platforms liable if their designs and services do not “prevent and mitigate” a list of societal ills: anxiety, depression, eating disorders, substance use disorders, physical violence, online bullying and harassment, sexual exploitation and abuse, and suicidal behaviors. Additionally, platforms would be responsible for patterns of use that indicate or encourage addiction-like behaviors.
Deciding which designs or services lead to these problems would primarily be left up to the Federal Trade Commission and 50 individual state attorneys general. Ultimately, this puts platforms that serve young people in an impossible situation: without clear guidance about what sort of design or content might lead to these harms, they would likely censor any discussions that could make them liable. To be clear: though the bill's language is about "designs and services," the designs of a platform are not what cause eating disorders. As a result, KOSA would make platforms liable for the content they show minors, full stop, based on vague requirements that any attorney general could, more or less, make up.
Attorneys General Would Decide What Content is Dangerous To Young People
KOSA’s co-author, Sen. Blackburn of Tennessee, has referred to education about race discrimination as “dangerous for kids.” Many states have agreed, and recently moved to limit public education about the history of race, gender, and sexuality discrimination. If KOSA passes, platforms are likely to preemptively block conversations that discuss these topics, as well as discussions about substance use, suicide, and eating disorders. As we’ve written in our previous commentary on the bill, KOSA could result in loss of access to information that a majority of people would agree is not dangerous. Again, issues like substance abuse, eating disorders, and depression are complex societal issues, and there is not clear agreement on their causes or their solutions. To pick just one example: in some communities, safe injection sites are seen as part of a solution to substance abuse; in others, they are seen as part of the problem. Under KOSA, could a platform be sued for displaying content about them—or about needle exchanges, naloxone, or other harm reduction techniques?
The latest version of KOSA tries, but ultimately fails, to address this problem in two ways: first, by clarifying that the bill shouldn’t stop a platform or its users from “providing resources for the prevention or mitigation” of its listed harms; and second, by adding that claims under the law should be consistent with evidence-informed medical information.
Unfortunately, were an Attorney General to claim that content about trans healthcare (for example) poses risks to minors’ health, they would have no shortage of ‘evidence-informed’ medical information on which to base their assertion. Numerous states have laws on the books claiming that gender-affirming care for trans youth is child abuse. In an article for the American Conservative titled “How Big Tech Turns Kids Trans,” the authors point to numerous studies that indicate gender-affirming care is dangerous, despite leading medical groups recognizing the medical necessity of treatments for gender dysphoria. In the same article, the authors laud KOSA, which would prohibit “content that poses risks to minors’ physical and mental health.”
The same issue exists on both sides of the political spectrum. KOSA is ambiguous enough that an Attorney General who wanted to censor content regarding gun ownership, or Christianity, could argue that it has harmful effects on young people.
KOSA Would Still Lead to Age Verification On Platforms
Another change to KOSA comes in response to concerns that the law would lead to age verification requirements for platforms. For a platform to know whether it is liable for its impact on minors, it must, of course, know whether minors use its platform, and who they are. Age verification mandates create many issues — in particular, they undermine anonymity by requiring all users to upload identity documents and share private data, no matter their age. Other "age assurance" tools, such as age estimation, also require users to upload biometric information such as their photos, and have accuracy problems. Ultimately, no method is sufficiently reliable, covers the entire population, and respects individuals' privacy, data protection, and security. France's National Commission on Informatics and Liberty (CNIL) reached this conclusion in a recent analysis of current age verification methods.
In response to these concerns, KOSA's authors have made two small changes, but they're unlikely to stop platforms from implementing age verification. Earlier versions would have held platforms liable if they "knew or should have known" that an impacted user was sixteen years of age or younger. The latest version of KOSA adds "reasonableness" to this requirement, holding platforms liable if they "know or reasonably should know" a user is a minor. But legally speaking, this gives platforms no better guidance.
The second change is to add explicit language that age verification is not required under the “Privacy Protections” section of the bill. The bill now states that a covered platform is not required to implement an age gating or age verification functionality. But there is essentially no outcome where sites don’t implement age verification. There’s no way for platforms to block nebulous categories of content for minors without explicitly requiring age verification. If a 16-year-old user truthfully identifies herself, the law will hold platforms liable, unless they filter and block content. If a 16-year-old user identifies herself as an adult, and the platform does not use age verification, then it will still be held liable, because it should have “reasonably known” the user’s age.
A platform could, alternatively, skip age verification and simply institute blocking and filtering of certain types of content for all users regardless of age—which would be a terrible blow for speech online for everyone. So despite these bandaids on the bill, it still leaves platforms with no choices except to institute heavy-handed censorship and age verification requirements. These impacts would affect not just young people, but every user of the platform.
There Are Better Ways to Fix The Internet
While we appreciate that lawmakers have responded to concerns raised about the bill, its main requirement—that platforms must "prevent and mitigate" complex issues that researchers don't even agree the platforms are responsible for in the first place—will lead to a more siloed, and more censored, internet. We also stand by our previous criticisms of KOSA: that it unreasonably buckets all young people into a single category, and that it requires surveillance of minors by parents. Both remain troubling aspects of the bill.
There is no question that some elements of social media today are toxic to users. Companies want users to spend as much time on their platforms as possible, because they make money from targeted ad sales, and these ad sales are fueled by invasive data collection. EFF has long supported stronger competition laws and comprehensive data privacy legislation in part because they can open the field to competitors to today’s social media options, and force platforms to innovate, offering more user choice. If users are unhappy with the content or design of current platforms, they should be able to move to other options that offer different forms of content moderation, better privacy protections, and other features that improve the experience for everyone, including young people.
KOSA would not enhance the ability of users to choose where they spend their time. Instead, it would shrink the number of options, by making strict requirements that only today’s largest, most profitable platforms could follow. It would solidify today’s Big Tech giants, while forcing them to collect more private data on all users. It would force them to spy on young people, and it would hand government the power to limit what topics they can see and discuss online.
It is not a safety bill—it is a surveillance and censorship bill. Please tell your senators and representatives not to pass it.
The post The Kids Online Safety Act is Still A Huge Danger to Our Rights Online | Electronic Frontier Foundation appeared first on TheWatchTowers.org.