
Will The Kids Online Safety Act (KOSA) Censor the Internet As We Know It?

Poke Trainer J


Electronic Frontier Foundation said:
The United States Congress has resurrected the Kids Online Safety Act (KOSA), a bill that would increase surveillance and restrict access to information in the name of protecting children online. KOSA was introduced in 2022 but failed to gain traction, and today its authors, Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), have reintroduced it with slight modifications. Though some of these changes were made in response to over 100 civil society organizations and LGBTQ+ rights groups’ criticisms of the bill, its latest version is still troubling. Today’s version of KOSA would still require surveillance of anyone sixteen and under. It would put the tools of censorship in the hands of state attorneys general, and would greatly endanger the rights, and safety, of young people online. And KOSA’s burdens will affect adults, too, who will likely face hurdles to accessing legal content online as a result of the bill.

KOSA Still Requires Filtering and Blocking of Legal Speech

Online child safety is a complex issue, but KOSA attempts to boil it down to a single solution. The bill holds platforms liable if their designs and services do not “prevent and mitigate” a list of societal ills: anxiety, depression, eating disorders, substance use disorders, physical violence, online bullying and harassment, sexual exploitation and abuse, and suicidal behaviors. Additionally, platforms would be responsible for patterns of use that indicate or encourage addiction-like behaviors.

Deciding what designs or services lead to these problems would primarily be left up to the Federal Trade Commission and 50 individual state attorneys general. Ultimately, this puts platforms that serve young people in an impossible situation: without clear guidance regarding what sort of design or content might lead to these harms, they would likely censor any discussions that could make them liable. To be clear: though the bill’s language is about “designs and services,” the designs of a platform are not causing eating disorders. As a result, KOSA would make platforms liable for the content they show minors, full stop. It will be based on vague requirements that any Attorney General could, more or less, make up.

Attorneys General Would Decide What Content is Dangerous To Young People

KOSA’s co-author, Sen. Blackburn of Tennessee, has referred to education about race discrimination as “dangerous for kids.” Many states have agreed, and recently moved to limit public education about the history of race, gender, and sexuality discrimination. If KOSA passes, platforms are likely to preemptively block conversations that discuss these topics, as well as discussions about substance use, suicide, and eating disorders. As we’ve written in our previous commentary on the bill, KOSA could result in loss of access to information that a majority of people would agree is not dangerous. Again, issues like substance abuse, eating disorders, and depression are complex societal issues, and there is not clear agreement on their causes or their solutions. To pick just one example: in some communities, safe injection sites are seen as part of a solution to substance abuse; in others, they are seen as part of the problem. Under KOSA, could a platform be sued for displaying content about them—or about needle exchanges, naloxone, or other harm reduction techniques?

The latest version of KOSA tries, but ultimately fails, to address this problem in two ways: first, by clarifying that the bill shouldn’t stop a platform or its users from “providing resources for the prevention or mitigation” of its listed harms; and second, by adding that claims under the law should be consistent with evidence-informed medical information.

Unfortunately, were an Attorney General to claim that content about trans healthcare (for example) poses risks to minors’ health, they would have no shortage of “evidence-informed” medical information on which to base their assertion. Numerous states have laws on the books claiming that gender-affirming care for trans youth is child abuse. In an article for the American Conservative titled “How Big Tech Turns Kids Trans,” the authors point to numerous studies that indicate gender-affirming care is dangerous, despite leading medical groups recognizing the medical necessity of treatments for gender dysphoria. In the same article, the authors laud KOSA, which would prohibit “content that poses risks to minors’ physical and mental health.”

The same issue exists on both sides of the political spectrum. KOSA is ambiguous enough that an Attorney General who wanted to censor content regarding gun ownership, or Christianity, could argue that it has harmful effects on young people.

KOSA Would Still Lead to Age Verification On Platforms

Another change to KOSA comes in response to concerns that the law would lead to age verification requirements for platforms. For a platform to know whether or not it is liable for its impact on minors, it must, of course, know whether or not minors use its platform, and who they are. Age verification mandates create many issues — in particular, they undermine anonymity by requiring all users to upload identity verification documentation and share private data, no matter their age. Other types of “age assurance” tools such as age estimation also require users to upload biometric information such as their photos, and have accuracy issues. Ultimately, no method is sufficiently reliable, offers complete coverage of the population, and has respect for the protection of individuals' data and privacy and their security. France’s National Commission on Informatics and Liberty, CNIL, reached this conclusion in a recent analysis of current age verification methods.

In response to these concerns, KOSA’s authors have made two small changes, but they’re unlikely to stop platforms from implementing age verification. Earlier versions would have held platforms liable if they “knew or should have known” that an impacted user was sixteen years of age or younger. The latest version of KOSA adds “reasonableness” to this requirement, holding platforms liable if they “know or reasonably should know” a user is a minor. But legally speaking, this doesn't result in giving platforms any better guidance.

The second change is to add explicit language that age verification is not required under the “Privacy Protections” section of the bill. The bill now states that a covered platform is not required to implement an age gating or age verification functionality. But there is essentially no outcome where sites don’t implement age verification. There’s no way for platforms to block nebulous categories of content for minors without explicitly requiring age verification. If a 16-year-old user truthfully identifies herself, the law will hold platforms liable, unless they filter and block content. If a 16-year-old user identifies herself as an adult, and the platform does not use age verification, then it will still be held liable, because it should have “reasonably known” the user’s age.

A platform could, alternatively, skip age verification and simply institute blocking and filtering of certain types of content for all users regardless of age—which would be a terrible blow for speech online for everyone. So despite these bandaids on the bill, it still leaves platforms with no choices except to institute heavy-handed censorship and age verification requirements. These impacts would affect not just young people, but every user of the platform.

There Are Better Ways To Fix The Internet

While we appreciate that lawmakers have responded to concerns raised about the bill, its main requirements—that platforms must “prevent and mitigate” complex issues that researchers don’t even agree the platforms are responsible for in the first place—will lead to a more siloed, and more censored, internet. We also stand by our previous criticisms of KOSA—that it unreasonably buckets all young people into a single category, and that it requires surveillance of minors by parents. They remain troubling aspects of the law.

There is no question that some elements of social media today are toxic to users. Companies want users to spend as much time on their platforms as possible, because they make money from targeted ad sales, and these ad sales are fueled by invasive data collection. EFF has long supported stronger competition laws and comprehensive data privacy legislation in part because they can open the field to competitors to today’s social media options, and force platforms to innovate, offering more user choice. If users are unhappy with the content or design of current platforms, they should be able to move to other options that offer different forms of content moderation, better privacy protections, and other features that improve the experience for everyone, including young people.

KOSA would not enhance the ability of users to choose where they spend their time. Instead, it would shrink the number of options, by making strict requirements that only today’s largest, most profitable platforms could follow. It would solidify today’s Big Tech giants, while forcing them to collect more private data on all users. It would force them to spy on young people, and it would hand government the power to limit what topics they can see and discuss online.

It is not a safety bill—it is a surveillance and censorship bill. Please tell your Senators and representatives not to pass it.
Discuss.
 
Last edited by a moderator:
hell yeah it will.
if you don't want your kids to see stuff on the internet, then just keep them off the internet or keep an eye on them when they do access it.
not to mention what it's going to do to lgbtq people (which... look at my pfp, it makes sense i would have strong feelings about this)
also, it's an american bill that's going to affect people worldwide. i swear, every time you think that country couldn't get more egotistical, it pulls another stunt like this.
so yeah. i get the sentiment, but what this bill is trying to do just sucks.
 
I am Australian, so this could have an impact here as well based on what you said. I think this is a double-edged sword, as it could begin a slippery slope towards censorship, but at the same time, there are websites out there that already have age verification. For example, many social media websites (such as Facebook) have a minimum age of 13, and there are adult websites with age verification that blocks anyone under the age of 18 from accessing them (and exposing minors to things like pornography is a criminal offence that can lead to jail time and being placed on a register). There are also things that should not be online anyway (such as instructional videos on committing acts of terrorism or, even worse, child abuse and child pornography), as well as conspiracy theory websites (which basically egged on the insurrection on January 6, 2021, and will no doubt have a field day with this issue). However, at the same time, we need to protect vulnerable groups (such as members of the LGBTQ+ community, migrants, indigenous people, people with disabilities, etc.) from online abuse by people who are adults and should know better than to abuse vulnerable groups.

So in conclusion, this issue is a double-edged sword: it could protect people, but at the same time it could censor the internet. Even though I am not American, I know that a lot of social media websites are privately owned and are therefore not bound by the Free Speech clause of the First Amendment (in fact, as this website is privately owned, it has Terms and Conditions that we need to follow, or we can be warned or even banned by moderators). So if it is a government website, then the First Amendment applies, but on privately run internet forums and social media (such as Bulbagarden), it does not. But first and foremost, supervision of children while online, as well as proper education (such as media and digital literacy education for both children and adults), is key to solving this problem.
 
C/W: Very strong language and an overall negative tone incoming, as this one put my brain into overdrive a lot more than I was prepared for. I’m actually glad that I got all of it out, instead of having it eat me up from the inside.

I’m glad to know that my bullshit radar really does work better than I think it does, at least. Those commercials really did have that smarmy-in-an-obnoxiously-unearned-way kind of feeling to them, even by the pitifully low standards of political ads. There might be hope for me and my ditzy ass yet, haha.

There is much to be said about the complicity of social media companies and how they very obviously prey on the vulnerabilities of children — and children-in-all-but-name, in mind rather than spirit, to be clear — in the name of those almighty ad dollars. Or in other words: just that good old green, as it’s been for all sorts of unscrupulous, sociopathic organizations since time immemorial. Nothing new under the sun, indeed, whether a shiny new phone or computer is involved in it all or just good old face-to-face hustling.

That said, also nothing new under the sun is the use of such easy, unsympathetic targets as scapegoats for shitty, irresponsible parents to defer responsibility to, allowing them to escape having to face their vast and catastrophic failures as, again, shitty and irresponsible parents. For all of the hoopla about the supposed sanctity of parenthood and all of that, parents really have only one job when it comes down to it, and one job alone: to teach their children how to survive independently in an often cold and unforgiving world. That’s literally it. It’s not to impose their “values” on their children or to mold them into a “good example” for their family or culture or anything like that, or even to “protect” them beyond what’s necessary for immediate survival against clear and present dangers, rather than scapegoats and boogeymen of nebulous actual relevance to a child’s safety and well-being. And what said parents think about their children and their own values is their most completely and utterly irrelevant concern of all, because children are not their parents and, again and to make it damn clear, the latter have only that one fucking job; nothing more, nothing less.

Of course, certain parents are all too willing to amend their mandate as guardians over their children when it’s convenient for them, all while simultaneously playing the helpless card whenever it’s inconvenient for them. It never fails to astound me how certain parents are so willing to do things like force their gay and lesbian children to “pray the gay away” (see: countless so-called “Christian” gay conversion programs), or refuse to allow gender affirming care for their trans children (see: the gaslighting of gender dysphoria sufferers as just “going through a phase”, when they’re not just flat-out equating transness with child sexualization or worse), or refuse to allow the teaching of the full history of our nation to their children, including Black and brown history (see: Florida, the most infamous out of many other examples), or literally murder their own disabled and neurodivergent children in cold blood or even entertain the idea of doing so (see: child murderer-adjacents like Alison Singer of Autism Speaks infamy for the latter — no, she’s not forgiven and never can be — and one too many examples of the former who I won’t give the dignity of naming here). And how even with the vast, deeply entrenched privilege wielded by these parents that allows them to get away with such heinous bullshit on its face; how even with such an endlessly effortless ability to flex what is basically absolute power and impunity over their children, they’ll still try to actually justify themselves. And how even then — even with the occasional high level of effort that they afford to bolstering said justifications — it all just really boils down to: “it’s because I’m a parent”. It’s just staggering, the straight-faced self-righteousness of it all; self-righteousness of the most dizzyingly circular variety that makes up look like down and 1+1 look like 3; the kind where child abuse, neglect, and murder are made to look like love.
And the ignorant and weak-minded ensure that such parents are constantly given the benefit of the doubt in society, even when there is no doubt that they’re all guilty as sin in their sheer drunkenness of endless, unchecked power.

And yet, when it comes to something as simple and entirely within the power of almighty parents as monitoring their child’s internet access and blocking problematic sites if the situation really calls for it, keeping track of everything that their children do out of their sight if said children really are that naïve and untrustworthy, or, you know, just not giving them a goddamned smartphone or internet access in the first place if they’re really that incapable of using either responsibly, then suddenly said almighty parents are so completely helpless, and suddenly it’s the responsibility of social media companies — who said parents have already established as untrustworthy if not flat-out predatory towards children — to play caretaker and babysitter. Suddenly, it’s the social media companies who have to take the fall and surrender their rights in such a way that it affects everyone — responsible parents and children included — all while the shitty, irresponsible parents become all but sainted heroes for being derelict in their one fucking job; one that they should theoretically be more than capable of handling given their vast amounts of privilege that allows them to get away with far worse things, when they’re not being celebrated or cheered on for doing them.

Also, something something “those who are willing to sacrifice a little liberty for a little safety deserve neither”. Freedom isn’t free, indeed, but it seems that people are only willing to invoke that phrase when it’s meant to mean: “let’s blow stuff up in another country!”, or: “don’t let them take away our guns!”, or something along those lines. Never when it’s meant to mean real freedom, for all.

And also, let’s not pretend that anyone in Congress actually gives a damn about children, or that this bill, like its many predecessors, isn’t really just one in a long line of examples of politicians exploiting the idea of children as innocent, helpless ingenues in order to gain political capital for “thinking of the children”. As if a bunch of old men and women — old men and women who don’t even represent the full racial, cultural, and gender makeup of the nation, at that — know anything about what America’s children need in this day and age, or as if they’re not complicit in said children’s suffering themselves through their seemingly endless ignorance, born of their privilege as adults in a grown-up’s world. Privilege unearned, as usual, with many of those so-called adults being no more mature or knowledgeable than the children that they so arbitrarily declare themselves superior to.

/vent, whew.
TL;DR: Don’t give parents more power when they already have too much power, which is too often abused by them…
 