The Kids Online Safety Act (KOSA) would censor the internet and make government officials the arbiters of what young people can see online. It would likely lead to age verification, handing more power, and more private data, to third-party identity verification companies like Clear or ID.me. The government should not have the power to decide which topics are “safe” online for young people, or to force services to remove and block access to anything that might be considered unsafe for children. This isn’t safety—it’s censorship.
Posted by a kid. Or a groomer.
Politics in the Mirror eh?
So, please correct me if I’m missing something. IANAL. I went ahead and read parts of the bill:
Nothing really jumps out as problematic for me. It’s preventing harmful content from being advertised to children via social media, but not stopping kids from independently researching things.
>(a) Prevention Of Harm To Minors.—A covered platform shall act in the best interests of a user that the platform knows or reasonably should know is a minor by taking reasonable measures in its design and operation of products and services to prevent and mitigate the following:
>(1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
I think we can agree that using evidence-informed medical information to prevent these mental health issues is a positive thing.
>(2) Patterns of use that indicate or encourage addiction-like behaviors.
This could lead to a reduction in serotonin-hacking apps that are worthless to the consumer, doomscrolling, etc.
>(3) Physical violence, online bullying, and harassment of the minor.
Good
>(4) Sexual exploitation and abuse.
The definition of sexual exploitation and abuse is probably the thing that has the LGBTQ community concerned. What is the bill’s definition?
Taking a detour, we can find the definition earlier in the bill:

>(10) SEXUAL EXPLOITATION AND ABUSE.—The term “sexual exploitation and abuse” means any of the following:
>(A) Coercion and enticement, as described in section 2422 of title 18, United States Code.
Title 18, Section 2422:

>(a) Whoever knowingly persuades, induces, entices, or coerces any individual to travel in interstate or foreign commerce, or in any Territory or Possession of the United States, to engage in prostitution, or in any sexual activity for which any person can be charged with a criminal offense, or attempts to do so, shall be fined under this title or imprisoned not more than 20 years, or both.
>(b) Whoever, using the mail or any facility or means of interstate or foreign commerce, or within the special maritime and territorial jurisdiction of the United States knowingly persuades, induces, entices, or coerces any individual who has not attained the age of 18 years, to engage in prostitution or any sexual activity for which any person can be charged with a criminal offense, or attempts to do so, shall be fined under this title and imprisoned not less than 10 years or for life.
Okay, so: grooming minors and encouraging them to come visit you, or kidnapping them and forcing them into prostitution.
>(B) Child sexual abuse material, as described in sections 2251, 2252, 2252A, and 2260 of title 18, United States Code.
Child pornography.
>(C) Trafficking for the production of images, as described in section 2251A of title 18, United States Code.
Buying or selling minors for the production of said material.
>(D) Sex trafficking of children, as described in section 1591 of title 18, United States Code.
Coercion, force and fraud for commercial sex acts with minors.
Feel free to review Title 18 here.
>(5) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol.
Besides the obvious case of not advertising vapes or boiled NyQuil to kids, the gambling provision could spell the end of lootboxes.
>(6) Predatory, unfair, or deceptive marketing practices, or other financial harms.
This can only be a good thing. No more helping John Wick by sharing your parents’ credit card number.
The bill continues:
>(b) Limitation.—Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude—
>(1) any minor from deliberately and independently searching for, or specifically requesting, content; or
>(2) the covered platform or individuals on the platform from providing resources for the prevention or mitigation of suicidal behaviors, substance use, and other harms, including evidence-informed information and clinical resources.

This says to me that the social media platforms cannot actively advertise this stuff to people they deem minors, not that the content needs to be completely scrubbed from the web. So long as the subject matter isn’t ending up on a 14-year-old’s FYP, they’re good to go.
I ran out of characters so I had to delete like half my post. But yeah, TL;DR: the bill prevents advertising of harmful material, not a full-on block of content.
You are definitely not a lawyer, and the people backing these bills intentionally use language that creates a specious justification for the erosion of privacy and freedom online.
This bill will require everyone to start using their government ID to post just about anything online, while allowing state AGs to censor basically anything they want in bad faith.
The Heritage Foundation, a right-wing hate group, has already made clear that they will use this to censor any/all LGBTQIA+ material.
Here is a lawyer providing a more detailed thread explaining the issues with this bill.
>You are definitely not a lawyer
Correct, but there’s no need to be rude.
Let’s take a look at what Ari Cohn is arguing:
“Platforms will still have to age-verify users, violating their First Amendment right to speak and access content anonymously.”

“The revisions made to KOSA just trade an explicit mandate for a vague one. Uncertainty about when knowledge of a user’s age will be implied leads to the same result as before: the only way a platform can be confident it is in compliance is by age-verifying every user. At best, language purporting not to require such verification ignores this practical reality. At worst, it is a deliberate obfuscation of the bill’s intended effect.”

Yeah, that was part of what I originally wrote and then had to delete. In retrospect I should have just split it and made replies. Oh well.
The bill mentions:

>SEC. 9. AGE VERIFICATION STUDY AND REPORT.
>(a) Study.—The Director of the National Institute of Standards and Technology, in coordination with the Federal Communications Commission, Federal Trade Commission, and the Secretary of Commerce, shall conduct a study evaluating the most technologically feasible methods and options for developing systems to verify age at the device or operating system level.
>(b) Contents.—Such study shall consider—
>(1) the benefits of creating a device or operating system level age verification system;
>(2) what information may need to be collected to create this type of age verification system;
>(3) the accuracy of such systems and their impact or steps to improve accessibility, including for individuals with disabilities;
>(4) how such a system or systems could verify age while mitigating risks to user privacy and data security and safeguarding minors’ personal data, emphasizing minimizing the amount of data collected and processed by covered platforms and age verification providers for such a system; and
>(5) the technical feasibility, including the need for potential hardware and software changes, including for devices currently in commerce and owned by consumers.
>(c) Report.—Not later than 1 year after the date of enactment of this Act, the agencies described in subsection (a) shall submit a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives.

So there isn’t necessarily a plan for Real ID out of the box; the study would have to be conducted to determine which age verification method would be most feasible. I understand the concerns about sharing your personal ID online. The study could very well conclude that the algorithms already in place are good enough to estimate a user’s likely age, the same way my TikTok FYP is filled with Millennial content just based on what I’ve liked. But sure, the possibility of having to register your personal ID with every social media company doesn’t sound too appetizing.
Continued In Reply
I think we’ll just have to wait and see how tech companies implement this and how it’s enforced. Even the study is, as the letter points out, just guidance: it isn’t enforceable and can be ignored. The bill itself says very little beyond that it doesn’t explicitly mandate “age gating” or extra data collection to determine age.
Also, as the letter itself points out:

>To date, COPPA has had negligible effects on adults because services directed to children under 13 are unlikely to be used by anyone other than children due to their limited functionality, effectively mandated by COPPA. But extending COPPA’s framework to sites “directed to” older teens would significantly burden the speech of adults because the social media services and games that older teens use are largely the same ones used by adults.
Would it be impossible to create separation between sites used by older teens and adults? A lot of it happens culturally anyway. I’m not as pessimistic as others are about this.
“KOSA’s duty of care is an unfixable idea that is impossible to satisfy and a violation of the First Amendment,” Cohn continued. “Minors are not a monolith, and what hurts one may help another. Requiring platforms to protect the vague, nonexistent best interests of minors as a whole will limit minors to only the blandest material safe for the most sensitive individual. This chilling effect is the precise reason courts have consistently held for decades that imposing a duty to protect listeners from harmful reactions to speech is unconstitutional.”
I think it’s pretty clear what content is unsuitable; it doesn’t seem very vague to me. You can’t realistically specify everything. As an example, 10 years ago I would never have predicted mukbang, but it’s insanely popular. Watching someone eat themselves into health issues and inspiring other people to do the same? There’s no benefit. It’s gross, it’s wasteful, it’s unhealthy, but it grabs people’s attention. With KOSA, that content can still exist, but platforms won’t be telling kids “just eat a bunch of crap food and you can be famous like Nikocado Avocado.” I think I’m OK with that.
“State attorneys general of all persuasions will find KOSA a useful tool in purging the Internet of content they disfavor,” Cohn concluded. “From hateful speech to LGBTQ content, KOSA’s duty of care provides the kind of ready-made censorship tool that ambitious attorneys general could only dream of. The burden and expense of a state investigation alone may be sufficient to pressure platforms to take down or restrict access to protected expression. Handing a weapon to politically motivated actors who have demonstrated that they will use any tools at their disposal to silence speech they disagree with is grossly irresponsible.”
The text of this bill says to me that it prevents advertising specific content to minors, not removing that content entirely. Is there evidence-informed medical information that says LGBTQ content causes any of the listed mental health issues? I don’t think so. Nothing in the sexual exploitation section seems to give any wiggle room for claiming LGBTQ content could be covered. Asshole conservatives in power will twist laws in crazy ways. However, we shouldn’t stop legislating things just because a small potential for abuse exists. The internet is a cesspool, and it should be made a little bit safer for people who can’t reason out that they are being exploited.
I think the conversation should be about preventing abuse of laws in general. The letter of this bill doesn’t seem bad, but I absolutely can see how it could be manipulated, such as becoming a backdoor for Real ID. But the bill couldn’t be used to completely remove content from the internet, only to reduce what gets recommended. The bill specifically says it does not require the complete removal of content; it’s just meant to prevent some content from being advertised to kids.
I’m happy to continue the dialogue if you are, @MiscreantMouse
Again, I think you are being very naive about the language in this bill, and attempting to apply a common-use interpretation rather than a legal one. It doesn’t matter what the bill says to you; it matters what the bill means for the legal system.
Why do you think so many legal & tech professionals are up in arms about this bill? Here is more information about the GOP’s plans to use this bill to censor LGBTQIA+ content.
>I think the conversation should be about preventing abuse of laws in general.
How do you expect this to happen in the real world? The GOP is very open about their plans to abuse this law, how do you expect to stop them?
I’ll do a little more reading on your link a little later. I do want to say, however, that it is incredibly frustrating to try to navigate an article like the one shared from Techdirt, which only links to itself and no outside sources. It makes verifying its claims harder than it should be.
Lol, ok, I’m sorry it’s so difficult. Anyway, it’s included in the link I provided above, but the ACLU, EFF, GLAAD, and over 90 organizations have sent an open letter to Congress outlining the dangers in this bill, so those ‘claims’ shouldn’t be too hard to verify.
I was referring to the link here:
Every hyperlink in that article just links back to its own website, which makes it hard to verify the claims it is making.
The letter you provided from the ACLU et al. is a response to an older version of the bill, located here:
I do not have time to review the older bill and compare it to the newer one, but I think it’s safe to say that, because the previous version was met with dissatisfaction, it was rewritten to address their concerns.