Establishes the New York children's online safety act; requires operators of covered platforms to conduct age verification to determine whether a user is a covered minor; requires operators of covered platforms to utilize certain default privacy settings for covered minors; requires operators of covered platforms to require parental approval of certain activity related to a covered minor's covered platform account.
NEW YORK STATE ASSEMBLY MEMORANDUM IN SUPPORT OF LEGISLATION submitted in accordance with Assembly Rule III, Sec 1(f)
 
BILL NUMBER: A6549
SPONSOR: Rozic
 
TITLE OF BILL:
An act to amend the general business law, in relation to establishing
the New York children's online safety act
 
PURPOSE:
The purpose of this bill is to prevent the explosive growth of child
predators on certain digital platforms by defaulting to certain privacy
and security settings for child users.
 
SUMMARY OF SPECIFIC PROVISIONS:
Section one of this bill names it the New York Children's Online Safety
Act (NYCOSA).
Section two of this bill creates a new Article 45-A in General Business
Law (GBL) to mandate certain privacy settings by default for child users
under the age of 18. The mandate would apply to all social media
platforms covered under NYCOSA, defined as digital platforms which host
user-generated content, allow users to construct a public or semi-public
profile, and allow users to directly message each other as a significant
part of the provision of such platform. The New York State Attorney
General would be empowered to further define scope in regulations
promulgated pursuant to this act, just as they are already doing for
similar statutes such as the SAFE for Kids Act (Article 45 of GBL).
All social media platforms under NYCOSA would be required to turn off
open chat functions, which allow adults to instantly and privately
communicate with child users whether or not they know such child or have
been previously connected. Unconnected users would also be barred from
viewing the profile of a child user, tagging them in a post, or sending
them digital currency. Parents, however, would be able to override
these default privacy settings and switch to a different setting if
they so choose. Parents would also be notified when a child user
attempts to
change these settings on their own, at which point the parent would be
able to either approve or deny the change.
These contact settings would notably not apply to connections which a
parent and/or child has previously approved by accepting a friend
request. For all child users under the age of 13, parents would have to
approve incoming friend requests and would also be able to view the list
of their child's current friends. For child users 13 and over, the child
themself can approve friend requests and the parent is not granted this
visibility.
For all users under the age of 18, parents would also be required to
approve all financial transactions related to their child's account.
Social media platforms must set up a mechanism by which a parent can
view financial transactions of a child user's account at any time.
Social media operators would be required to conduct commercially
reasonable age verification to determine which of their users is a
minor covered by the provisions of the bill, which many social media
platforms are already required to conduct under Article 45 of GBL as
well as various laws in other countries and states (Sakasegawa, J.
(2024, August 29). The state of age verification in social media: an
overview. Persona.
https://withpersona.com/blog/age-verification-in-social-media).
Operators would be barred from deploying dark patterns, defined as any
mechanism or design on a platform which intentionally inhibits user
choice and/or autonomy, in order to prevent any user or their parent
from exercising their rights under this article. One example of a dark
pattern in the context of the New York Children's Online Safety Act
might be a mechanism that technically allows parents to view their
under-13 child's connected accounts and financial transactions but is so
difficult to access that it is essentially useless. Operators would not
be able to induce parents to change the required privacy settings in
this bill by, for example, degrading the quality or increasing the price
of the platform. Enforcement against violations of the bill would be
vested in the New York State Attorney General, who would be empowered
to pursue civil penalties of up to $5,000 per violation.
Section three of this bill is a severability clause.
Section four of this bill sets the effective date.
 
JUSTIFICATION:
Child safety experts estimate that there are approximately 500,000
online predators active on any given day. According to the FBI, over 50%
of the victims of online sexploitation are between the ages of 12 and
15, and an estimated 89% of sexual advances occur in Internet chatrooms
or through instant messaging (Kraut, M. E. (2024). Children and
Grooming / Online Predators. Child Crime Prevention & Safety Center.
childsafetylosangelescriminallawyer.pro). Fifty-eight percent of
parents report being concerned about online predation, yet the parents
of only seven percent of the targets of such behavior were aware that
their children had received inappropriate content from an adult. Forty
percent of children in grades four through eight report
chatting online with a stranger, and Internet use amongst three- to
four-year-olds has doubled within the last five years (Lazic, M. (2023,
May 19). How Many Predators Are Online Each Day? (Online Predators
Statistics). Legaljobs.io).
Virtual platforms like Facebook, Instagram, Snapchat, TikTok, X, and
Roblox, where adult users can collect vast troves of information about
child users and lure them into private chats within minutes, have become
veritable hunting grounds for pedophiles in the modern era. Over 80% of
child sex crimes can be traced back to social media, and reports of
online child exploitation surged by a staggering 106% in the early days
of the COVID-19 lockdown when many households moved online (Lazic, M.
(2023, May 19). How Many Predators Are Online Each Day? (Online
Predators Statistics). Legaljobs.io). Many platforms have thus taken the
responsible step of creating certain "privacy by default" settings for
users under a certain age, meaning that the strictest possible privacy
settings are applied without manual input. Such settings limit which
types of adult users can message and tag underage accounts.
Despite these efforts, however, critical gaps in the online safety net
remain: platforms turn a blind eye to the millions of underage users who
lie about their age to create an account, bolstered by the 26-year-old
federal Children's Online Privacy Protection Act (COPPA), which only
holds them liable if they have "actual knowledge"
that a user is under the age of 13 - a high legal bar which is virtually
impossible to clear in court. The Federal Trade Commission
(FTC) openly admits that there is nothing in COPPA to prevent users from
lying about their age, assuring companies that they need only establish
a date-of-birth portal for users to self-report age - despite such
portals' notorious unreliability (Federal Trade Commission. (2020, July
20). Complying with COPPA: Frequently asked questions. Federal Trade
Commission.).
Furthermore, even where a platform does know a user's age, not all have
chosen to deploy privacy by default for minors' accounts. The popular
gaming app Roblox, for example, the subject of several sweeping press
investigations, boasts an open chat function wherein a gamer of any age
can post anything they want in a game chat and privately message other
users. Highly contentious amongst child safety experts, Roblox's open
chat leaves it "to parents to activate child safety features such as
restricting what categories of people their kids can talk to, or which
games they can play. If parents don't, children can introduce themselves
to any stranger in a game, chat for hours and accept requests to
converse in private messages."
(Carville, O., & D'Anastasio, C. (2024, July 22). Roblox Is Fighting to
Keep Pedophiles Away and Not Always Winning. Bloomberg.com.). In
November 2023, Roblox announced it would be launching a new feature,
Roblox Connect, that enables users as young as 13 to initiate avatar
voice calls, complete with facial motion tracking technology, with any
other user - despite 13-year-olds being the prime target demographic
for online predators (Hatmaker, T. (2023, September 8). Roblox is
launching avatar-based voice calls with facial motion tracking.
TechCrunch.). Roblox Connect was immediately panned by the National Center
on Sexual Exploitation, which pointed out that such voice chats are one
of the primary methods by which online predators groom their victims,
establishing emotional connections via an impossible-to-monitor medium
with the intent of gaining their trust.
Features such as open chat and Roblox Connect have not escaped the
attention of the Internet's least savory characters: in 2023, Roblox
reported 13,316 instances of child exploitation to the National Center
for Missing & Exploited Children and responded to 1,300 requests for
information from law enforcement (Carville, O., & D'Anastasio, C. (2024,
July 22). Roblox Is Fighting to Keep Pedophiles Away and Not Always
Winning. Bloomberg.com.). With little to no barrier to entry (Roblox
allows users to sign up without emails or parental permission), child
users on the platform can find games revolving around sex and virtual
"strip clubs" within minutes
(Roblox: A Mainstream Contributor to Sexual Exploitation. (n.d.).
National Center on Sexual Exploitation.
https://endsexualexploitation.org/roblox/). An October 2024 study by
the investment research firm Hindenburg Research LLC found that
researchers were unable to create a test account with the name "Jeffrey
Epstein" as it, along with more than 900 variations, was already taken
("Roblox: Inflated Key Metrics for Wall Street and a Pedophile
Hellscape for Kids." Hindenburg Research, 8 Oct. 2024,
hindenburgresearch.com/roblox/). Usernames were also taken for another
notorious child abuser, Earl Brian Bradley, who was indicted on 471
charges of molesting, raping, and exploiting 103 children, and
researchers were able to access games like "Escape to Epstein Island" and over
600 games involving the term "Diddy" (e.g. "Run From Diddy Simulator,"
"Diddy Party") within minutes, despite having registered as a child
under the age of 13. Roblox's pedophile problem is so severe, in fact,
that a short seller in 2023 was able to drop the company's share price
eight percent simply by publishing a blog post aggregating all of the
arrests linked to the site.
This bill, known as the New York Children's Online Safety Act (NYCOSA),
requires social media and gaming platforms that feature user-to-user
messaging to undertake several common sense steps to better protect kids
online. Firstly, it requires them to turn off open chat functions by
default for any user under the age of 18, unless a parent switches them
back on. Adult users can only message child users if their friend
request has been previously accepted - which, for users under the age of
13, will require parental approval. Parents would also be required to
approve financial transactions connected to a minor's account, as the
exchange of digital forms of currency, such as Roblox's "Robux," has
featured prominently in nearly every case of sexual assault and abuse
connected to the app. Parents would also be able to view a list of
recent financial transactions connected to an account and, for users
under the age of 13, a list of current friends. This would not only
assist parents in identifying and reporting early stages of predatory
behavior, but would also deter future sexploitation from predators who
know that their interactions with their next child victim will
be closely watched. Violations of NYCOSA would be enforced by the Office
of the Attorney General, which is well-equipped to investigate
allegations of misconduct through its Bureau of Internet and Technology. The
remedy created by NYCOSA is the same as that prescribed by the Federal
Trade Commission in its 2022 settlement against Epic Games, Inc., the
creator of the video game Fortnite, which similarly found that Fortnite's live
on-by-default text and voice communications had put children and teens
at serious risk (United States of America v. Epic Games, Inc. 16 Dec.
2022,
www.ftc.gov/system/files/ftc_gov/pdf/2223087EpicGamesSettlement.pdf.
p. 17.).
In mandating common sense measures to protect child safety, many of
which have already been adopted by the world's leading social media
platforms, this bill would send a clear message that New York has zero
tolerance for platforms that prioritize daily active user count at the
expense of kids' safety. It is the least we can do to ensure safer
digital spaces for the most vulnerable among us.
 
PRIOR LEGISLATIVE HISTORY:
This is a new bill in the Assembly.
 
FISCAL IMPACT ON THE STATE:
To be determined.
 
EFFECTIVE DATE:
This act shall take effect on the one hundred eightieth day after the
office of the attorney general shall promulgate rules and regulations
necessary to effectuate the provisions of this act.
STATE OF NEW YORK
________________________________________________________________________
6549
2025-2026 Regular Sessions
IN ASSEMBLY
March 6, 2025
___________
Introduced by M. of A. ROZIC -- read once and referred to the Committee
on Consumer Affairs and Protection
AN ACT to amend the general business law, in relation to establishing
the New York children's online safety act
The People of the State of New York, represented in Senate and Assembly, do enact as follows:
Section 1. Short title. This act shall be known and may be cited as the "New York children's online safety act".
§ 2. The general business law is amended by adding a new article 45-A to read as follows:
ARTICLE 45-A
NEW YORK CHILDREN'S ONLINE SAFETY ACT
Section 1509. Definitions.
        1510. Privacy by default.
        1511. Parental approvals.
        1512. Prohibition on dark patterns.
        1513. Nondiscrimination.
        1514. Scope.
        1515. Rulemaking authority.
        1516. Remedies.
§ 1509. Definitions. For the purposes of this article, the following terms shall have the following meanings:
1. "Connected" and variations thereof shall mean that a covered minor and/or such covered minor's parent has previously approved a connection with another user such that such other user may privately contact the covered minor.
2. "Covered minor" shall mean any user who is determined by an operator, via one or more commercially reasonable age verification methods, to be under the age of eighteen.
1 3. "Financial transaction" shall mean a transaction between users
2 involving any type of currency, including virtual currency used within a
3 covered platform whether or not it can be converted to fiat money.
4 4. "Operator" shall mean any person, business, or other legal entity
5 who operates or provides a covered platform.
6 5. "Parent" shall mean a parent or legal guardian.
7 6. "Covered platform" shall mean a public or semi-public website,
8 online service, online application, or mobile application that (a) is
9 used by a covered minor in this state, (b) allows users to construct a
10 public or semi-public profile for the purposes of using such website,
11 service, or application, (c) allows users to create or post content that
12 is viewable by other users, including but not limited to, on message
13 boards, in chat rooms, or through a landing page or main feed that
14 presents the user with content generated by other users, and (d) allows
15 users to privately message each other as a significant part of the
16 provision of such website, service, or application.
17 7. "Tag" shall mean when a user identifies a second user in posted
18 content in a manner that links to the second user's profile.
19 8. "User" shall mean a user of a covered platform in New York not
20 acting as an operator, or agent or affiliate of such operator, of such
21 platform or any portion thereof.
22 § 1510. Privacy by default. 1. No operator shall offer a covered plat-
23 form in this state without conducting commercially reasonable age
24 verification to determine whether a user is a covered minor. The attor-
25 ney general shall promulgate regulations identifying methods for commer-
26 cially reasonable and technically feasible age verification, which shall
27 consider the size, financial resources, and technical capabilities of
28 covered platforms, the costs and effectiveness of available age determi-
29 nation techniques for users of such platforms, the audience of such
30 platforms, and prevalent practices of the industry of the operator. Such
31 regulations shall also identify the appropriate levels of accuracy that
32 would be considered commercially reasonable and technically feasible for
33 operators to achieve in determining whether a user is a covered minor.
34 2. For all users determined under a commercially reasonable age
35 verification method by an operator to be a covered minor, such operator
36 shall utilize the following settings by default for covered minors,
37 which shall ensure that no user who is not already connected to a
38 covered minor may:
39 (a) communicate directly and privately with such minor;
40 (b) view the profile of such minor;
41 (c) tag such minor in posted content; and/or
42 (d) engage in a financial transaction with such minor.
43 3. A parent of a covered minor may override the default privacy
44 settings provided in subdivision two of this section at such parent's
45 discretion.
46 4. An operator shall notify a parent of a covered minor whenever such
47 covered minor attempts to change the default settings provided in subdi-
48 vision two of this section. The parent may then either approve or deny
49 the request to change the settings for such minor.
50 § 1511. Parental approvals. 1. For all users determined under a
51 commercially reasonable age verification method by an operator to be a
52 covered minor under the age of thirteen, such operator shall require the
53 parent of such covered minor to approve all new connections with such
54 covered minor before such covered minor's and such other user's accounts
55 may be connected. For covered minors under the age of thirteen, an
56 operator shall also establish a mechanism by which a parent of such
minor may easily view the list of all users currently connected with the account of the minor.
2. For all users determined under a commercially reasonable age verification method by an operator to be a covered minor, such operator shall require a parent to approve all financial transactions relating to such covered minor's account. Such operator shall further establish a mechanism by which a parent of a covered minor may easily view a history of all financial transactions relating to such covered minor's account at any time.
§ 1512. Prohibition on dark patterns. It shall be unlawful for a covered platform to deploy any mechanism or design which intentionally inhibits the purpose of this article, subverts user and/or parent choice or autonomy, or renders it more difficult for a user and/or parent to exercise any of the prescribed rights and/or privileges provided in this article.
§ 1513. Nondiscrimination. An operator shall not withhold, degrade, lower the quality of, or increase the price of any product, service, or feature of a covered platform, other than as necessary for compliance with the provisions of this article or any rules or regulations promulgated pursuant to this article, to a user due to such operator being required to establish the settings and approvals provided in sections fifteen hundred ten and fifteen hundred eleven of this article.
§ 1514. Scope. 1. This article shall apply to conduct that occurs in whole or in part in New York. For purposes of this article, conduct takes place wholly outside of New York if the covered platform is accessed by a user who is physically located outside of New York.
2. Nothing in this article shall be construed to impose liability for commercial activities or actions by operators subject to 15 U.S.C. § 6501 that is inconsistent with the treatment of such activities or actions under 15 U.S.C. § 6502.
§ 1515. Rulemaking authority. The attorney general shall promulgate such rules and regulations as are necessary to effectuate and enforce the provisions of this article.
§ 1516. Remedies. 1. On or after the effective date of this article, whenever it appears to the attorney general, upon complaint or otherwise, that any person, within or outside the state, has violated the provisions of this article, the attorney general may bring an action or special proceeding in the name and on behalf of the people of the state of New York to enjoin any such violation, to obtain restitution of any moneys or property obtained directly or indirectly by any such violation, to obtain disgorgement of any profits or gains obtained directly or indirectly by any such violation, to obtain damages caused directly or indirectly by any such violation, to obtain civil penalties of up to five thousand dollars per violation, and to obtain any such other and further relief as the court may deem proper, including preliminary relief.
2. The attorney general shall maintain a website to receive complaints, information, and/or referrals from members of the public concerning an operator's or covered platform's alleged compliance or noncompliance with the provisions of this article.
§ 3. Severability. If any clause, sentence, paragraph, subdivision, section or part of this act shall be adjudged by any court of competent jurisdiction to be invalid, such judgment shall not affect, impair, or invalidate the remainder thereof, but shall be confined in its operation to the clause, sentence, paragraph, subdivision, section or part thereof directly involved in the controversy in which such judgment shall have
been rendered. It is hereby declared to be the intent of the legislature that this act would have been enacted even if such invalid provisions had not been included herein.
§ 4. This act shall take effect on the one hundred eightieth day after the office of the attorney general shall promulgate rules and regulations necessary to effectuate the provisions of this act; provided that the office of the attorney general shall notify the legislative bill drafting commission upon the occurrence of the enactment of the rules and regulations necessary to effectuate and enforce the provisions of section two of this act in order that the commission may maintain an accurate and timely effective data base of the official text of the laws of the state of New York in furtherance of effectuating the provisions of section 44 of the legislative law and section 70-b of the public officers law. Effective immediately, the addition, amendment and/or repeal of any rule or regulation necessary for the implementation of this act on its effective date are authorized to be made and completed on or before such effective date.