Santabarbara, Reyes, Hevesi, Rosenthal L, Slater, Lee
 
MLTSPNSR
 
Add Art 4 §§401 - 404, St Tech L
 
Enacts the legislative oversight of automated decision-making in government act (LOADinG Act) to regulate the use of automated decision-making systems and artificial intelligence techniques by state agencies.
NEW YORK STATE ASSEMBLY MEMORANDUM IN SUPPORT OF LEGISLATION submitted in accordance with Assembly Rule III, Sec 1(f)
 
BILL NUMBER: A9430B
SPONSOR: Otis
 
TITLE OF BILL:
An act to amend the state technology law, in relation to automated deci-
sion-making by state agencies
 
PURPOSE OR GENERAL IDEA OF BILL:
This bill would require the disclosure of, and reporting on the impact
of, automated decision-making systems in use by state agencies, or by
entities acting on behalf of an agency, in performing any function
related to public assistance benefits or impacting the rights and
liberties of individuals.
 
SUMMARY OF PROVISIONS:
Section 1. Provides the title, which is the "legislative oversight of
automated decision-making in government act (LOADinG Act)."
Section 2. Adds a new article 4 to the state technology law to require
that any state agency, or entity acting on behalf of any agency, using
automated decision-making systems utilize meaningful human review in
relation to the delivery of any public assistance benefit or in
circumstances that impact the rights, civil liberties, safety, or
welfare of an individual, unless such utilization is authorized in law.
Further prohibits the
procurement or acquisition of any service or system relying on automated
decision-making systems except when such system is subject to meaningful
human review. Each agency utilizing automated decision-making systems
in relation to the delivery of a public assistance benefit, or in
decision-making involving individuals' protected rights, is required to
complete an impact assessment once every two years that includes a
description of the objectives and development of the technology and
testing of its accuracy, fairness, and possible biases that could lead
to discriminatory outcomes. If such impact assessment finds that the
automated decision-making system produces discriminatory or biased
outcomes, the agency shall not use the system. The impact assessment
would be made public
on the agency website and sent to the governor, the temporary president
of the senate, and the speaker of the assembly at least 30 days prior to
any use or implementation. State agencies may redact information from
the impact assessment published on their website if the automated
decision-making system is believed to include technology that protects
security systems, provided that they publish an explanation with the
redacted impact assessment.
Section 3. Requires disclosure of certain information about automated
decision-making systems already in use by any state agency, including a
description of the system, software vendors related to the system, the
date use began, the purpose and use of the system, whether any impact
assessments have been carried out, and any other information that the
agency deems necessary.
Section 4. Effective date.
 
JUSTIFICATION:
In New York State and worldwide, artificial intelligence, algorithms,
and computational modeling services have recently exploded onto the
market and have become fully integrated tools for many individuals and
companies alike, often automating certain historically human-conducted
decision-making tasks and functions to boost productivity. However,
these systems come with significant risks to security and privacy,
unknown malfunctions, and the demonstrated potential for discrimination
to occur due to biases in their development and operation. As these
technologies continue to mature, more use cases evolve, and direct costs
of use diminish, it is paramount that the duties of the state to the
public are reaffirmed, especially when individuals' constitutional and
statutory rights are involved.
This legislation sets a necessary prohibition on the unauthorized use of
these services by state agencies, and, when authorized, requires the
publication of an impact assessment detailing the development of the
service and thorough testing for bias and discrimination against actual
or perceived race, color, ethnicity, religion, national
origin, sex, gender, gender identity, sexual orientation, familial
status, biometric information, lawful source of income, or disability.
Finally, the bill prohibits the state from using discriminatory auto-
mated decision-making systems.
To minimize disruption of state operations that already rely on these
technologies, existing automated decision-making systems must be
disclosed to the legislature within one year. After one year, existing
uses will be subject to the same requirements as any new automated deci-
sion-making system.
 
PRIOR LEGISLATIVE HISTORY:
2023: S7543 - Referred to Rules
 
FISCAL IMPLICATIONS FOR STATE AND LOCAL GOVERNMENTS:
To be determined.
 
EFFECTIVE DATE:
This act shall take effect immediately, provided that section two of
this act shall take effect one year after it shall have become a law.
STATE OF NEW YORK
________________________________________________________________________
9430--B
IN ASSEMBLY
March 14, 2024
___________
Introduced by M. of A. OTIS, SANTABARBARA, REYES, HEVESI, L. ROSENTHAL,
SLATER -- read once and referred to the Committee on Science and Tech-
nology -- committee discharged, bill amended, ordered reprinted as
amended and recommitted to said committee -- reported and referred to
the Committee on Rules -- Rules Committee discharged, bill amended,
ordered reprinted as amended and recommitted to the Committee on Rules
AN ACT to amend the state technology law, in relation to automated deci-
sion-making by state agencies
The People of the State of New York, represented in Senate and Assembly, do enact as follows:
1 Section 1. Short title. This act shall be known and may be cited as
2 the "legislative oversight of automated decision-making in government
3 act (LOADinG Act)".
4 § 2. The state technology law is amended by adding a new article 4 to
5 read as follows:
6 ARTICLE IV
7 AUTOMATED DECISION-MAKING IN STATE GOVERNMENT
8 Section 401. Definitions.
9 402. Use of automated decision-making systems by agencies.
10 403. Impact assessments.
11 404. Submission to the governor and legislature.
12 § 401. Definitions. For the purpose of this article:
13 1. "Automated decision-making system" shall mean any software that
14 uses algorithms, computational models, or artificial intelligence tech-
15 niques, or a combination thereof, to automate, support, or replace human
16 decision-making and shall include, without limitation, systems that
17 process data, and apply predefined rules or machine learning algorithms
18 to analyze such data, and generate conclusions, recommendations,
19 outcomes, assumptions, projections, or predictions without meaningful
20 human discretion. "Automated decision-making system" shall not include
21 any software used primarily for basic computerized processes, such as
22 calculators, spellcheck tools, autocorrect functions, spreadsheets,
23 electronic communications, or any tool that relates only to internal
24 management affairs such as ordering office supplies or processing
1 payments, and that do not materially affect the rights, liberties, bene-
2 fits, safety or welfare of any individual within the state.
3 2. "Meaningful human review" means review, oversight and control of
4 the automated decision-making process by one or more individuals who
5 understand the risks, limitations, and functionality of, and are trained
6 to use, the automated decision-making system and who have the authority
7 to intervene or alter the decision under review, including but not
8 limited to the ability to approve, deny, or modify any decision recom-
9 mended or made by the automated system.
10 3. "State agency" shall mean any department, public authority, board,
11 bureau, commission, division, office, council, committee or officer of
12 the state. Such term shall not include the legislature or judiciary.
13 4. "Public assistance benefit" shall mean any service or program with-
14 in the control of the state, or benefit provided by the state to indi-
15 viduals or households, including but not limited to public assistance,
16 cash assistance, grants, child care assistance, housing assistance,
17 unemployment benefits, transportation benefits, education assistance,
18 domestic violence services, and any other assistance or benefit within
19 the authority of the state to grant to individuals within the state.
20 This shall not include any federal program that is administered by the
21 federal government or the state.
22 § 402. Use of automated decision-making systems by agencies. 1. No
23 state agency, or any entity acting on behalf of such agency, which
24 utilizes or applies any automated decision-making system, directly or
25 indirectly, in performing any function that: (a) is related to the
26 delivery of any public assistance benefit; (b) will have a material
27 impact on the rights, civil liberties, safety or welfare of any individ-
28 ual within the state; or (c) affects any statutorily or constitutionally
29 provided right of an individual, shall utilize such automated decision-
30 making system, unless such automated decision-making system is subject
31 to continued and operational meaningful human review.
32 2. No state agency shall authorize any procurement, purchase or acqui-
33 sition of any service or system utilizing, or relying on, automated
34 decision-making systems in performing any function that: (a) is related
35 to the delivery of any public assistance benefit; (b) will have a mate-
36 rial impact on the rights, civil liberties, safety or welfare of any
37 individual within the state; or (c) affects any statutorily or constitu-
38 tionally provided right of an individual unless such automated deci-
39 sion-making system is subject to continued and operational meaningful
40 human review.
41 3. The use of an automated decision-making system shall not affect (a)
42 the existing rights of employees pursuant to an existing collective
43 bargaining agreement, or (b) the existing representational relationships
44 among employee organizations or the bargaining relationships between the
45 employer and an employee organization. The use of an automated deci-
46 sion-making system shall not result in the: (1) discharge, displacement
47 or loss of position, including partial displacement such as a reduction
48 in the hours of non-overtime work, wages, or employment benefits, or
49 result in the impairment of existing collective bargaining agreements;
50 (2) transfer of existing duties and functions currently performed by
51 employees of the state or any agency or public authority thereof to an
52 automated decision-making system; or (3) transfer of future duties and
53 functions ordinarily performed by employees of the state or any agency
54 or public authority. The use of an automated decision-making system
55 shall not alter the rights, benefits, and privileges, including but
56 not limited to terms and conditions of employment, civil service status,
1 and collective bargaining unit membership status, of any existing
2 employee of the state or any agency or public authority thereof; such
3 rights, benefits, and privileges shall be preserved and protected.
4 § 403. Impact assessments. 1. State agencies seeking to utilize or
5 apply an automated decision-making system permitted under section four
6 hundred two of this article with continued and operational meaningful
7 human review shall conduct or have conducted an impact assessment
8 substantially completed and bearing the signature of one or more indi-
9 viduals responsible for meaningful human review for the lawful applica-
10 tion and use of such automated decision-making system. Following the
11 first impact assessment, an impact assessment shall be conducted in
12 accordance with this section at least once every two years. An impact
13 assessment shall be conducted prior to any material change to the auto-
14 mated decision-making system that may change the outcome or effect of
15 such system. Such impact assessments shall include:
16 (a) a description of the objectives of the automated decision-making
17 system;
18 (b) an evaluation of the ability of the automated decision-making
19 system to achieve its stated objectives;
20 (c) a description and evaluation of the objectives and development of
21 the automated decision-making system, including:
22 (i) a summary of the underlying algorithms, computational models, and
23 artificial intelligence tools that are used within the automated deci-
24 sion-making system; and
25 (ii) the design and training data used to develop the automated deci-
26 sion-making system process;
27 (d) testing for:
28 (i) accuracy, fairness, bias and discrimination, and an assessment of
29 whether the use of the automated decision-making system produces discri-
30 minatory results on the basis of a consumer's or a class of consumers'
31 actual or perceived race, color, ethnicity, religion, national origin,
32 sex, gender, gender identity, sexual orientation, familial status, biom-
33 etric information, lawful source of income, or disability and outlines
34 mitigations for any identified performance differences in outcomes
35 across relevant groups impacted by such use;
36 (ii) any cybersecurity vulnerabilities and privacy risks resulting
37 from the deployment and use of the automated decision-making system, and
38 the development or existence of safeguards to mitigate the risks;
39 (iii) any public health or safety risks resulting from the deployment
40 and use of the automated decision-making system;
41 (iv) any reasonably foreseeable misuse of the automated decision-mak-
42 ing system and the development or existence of safeguards against such
43 misuse;
44 (e) the extent to which the deployment and use of the automated deci-
45 sion-making system requires input of sensitive and personal data, how
46 that data is used and stored, and any control users may have over their
47 data; and
48 (f) the notification mechanism or procedure, if any, by which individ-
49 uals impacted by the utilization of the automated decision-making system
50 may be notified of the use of such automated decision-making system and
51 of the individual's personal data, and informed of their rights and
52 options relating to such use.
53 2. Notwithstanding the provisions of this article or any other law, if
54 an impact assessment finds that the automated decision-making system
55 produces discriminatory or biased outcomes, the state agency shall cease
1 any utilization, application, or function of such automated decision-
2 making system, and of any information produced using such system.
3 § 404. Submission to the governor and legislature. 1. Each impact
4 assessment conducted pursuant to this article shall be submitted to the
5 governor, the temporary president of the senate, and the speaker of the
6 assembly at least thirty days prior to the implementation of the auto-
7 mated decision-making system that is the subject of such assessment.
8 2. (a) The impact assessment of an automated decision-making system
9 shall be published on the website of the relevant state agency.
10 (b) If the state agency makes a determination that the disclosure of
11 any information required in the impact assessment would result in a
12 substantial negative impact on health or safety of the public, infringe
13 upon the privacy rights of individuals, or significantly impair the
14 state agency's ability to protect its information technology or opera-
15 tional assets, such state agency may redact such information, provided
16 that an explanatory statement on the process by which the state agency
17 made such determination is published along with the redacted impact
18 assessment.
19 (c) If the impact assessment covers any automated decision-making
20 system that includes technology that is used to prevent, detect, protect
21 against or respond to security incidents, identity theft, fraud,
22 harassment, malicious or deceptive activities or other illegal activity,
23 preserve the integrity or security of systems, or to investigate,
24 report or prosecute those responsible for any such malicious or decep-
25 tive action, such state agency may redact such information for the
26 purposes of this subdivision, provided that an explanatory statement on
27 the process by which the state agency made such determination is
28 published along with the redacted impact assessment.
29 § 3. Disclosure of existing automated decision-making systems. Any
30 state agency that, directly or indirectly, utilizes an automated deci-
31 sion-making system, as defined in section 401 of the state technology
32 law, shall submit to the legislature a disclosure on the use of such
33 system, no later than one year after the effective date of this section.
34 Such disclosure shall include:
35 (a) a description of the automated decision-making system utilized by
36 such agency;
37 (b) a list of any software vendors related to such automated deci-
38 sion-making system;
39 (c) the date that the use of such system began;
40 (d) a summary of the purpose and use of such system, including a
41 description of human decision-making and discretion supported or
42 replaced by the automated decision-making system;
43 (e) whether any impact assessments for the automated decision-making
44 system were conducted and the dates and summaries of the results of such
45 assessments where applicable; and
46 (f) any other information deemed relevant by the agency.
47 § 4. This act shall take effect immediately, provided that section two
48 of this act shall take effect one year after it shall have become a law.