• NYC Local Law 144 took effect on January 1st, 2023, with enforcement beginning July 5th, 2023, and aims to detect bias in hiring and employment decisions made with automated employment decision tools (AEDTs)

    • Companies that use AEDTs covered under the NYC AI hiring law will need to have those tools independently audited and post the results to a public website

    • Proceptual can help make the process easy, and we have a structured methodology to help you comply quickly

NYC Local Law 144 (NYC LL 144) is one of the first laws to regulate automated employment decision tools. Its goal is to prevent bias in the hiring process.

There is still some lingering ambiguity around terminology in the legislation, and this post aims to clarify how organizations should approach compliance. We will continue to share updates via our newsletter as the regulations evolve.

Why NYC LL 144 is unique

NYC LL 144 regulates “automated employment decision tools” (AEDTs).

The law defines an AEDT as:

Any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons. The term “automated employment decision tool” does not include a tool that does not automate, support, substantially assist or replace discretionary decision-making processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.

You will notice that while the definition names machine learning and artificial intelligence, it is not limited to them; like other emerging regulations covering these technologies, it keeps the scope broad enough to also capture statistical modeling, data analytics, and other technologies that produce these simplified outputs.
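To make the definition concrete, the following is a purely hypothetical sketch of the kind of tool that would likely fall within it: a small function that turns candidate attributes into a single score used to rank applicants. The feature names, weights, and numbers are invented for illustration only, and nothing here is legal advice.

    # Hypothetical illustration: a "simplified output" under NYC LL 144 is a single
    # score, classification, or recommendation that substantially assists or replaces
    # discretionary hiring decisions. All feature names and weights are invented.
    WEIGHTS = {
        "years_experience": 0.4,
        "skills_match": 0.5,
        "assessment_score": 0.1,
    }

    def score_candidate(candidate: dict) -> float:
        """Return a single 0-1 score used to rank applicants.

        Because the output is a score that could substantially assist a recruiter's
        decision, a tool like this would likely be considered an AEDT.
        """
        return sum(WEIGHTS[name] * candidate.get(name, 0.0) for name in WEIGHTS)

    print(score_candidate({"years_experience": 0.6, "skills_match": 0.8, "assessment_score": 0.7}))

By contrast, a spreadsheet or database that merely stores this data without scoring or ranking candidates is expressly excluded by the definition.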

According to the text, a bias audit is an:

impartial evaluation by an independent auditor. Such bias audit shall include but not be limited to the testing of an automated employment decision tool to assess the tool’s disparate impact on…. [covered people]
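To illustrate what this kind of disparate-impact testing can look like in practice, here is a minimal sketch of a selection-rate and impact-ratio calculation in the spirit of the DCWP's published rules: for each demographic category, compute the share of applicants selected, then compare it to the category with the highest selection rate. The categories, data, and numbers below are assumptions for illustration only; an actual bias audit must follow the adopted rules and be performed by an independent auditor.

    from collections import defaultdict

    # Hypothetical applicant records: (demographic category, was the candidate selected?)
    # The categories and outcomes here are invented for illustration.
    applicants = [
        ("category_a", True), ("category_a", True), ("category_a", False),
        ("category_b", True), ("category_b", False), ("category_b", False),
    ]

    # Selection rate per category: number selected / total applicants in that category.
    totals, selected = defaultdict(int), defaultdict(int)
    for category, was_selected in applicants:
        totals[category] += 1
        selected[category] += int(was_selected)

    selection_rates = {c: selected[c] / totals[c] for c in totals}

    # Impact ratio: each category's selection rate divided by the highest selection rate.
    highest_rate = max(selection_rates.values())
    impact_ratios = {c: rate / highest_rate for c, rate in selection_rates.items()}

    print(selection_rates)  # {'category_a': 0.67, 'category_b': 0.33} (rounded)
    print(impact_ratios)    # {'category_a': 1.0, 'category_b': 0.5}

A low impact ratio for a category does not by itself make a tool unlawful, but it is the kind of result the audit is designed to surface and that employers should be prepared to explain.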

Applicants must also be notified in advance that these tools will be used, and once a bias audit is complete, the results must be posted publicly on the web.

Failure to comply could result in fines of up to $1,500 per day, per occurrence.

It is also important to note that the law has officially been in effect since January 1st, 2023, although enforcement was postponed until July 5th, 2023. Companies that are required to have bias audits but have not yet performed them are technically out of compliance, even though enforcement has not yet begun.

How to become compliant

Proceptual makes AI compliance easy, breaking it down into several steps:

    1. Complete Proceptual’s quick Compliance Assessment to determine whether you need an independent audit.
    2. If you do need an AI audit, we have a simple, fast, and comprehensive process to provide the independent audit you need.
    3. Once the audit is complete, our experts will review the data with you before any results are posted.
    4. We will post the report on a webpage hosted on our site and provide you with a link you can use on your job boards, job listings, and anywhere else it is needed.
    5. Based on your goals, we can also provide advisory or training services to address any areas where you need additional support.

Contact our team for more information at ken@proceptual.com.