
NYC Passes Ordinance Restricting Use of AI in Employment Applicant Screening


We recently wrote about the Equal Employment Opportunity Commission’s guidance on the use of artificial intelligence in the hiring process.  AI use is exploding.  It is being deployed across many industries and applications, including various human resources functions.  As we previously discussed, AI can be extremely helpful to employers, if used properly.

Recently, New York City became one of the first jurisdictions to address the proliferation of AI tools in the employment context.  NYC passed a local ordinance prohibiting employers from using AI screening tools, which it calls Automated Employment Decision Tools, unless the employer takes certain steps to vet and disclose use of the tool.

The local law goes into effect on January 1, 2023 and will apply to employers operating in New York City that target NYC residents.  It will require employers to subject the AI screening tool to a validation test, which the ordinance calls a bias audit.  The bias audit must be conducted by an impartial auditor and must evaluate the tool’s potential disparate impact on protected traits.  In addition, the employer must post a summary of the bias audit on its website.

Employers using Automated Employment Decision Tools will also need to provide certain notices to applicants.  Employers will need to notify applicants at least 10 days in advance that the AI tool will be used and must allow the applicant to request a different testing/screening method.  The employer must also give notice of the characteristics that the AI tool will assess, the types of information that will be collected, and how that information will be retained by the employer.

This approach seeks to directly address two of the most significant concerns with the use of AI tools in the screening process: (1) the potential to inadvertently screen out applicants based on a protected trait (disparate impact); and (2) potential disability-related accommodation obligations in the hiring process.  Recall that we took a close look at these risks in our previous post.  The ordinance goes further by addressing applicant privacy concerns as well.

As with many new developments, employers relying on tried-and-true HR best practices will be ahead of the curve.  It has long been a best practice to validate any pre-employment screening tool to guard against disparate impact, and organizational psychologists have been discussing this type of validation for years.  Many reputable third-party providers of applicant screening tools have already conducted similar validation tests or bias audits.

While the NYC ordinance may be the first local or state law to address the use of AI in the hiring process, we certainly do not think it will be the last.  Employers adopting these tools will need to be ready to comply with this changing legal landscape as the AI tools themselves continue to develop and grow in popularity.
