Beware: A.I. May Inadvertently Discriminate Against Job Applicants


Two federal agencies are cautioning employers to take a closer look at how they use artificial intelligence in hiring. Despite its productivity promises in recruiting and hiring, A.I. is running into some legal risks.  

The Department of Justice and the Equal Employment Opportunity Commission issued separate notices in mid-May warning that businesses that use A.I. tools could potentially violate the Americans with Disabilities Act, part of which protects people with disabilities from workplace discrimination.

Employers have increasingly turned to A.I. to source new job candidates, screen resumes, and streamline the interview process. But suppose a digital tool screens out an applicant, whether intentionally or unintentionally, because of their disability. In that case, employers risk running afoul of the law if the individual could have performed the job with a reasonable accommodation. The same could apply when a chatbot rejects an applicant over an employment gap that was caused by time off to recover from surgery.

“You don’t want to screen someone out of a job if the thing that’s causing them to not meet your criteria in the application process is something that, with an accommodation, they’d be able to perform on the job,” explains David Baron, a labor and employment attorney at the London-based law firm Hogan Lovells.

If an individual with a disability requests or needs a reasonable accommodation to apply for a job, or to do the job itself, the ADA requires employers to meet that request, so long as the accommodation does not create an undue hardship, meaning a significant difficulty or expense for the employer. Adjusting the height of a desk for an employee who uses a wheelchair is an example of a reasonable accommodation.

And employers generally remain on the hook for a discriminatory test even if the decision-making tool is administered by a third-party entity.

If you use a decision-making tool for hiring, Baron recommends communicating up front to applicants that reasonable accommodations are available, such as an alternative format or test for those with disabilities. Communication is key here: providing as much information as possible about how the tools function, what they measure, and how assessments are made could help decrease the chances of running afoul of the law.

Baron adds that employers should use tools to measure abilities or qualifications that are “truly essential to the job.” Take the case of a speechwriter: it would be unnecessary to screen for the ability to code in different programming languages when the job’s primary responsibility is working with the written word.

Another best practice is to vet any potential new tool to ensure that the vendor factored inclusivity into its design. Don’t use digital tools without understanding their full capabilities, warned EEOC Chair Charlotte Burrows in a statement: “If employers are aware of the ways AI and other technologies can discriminate against persons with disabilities, they can take steps to prevent it.” In other words, the onus is on you.
