
NYC-Based Companies Will Need to Audit AI Hiring Tools for Bias



On Wednesday, New York City imposed a new law regulating companies that use artificial intelligence to screen candidates for open positions. The city’s Automated Employment Decision Tools (AEDT) law requires companies to prove their AI tool is free from racial and gender bias before it can be used.

The AEDT law, also known as Local Law 144, was passed in 2021 and took effect this week, making New York City the first U.S. city to regulate AI hiring tools; others are expected to follow suit. The tools covered are algorithms that use AI to make decisions about whom to hire or promote, although the filtered selection is reportedly passed to a human for final review.

Companies will need to commission a "bias audit" annually. First-time offenders face a $500 fine, and repeat violations carry fines of up to $1,500, assessed per AI tool for each day of noncompliance, according to Conductor AI.

The bias audit will calculate the selection rate and impact ratio for each category of candidates, including sex categories (male versus female), race/ethnicity categories, and intersectional categories combining sex, ethnicity, and race, according to the law.
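The two audit metrics are straightforward to compute. The sketch below is a hypothetical illustration, not the law's official methodology: the category names and counts are invented, selection rate is taken as the share of applicants in a category who are selected, and impact ratio as a category's selection rate divided by that of the most-selected category.

```python
# Hypothetical bias-audit arithmetic. Categories and counts are invented
# for illustration; a real audit would use an employer's actual data.
applicants = {"group_a": 200, "group_b": 150}  # applicants per category
selected = {"group_a": 50, "group_b": 30}      # selections per category

# Selection rate: fraction of each category's applicants who were selected.
selection_rate = {g: selected[g] / applicants[g] for g in applicants}

# Impact ratio: each category's selection rate relative to the
# highest-selected category (1.0 means parity with that category).
best = max(selection_rate.values())
impact_ratio = {g: rate / best for g, rate in selection_rate.items()}

print(selection_rate)  # {'group_a': 0.25, 'group_b': 0.2}
print(impact_ratio)    # {'group_a': 1.0, 'group_b': 0.8}
```

A low impact ratio for a category is the kind of signal an audit would flag for further scrutiny.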

Although using AI tools can significantly cut back on employers wading through hundreds of resumes, the risk is that the tool could mirror human stereotypes and discriminate against certain candidates.

“That’s the risk in all of this, that left unchecked, humans sometimes can’t even explain what data points the algorithm is picking up on. That’s what was largely behind this legislation,” John Hausknecht, a professor of human resources at Cornell University’s School of Industrial and Labor Relations, told CBS News. “It’s saying let’s track it, collect data, analyze it, and report it, so over time, we can make changes to the regulations.”

According to AEDT, if applicable, a company must provide alternative instructions for an applicant to “request an alternative selection process or a reasonable accommodation under other laws,” although employers are not required to offer an alternative selection process.

“We are only talking about those tools that take the place of humans making decisions,” Camacho Moran, an employment attorney at Farrell Fritz, told CBS. “If you have an AI tool that runs through 1,000 applications and says, ‘these are the top 20 candidates,’ that is clearly a tool that falls within the definition of an AEDT.”

The law may spread to other cities as remote hiring grows increasingly popular among companies, both in New York City and elsewhere. But the law is still limited, Julia Stoyanovich, a computer science professor at New York University and a founding member of the city’s Automated Decision Systems Task Force, told NBC News. She told the outlet that the AEDT law still doesn’t cover some important categories of applicants, including discrimination based on age or disability.

“First of all, I’m really glad the law is on the books, that there are rules now and we’re going to start enforcing them,” Stoyanovich told the outlet. “But there are also lots of gaps. So, for example, the bias audit is very limited in terms of categories. We don’t look at age-based discrimination, for example, which in hiring is a huge deal, or disabilities.”

It is still unclear how the AEDT law will be enforced, but a spokesperson for New York’s Department of Consumer and Worker Protection said the agency will “collect and investigate complaints” against companies.




