The Failures of AI in HR: A Strong Case for AI Audits

The increasing use of AI in various industries has led to a heightened concern for fairness, particularly in automated decision-making systems. In this article, we will discuss the importance of auditing AI pipelines and provide examples of AI biases in recruitment tools, applicant screening, and job advertising. Auditing AI pipelines ensures that biases are not perpetuated and that discrimination against certain groups can be uncovered and fixed.

One of the most significant issues with AI in the recruitment process is gender bias. In 2018, Amazon had to abandon its AI-driven recruitment tool after discovering that the algorithm exhibited gender bias against female candidates. The algorithm was trained on resumes submitted to the company over a ten-year period and developed a preference for male candidates, penalizing resumes that included references to women or women’s colleges.

Another example of AI bias appears in applicant screening, specifically in facial analysis algorithms. A 2018 study by researchers at MIT and Stanford found that commercial facial analysis algorithms exhibited racial and gender bias: they were substantially less accurate on the faces of darker-skinned individuals and of women. When similar technology is used to screen applicants, these error disparities can translate into discriminatory hiring practices.

Finally, AI can also perpetuate bias in job advertising. A study published in the journal PLoS ONE in 2021 found that an AI-driven job advertising platform exhibited gender bias in its ad targeting. The algorithm was more likely to display job ads for high-paying positions to men, while women were more likely to see ads for lower-paying jobs.

These three examples clearly show the importance of independent audits of AI systems before they are deployed. With the audits CertifAI offers, we can identify biases and discrimination in models, advise on how to mitigate them, and certify AI pipelines to fulfill legal obligations.
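
Many of the disparities described above can be surfaced with simple group-level statistics before a tool ever reaches production. As a minimal sketch (not our full audit methodology), the example below compares a screening model's false-negative rate across two demographic groups; the data and column names are hypothetical.

```python
import pandas as pd

# Hypothetical audit sample: the model's decision for each applicant, a
# human-review label of whether the applicant was qualified, and a
# demographic group attribute. All values are illustrative.
sample = pd.DataFrame({
    "group":     ["A", "A", "A", "B", "B", "B"],
    "qualified": [1,   1,   0,   1,   1,   0],   # human-review label
    "advanced":  [1,   1,   0,   0,   1,   0],   # model decision
})

# False-negative rate per group: the share of qualified applicants the
# model rejected. A large gap between groups is a red flag that the tool
# treats otherwise similar candidates differently.
qualified = sample[sample["qualified"] == 1]
false_negative_rate = 1 - qualified.groupby("group")["advanced"].mean()
print(false_negative_rate)
```

A real audit uses far larger samples, several fairness metrics, and statistical significance checks, but the underlying comparisons are often this direct.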

If you are unsure about your AI pipeline, feel free to reach out to us. Together we can find a pragmatic solution with minimal impact on your workflow while ensuring your models do not exhibit unwanted biases. We believe that AI tools should promote diversity and inclusion rather than hinder them.

In conclusion, it is essential to audit AI pipelines to ensure that they are not perpetuating biases and discrimination. The examples discussed above highlight the need for AI audits in recruitment tools, applicant screening, and job advertising. As we continue to rely on AI, it is crucial that we hold these systems accountable and ensure that they promote fairness and inclusivity.

Key learnings from New York City’s Local Law 144

What is this new law?

New York City’s Local Law 144, passed in December 2021, aims to combat the issue of AI bias in employment decision-making. The law requires employers and employment agencies in New York City to conduct bias audits of their “automated employment decision tools” (AEDTs) used for hiring or promoting candidates who reside in the city.

What does a bias audit look like?

A bias audit is a statistical evaluation of your AEDT, comparing how the tool treats candidates across demographic categories. The audit’s results must be made public, and employers must inform candidates or employees that an AEDT will be used at least 10 business days before the tool is used to assess them.
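
For orientation, the rules adopted under the law describe computing selection rates and impact ratios across demographic categories. The sketch below shows that style of calculation on a toy dataset; the column names and numbers are hypothetical and are not taken from the law’s text.

```python
import pandas as pd

# Hypothetical history of AEDT decisions: one row per candidate, with a
# demographic category and whether the tool selected them to move forward.
decisions = pd.DataFrame({
    "category": ["group_1"] * 40 + ["group_2"] * 40,
    "selected": [1] * 24 + [0] * 16 + [1] * 14 + [0] * 26,
})

# Selection rate per category: the fraction of candidates the tool selected.
selection_rates = decisions.groupby("category")["selected"].mean()

# Impact ratio: each category's selection rate divided by the rate of the
# most-selected category. Ratios well below 1 indicate potential adverse impact.
impact_ratios = selection_rates / selection_rates.max()

print(selection_rates)
print(impact_ratios)
```

An actual audit must be performed by an independent auditor and its summary published, so treat this purely as an illustration of the arithmetic involved.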

When will the law come into effect?

The law comes into effect on July 5, 2023. Audits of AEDTs are valid for one year.

Which tools are covered?

The definition of AEDTs covers any computational process (e.g., machine learning, statistical modeling, data analytics, or AI) that issues a simplified output, such as a score, classification, or recommendation, which substantially assists or replaces discretionary decision making.

When does it apply?

If an AEDT is used to screen candidates for hiring or employees for promotion, and those candidates or employees reside in New York City, the AEDT must be audited.

Are you looking for an AI audit before the deadline?

Feel free to reach out to us – together we can ensure your AI is compliant.
