HR Tech | 3 months

Algorithmic Bias Audit for Hiring Platform

Client

Enterprise Recruitment Platform

Overview

A recruitment platform needed to demonstrate that its new AI ranking system was non-discriminatory in order to comply with new legislation.

The Challenge

Initial internal tests showed the model favored candidates from specific universities regardless of actual skill match.

Our Solution

We conducted a comprehensive bias audit combining counterfactual testing and disparate impact analysis. We then retrained the model on balanced datasets and added fairness constraints during training.
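Counterfactual testing of the kind described above can be sketched in a few lines: score a candidate, score an otherwise-identical copy with one attribute changed, and flag large score shifts. The scoring function, feature names, and threshold below are hypothetical stand-ins for illustration, not the client's actual model.

```python
def score(candidate):
    """Toy ranking model (assumption: deliberately biased toward one university)."""
    base = 0.5 * candidate["skill_match"]
    if candidate["university"] == "target_university":
        base += 0.2  # illustrative bias: bonus unrelated to skill match
    return base

def counterfactual_gap(candidate, attribute, alternative):
    """Score difference when only the tested attribute changes."""
    flipped = dict(candidate, **{attribute: alternative})
    return abs(score(candidate) - score(flipped))

candidate = {"skill_match": 0.8, "university": "target_university"}
gap = counterfactual_gap(candidate, "university", "other_university")
print(f"counterfactual score gap: {gap:.2f}")  # a nonzero gap is a red flag
```

A gap near zero for attributes that should be irrelevant to ranking is the expected behavior; any sizeable gap indicates the attribute (or a proxy for it) is driving scores.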

Tech Stack

Scikit-learn, Fairlearn, Python, AWS SageMaker

Key Results

Bias Reduction

Disparate impact ratio improved to 0.95 (compliance standard is 0.80)
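The disparate impact ratio behind this result follows the four-fifths (80%) rule: each group's selection rate, divided by the highest group's selection rate, must be at least 0.80. A minimal sketch, using illustrative counts rather than the client's data:

```python
def disparate_impact_ratio(selections):
    """selections: {group: (selected, total)} -> lowest group rate / highest group rate."""
    rates = {g: sel / total for g, (sel, total) in selections.items()}
    best = max(rates.values())
    return min(r / best for r in rates.values())

# Hypothetical post-retraining selection counts per group.
selections = {"group_a": (190, 400), "group_b": (180, 400)}
ratio = disparate_impact_ratio(selections)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.95 with these sample counts
```

A ratio of 1.0 means identical selection rates; anything below 0.80 is the conventional threshold for presumed adverse impact.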

Legal Risk

Verified readiness for NYC AEDT Local Law 144

Fairness

Demographic parity achieved across protected groups


Ready to achieve similar results?

Kaycore Technologies | AI Quality Engineering & Validation