MACHINE LEARNING REMOVES BIAS FROM ALGORITHMS AND THE HIRING PROCESS
At the premier Machine Learning Conference (MLConf), Arena Analytics' Chief Data Scientist Patrick Hagerty unveiled a cutting-edge technique that removes 92%-99% of latent bias from algorithmic models.
If undetected and unchecked, algorithms can learn, automate, and scale existing human and systemic biases. These models then perpetuate discrimination as they guide decision-makers in selecting people for loans, jobs, criminal investigation, healthcare services, and much more. Currently, the primary methods of reducing the impact of bias on models have been limited to adjusting input data or adjusting models after the fact to ensure there is no disparate impact.
Recent reporting from the Wall Street Journal confirmed these as the prevailing approaches, concluding, "It's really up to the software engineers and leaders of the company to figure out how to fix it… [or] go into the algorithm and tweak some of the main factors it considers in making its decisions."
For several years, Arena Analytics was also limited to these approaches, but that all changed 9 months ago. Up until then, Arena removed all data from the models that could correlate to protected classifications and then measured demographic parity.
"These efforts brought us in line with EEOC compliance thresholds - also known as the ⅘ or 80% rule," explains Myra Norton, President/COO of Arena. "But we've always wanted to go further than a compliance threshold. We've wanted to surface a MORE diverse slate of candidates for every single role in a client organization. And that's exactly what we've accomplished, now surpassing 95% in our representation of different classifications."
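The four-fifths (80%) rule referenced above is simple to state: the selection rate for any group should be at least 80% of the rate for the group with the highest selection rate. A minimal sketch of that check, with illustrative numbers that are not Arena's data:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of the lower group selection rate to the higher one.

    Under the EEOC's four-fifths guideline, a ratio below 0.8
    flags potential adverse impact. Inputs are counts of selected
    applicants and total applicants for two groups.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    low, high = sorted([rate_a, rate_b])
    return low / high


# Hypothetical example: 30 of 100 applicants selected in one group,
# 45 of 100 in another.
ratio = adverse_impact_ratio(30, 100, 45, 100)
print(f"ratio = {ratio:.2f}",
      "passes" if ratio >= 0.8 else "flags potential adverse impact")
```

Here 30/100 = 0.30 against 45/100 = 0.45 gives a ratio of about 0.67, below the 0.8 threshold, so this hypothetical process would be flagged.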
Chief Data Scientist Patrick Hagerty explained at MLConf how he and his team have leveraged adversarial networks, the technique at the heart of Generative Adversarial Networks (GANs), which pit one algorithm against another.
"Arena's primary model predicts the outcomes our clients want, and Model Two is a Discriminator designed to predict a classification," says Hagerty. "The Discriminator attempts to detect the race, gender, background, and any other protected class data of a person. This causes the Predictor to adjust and optimize while eliminating correlations with the classifications the Discriminator is detecting."
Arena trained the models this way until achieving what's known as a Nash Equilibrium: the point at which neither the Predictor nor the Discriminator can improve its performance by adjusting its own strategy alone.
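The Predictor-versus-Discriminator setup described above can be sketched in a few lines of NumPy. This is an illustrative toy, not Arena's implementation: a linear Predictor fits a synthetic outcome, a Discriminator tries to recover a stand-in protected attribute from the Predictor's score, and a gradient-reversal penalty (the assumed weight `lam`) pushes the Predictor away from anything the Discriminator can exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Synthetic data: feature x1 carries the stand-in "protected" signal z,
# and the outcome y is partly driven by x1, so a naive predictor leaks z.
n = 4000
X = rng.normal(size=(n, 2))
z = (X[:, 1] > 0).astype(float)
y = ((X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=n)) > 0).astype(float)

def train(lam, steps=3000, lr=0.5):
    """Alternating gradient steps: the Discriminator learns to recover z
    from the Predictor's score, while the Predictor minimizes its outcome
    loss MINUS lam times the Discriminator's loss (gradient reversal)."""
    w = np.zeros(2)      # Predictor weights
    a, b = 0.0, 0.0      # Discriminator parameters on the scalar score
    for _ in range(steps):
        s = X @ w
        p = sigmoid(s)           # Predictor's outcome probability
        d = sigmoid(a * s + b)   # Discriminator's guess of z
        # Discriminator step: improve its prediction of z.
        a -= lr * np.mean((d - z) * s)
        b -= lr * np.mean(d - z)
        # Predictor step: fit y while *hurting* the Discriminator.
        grad_w = X.T @ ((p - y) - lam * a * (d - z)) / n
        w -= lr * grad_w
    return w

w_base = train(lam=0.0)  # ordinary predictor, leaks z through x1
w_adv = train(lam=2.0)   # adversarially trained predictor

# The adversary suppresses the weight on x1 relative to the baseline,
# shrinking the correlation between the score and z.
print("baseline |w1|/|w0|:", abs(w_base[1]) / abs(w_base[0]))
print("adversarial |w1|/|w0|:", abs(w_adv[1]) / abs(w_adv[0]))
```

At equilibrium the Discriminator's guess of z from the score is near chance, which is the toy analogue of the point Hagerty describes: the Predictor has optimized while eliminating the correlations the Discriminator detects.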
Arena's technology has helped industrious individuals find a variety of jobs - from RNs to medtechs, caregivers to cooks, concierges to security. Job candidates whom Arena predicted for success include veterans with no prior experience in healthcare or senior/assisted living, recent high school graduates whose plans to work while attending college were upended, and former hospitality sector employees who decided to apply their dining service expertise to a new setting.
"We succeeded in our intent to reduce bias and diversify the workforce, but what surprised us was the impact this approach had on our core predictions. Data once considered unusable, such as commuting distance, can now be analyzed because we've removed the protected-class signal potentially associated with it," says Michael Rosenbaum, Arena's founder and CEO. "As a result, our predictions are stronger AND we surface a more diverse slate of candidates across multiple spectrums. Our clients can now use their talent acquisition function to really support and lead out front on Diversity and Inclusion."
Arena applies predictive analytics and machine learning to solve talent acquisition challenges. Learning algorithms analyze large amounts of data to predict, with high accuracy, the likelihood of different outcomes, such as someone leaving, being engaged, having excellent attendance, and more. By revealing each individual's likely outcomes in specific positions, departments, and locations, Arena is transforming the labor market from one based on perception and unconscious bias to one based on outcomes. Arena is currently growing dramatically within the healthcare and hospitality industries and expanding to other people-intensive industries. www.arena.io