“We cannot solve our problems with the same thinking we used when we created them.” — Albert Einstein
At Atipica, we are constantly thinking about the biases of our world — from our everyday behaviors to the technology that reinforces the world’s prejudices, and the people who build it.
We know that automation in HR helps eliminate manual inefficiencies and helps businesses hire talent faster. But we also worry about the logic behind the algorithms, and the people building them.
For example, in Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. She argues that data discrimination is a real social problem and that biased search algorithms privilege whiteness and discriminate against people of color, specifically women of color.
Because those who built these algorithms come from the sameness of shared experiences, they failed to understand the harm their code inflicts on society.
We know this problem is rampant across industries and verticals — examples abound, and even governments are paying attention to AI bias. Yesterday, the UK Parliament published a report pushing for diverse teams and representative datasets as a means to avoid biases being baked into AI algorithms.
“The main ways to address these kinds of biases are to ensure that developers are drawn from diverse gender, ethnic and socio-economic backgrounds, and are aware of, and adhere to, ethical codes of conduct.”
Here, we’re atypically thinking about how to build inclusion into our product life cycle — from the training sets to the UX.
This is especially critical in talent acquisition, where simple algorithmic decisions can perpetuate pattern matching and keep opportunities within a privileged few. For example, Google Hire recently launched their new candidate discovery tool:
“Candidates who received an offer in the past but declined it will rank higher than those who were previously rejected.”
The simple decision to rank past “qualified” candidates higher may make sense to some, but it excludes compatible candidates from varied backgrounds who did not make it through the interviewing process, much less receive an offer letter.
Furthermore, this likely affects underrepresented candidates the most.
That is why we advocate for diversity in algorithm-building as a requirement for inclusion; the current state of talent and the future of work cannot be left in the hands of a few to replicate human biases and pattern matching.
Atipica will help you understand and hire teams with inclusive AI that won’t reproduce biases.
Let us show you how. Contact us at firstname.lastname@example.org