● One of the group’s co-founders, Subho Majumdar, tells us why it is important to raise awareness among developers, decision makers and citizens of the ethical issues posed by AI.
● The Bias Buccaneers are hoping to attract international attention with further competitions, while encouraging engineers and developers to open up to a wider world and improve the integrity of AI systems they design.
“Renegades”, “misfits”, “buccaneers”… these are just some of the terms that the motley crew co-founded by Subho Majumdar uses to describe itself. By day, Majumdar manages artificial intelligence projects for a tech giant in Seattle, but when evening comes, he organizes highly unusual treasure hunts. His group, which favours swashbuckling language, has dubbed itself the “Bias Buccaneers”. Its goal is to organize competitions, or “bounties”, that encourage the AI community and the general public to seek out algorithmic biases in artificial intelligence models, ensuring that these innovations remain on course for an ethical future that is respectful of data privacy. “Machine learning and artificial intelligence are perceived as black boxes”, points out Subho Majumdar. “It is also important for a wider public, and not only developers, to come to grips with the issue of biases”. The first bias bounty competition, held in December 2022, involved building a machine learning model to analyse images of faces and detect which ones were biased. The event, which was sponsored by companies including Microsoft, Amazon and the start-up Robust Intelligence, awarded prizes of up to 6,000 dollars.
From human bias to functional bias
Artificial intelligence models, which are currently being built at an unprecedented rate, have potentially absorbed thousands of biases. In practice, and to return to the example of face detection, certain AIs have been found to confuse the faces of black people with those of animals. “Large language models on which AI is based always depend on context”, which is why it is important to examine the functioning of every AI on a case-by-case basis to identify the biases it may contain. “In finance, for example, there is a need to ensure that algorithms respect the law so as not to discriminate against certain population categories.” Although AI systems are designed by humans, people are not always at the root of their biases: this is notably the case with performance biases, which can have catastrophic consequences. “Take the example of built-in systems in autonomous vehicles: they have to deliver the same level of performance under different weather conditions”, which requires them to be fairly trained on a range of conditions.
Engineers and developers are not aware of ethical issues and there is insufficient dialogue with management teams
An opportunity for companies
Competitions to hunt down biases are also beneficial for companies. Ben Colman, another of the founders of Bias Buccaneers, is the CEO of the start-up Reality Defender, which specialises in the detection of deepfakes. As he explains, “organizing bounties to find biases in computer systems enables us to fine-tune their detection capabilities”. At the same time, it allows developers to correct errors that could raise ethical issues: “For example, we studied a machine learning model that automatically cropped images. It worked for a few days and then we realized that when it processed images of women, the tool would reframe them in compromising positions”, points out Subho Majumdar.
Raising awareness among developers
The Bias Buccaneers’ goal is to continue to expand their community, while organizing more events and raising awareness among the general public. “The problem is that there are extensive guidelines on how AI should work in businesses, but no one knows how to apply them. This is due to the fact that engineers and developers are not aware of ethical issues, and there is insufficient dialogue with management teams.” This is why Subho Majumdar believes in the importance of creating a framework for dialogue between different professions within organizations. “I’m asking developers to reflect on the impact their technologies may have on people who do not resemble them and to create a dialogue with minorities, and also with other departments in their companies, which have a different approach to ethical issues.” At the same time, he is also relying on managers to improve their understanding of how developers work.