Wednesday, February 5, 2025

AI’s dark secret: It’s reversing progress on equality


My life never fits into a pattern. My grandparents were refugees, my mother had me when she was 14 years old, and I developed massive behavioral problems as a teenager.

I didn’t grow up in typical circumstances. But I had the chance to beat the odds. However, if I had been born in the age of artificial intelligence (AI), could I still have gotten to where I am today? I have doubts.

You see, while I never fit into a pattern, AI is all about them.

AI systems, both predictive and generative, all function in the same way: they process massive amounts of data, identify patterns, and strive to replicate them. The hidden truth of the world’s fastest-growing technology is that machine learning systems struggle with difference.


“Pattern” is the key word here: something that happens repeatedly. In a dataset, that means an attribute or characteristic that is common; in life, it means something shared by a majority.

For example, a large language model like OpenAI’s ChatGPT “learns” grammatical patterns and uses them to generate human-like sentences. AI recruiting systems analyze patterns in the resumes of high-performing employees and look for similar traits in applicants.

Similarly, AI image screening tools used in medical diagnosis are trained on thousands of images depicting a specific condition, allowing them to detect similar features in new images. All these systems identify and reproduce majority patterns.

So if you write like most, work like most, and get sick like most, then AI is your friend. But if you deviate in any way from the majority patterns in the data and the models, you become an outlier, and over time you become invisible. Unhirable. Untreatable.

Women of color have known this for a long time and have exposed AI biases in image recognition and medical treatments. In my own work, I have looked at how AI systems fail to properly identify and provide opportunities to women with Down syndrome, people living in low-income neighborhoods, and women who are victims of domestic violence.

In light of this growing body of evidence, it is surprising that we have not yet fully recognized that bias is not a flaw in AI systems. It’s a feature.

Bias is the challenge

Without specific interventions designed to promote fairness, identify and protect outliers, and make AI systems accountable, this technology threatens to undo decades of progress toward non-discriminatory, inclusive, fair, and democratic societies.

Nearly every effort to combat inequality in our world is currently being eroded by the AI systems used to decide who gets a job, a mortgage, or medical treatment, who gains access to higher education, who pays bail, who is fired, and who is accused of plagiarism.

And it could be worse: history tells us that the road to authoritarianism is paved with discriminatory practices and the creation of a majority “us” versus a minority “them.”

We rely on systems built to identify majorities and replicate them at the expense of minorities. And that has an impact on everyone. Each of us may be a minority in specific contexts: you may have a majority skin color, but a minority combination of symptoms or medical history, and thus still be invisible to the systems that decide who receives medical treatment. You may have the best professional qualifications, but that gap in your resume, or that unusual name, makes you an outlier.

This doesn’t mean we shouldn’t use AI. But we cannot and should not deploy AI tools that do not protect outliers.

Bias in AI is like gravity for the aerospace industry. For aircraft manufacturers, gravity is the biggest challenge to overcome. If your plane can’t handle gravity, you don’t have a plane.


For AI, that challenge is bias. And for the technology to get off the ground safely, its developers and implementers must build mechanisms that mitigate the irresistible force of the average, the common, the pattern.

As an outlier, working in this space is not just a gift, it’s a responsibility. I am privileged to stand alongside trailblazing women like Cathy O’Neil, Julia Angwin, Rumman Chowdhury, Hilke Schellmann, and Virginia Eubanks, whose groundbreaking work exposes how current AI dynamics and priorities are failing innovation and society.

But more importantly, my work in AI allows me to honor the little me I once was. The clumsy, lost girl who was given the opportunity to defy and beat the odds because they weren’t set in algorithmic stone.

That’s why reclaiming choice and opportunity from AI should not be a technical discussion, but the struggle of our generation.

This article first appeared on Context, powered by the Thomson Reuters Foundation.
