If you’ve ever felt like your phone was reading your mind, you’re not alone. But for a lot of people, the way technology "reads" them is far more sinister than a weirdly specific ad for sneakers. It’s about being flagged by an algorithm for a crime you didn't commit, or being denied a loan by a piece of code that doesn't understand your life. This is the heart of Coded Justice by Stacey Abrams, which explores how the invisible math running our world is often just as biased as the people who wrote it. Honestly, it’s a bit of a wake-up call. We like to think of computers as objective, logical entities that can't be racist or sexist because they don't have feelings. But Stacey Abrams, alongside experts like Joy Buolamwini, points out that if you feed a machine biased data, you’re going to get a biased result. Every single time.
The Myth of the Neutral Machine
We’ve been sold this lie that data is neutral. It isn't. Data is just a collection of our past, and our past is messy. When Stacey Abrams talks about coded justice, she's highlighting that software is a reflection of its creators. If the rooms where these algorithms are built lack diversity, the code will have blind spots. Big ones.
Take facial recognition. It’s a huge part of the Coded Justice by Stacey Abrams conversation. The "Gender Shades" study led by Joy Buolamwini, who went on to found the Algorithmic Justice League, found that commercial facial-analysis systems had an error rate of less than 1% for lighter-skinned men. For darker-skinned women? The error rate spiked to nearly 35%. Think about that. If a police department uses that tech to identify suspects, who do you think is getting wrongly arrested? It’s not a hypothetical problem; it’s happening. Abrams argues that this isn't just a "glitch." It’s a systemic failure. When we outsource our justice system to unproven black-box algorithms, we aren't making things more efficient. We're just automating inequality.
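To make that concrete, here is a minimal sketch of the kind of per-group audit that surfaces these gaps: tally errors separately for each demographic subgroup instead of reporting one blended accuracy number. The counts below are invented placeholders that roughly echo the disparities described above, not the actual Gender Shades data.

```python
# Sketch: computing per-subgroup error rates for a face-analysis audit.
# The counts below are illustrative placeholders, not the published data.
results = {
    # subgroup: (misclassified, total evaluated)
    "lighter-skinned men":   (3, 500),
    "lighter-skinned women": (35, 500),
    "darker-skinned men":    (60, 500),
    "darker-skinned women":  (170, 500),
}

for subgroup, (errors, total) in results.items():
    print(f"{subgroup:24s} error rate: {errors / total:.1%}")

# A single aggregate accuracy number would hide this spread entirely,
# which is why audits report results per subgroup.
```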
How Bias Actually Gets Into the Code
How does a line of code become "biased"? It's not usually because a programmer is sitting there twirling a mustache and typing in "be mean to people." It’s much more boring and dangerous than that. Machine learning works by looking at patterns in historical data. If you use historical hiring data from a company that hasn't hired a woman in twenty years to train an AI, that AI is going to learn that "being a man" is a requirement for the job. It’ll start docking points from resumes that mention "women’s soccer" or "all-girls college."
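Here is a hypothetical sketch of that failure mode, using synthetic data and an invented "women's activity" proxy feature: train an off-the-shelf classifier on skewed historical hiring decisions and watch it learn to penalize the proxy. Nothing here is a real company's data or model; it just shows the mechanism.

```python
# Sketch: a model trained on skewed historical hiring data learns a gendered
# proxy feature. All data, names, and numbers here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
years_experience = rng.normal(5, 2, n)
womens_activity = rng.integers(0, 2, n)   # e.g. resume mentions a women's team

# Historical outcomes: qualified candidates were hired far less often
# when the proxy feature was present.
p_hired = 0.7 * (years_experience > 4) * np.where(womens_activity == 1, 0.3, 1.0)
hired = rng.binomial(1, p_hired)

X = np.column_stack([years_experience, womens_activity])
model = LogisticRegression().fit(X, hired)

print("weight on years_experience:", round(model.coef_[0][0], 2))
print("weight on women's-activity proxy:", round(model.coef_[0][1], 2))
# The proxy weight comes out strongly negative: the model has absorbed the
# historical bias as if it were a job requirement.
```

The specific model doesn't matter; any learner rewarded for reproducing those historical decisions will pick up the same pattern.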
Abrams focuses heavily on the voting rights and civic engagement side of this. In her work with organizations like Fair Fight, she’s seen how data can be used to suppress voices. Whether it’s how district lines are drawn using biased software or how voter rolls are "purged" using algorithms that flag names incorrectly, the tech is being used as a barrier. It’s a digital fence.
Real Stakes in the Real World
This stuff matters because it’s moving out of our screens and into our physical lives. It affects where police are sent (predictive policing), who gets bail, and who gets a mortgage.
- Healthcare: Algorithms used to predict which patients need extra care have been found to favor white patients over Black patients because the code used "health care costs" as a proxy for "health needs." Since less money has historically been spent on Black patients due to systemic barriers, the AI treated them as "healthier" and withheld the extra help (see the sketch after this list).
- Housing: Digital "redlining" is real. Algorithms can target or exclude specific demographics from seeing housing ads without ever using a "race" variable. They just use "neighborhood" or "shopping habits" as a stand-in.
- Law Enforcement: In 2020, Robert Williams was arrested in his driveway in front of his kids because a facial recognition algorithm incorrectly matched his driver's license photo to grainy surveillance footage. It was a "hit" that shouldn't have been.
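To see how the healthcare example above goes wrong, here is a toy sketch, under the assumption that the system flags patients for an extra-care program based on past spending rather than measured need. Every name, number, and threshold is invented.

```python
# Sketch: when "predicted cost" stands in for "health need", groups that
# historically received less spending look artificially healthy.
# All figures are invented for illustration.

patients = [
    # (name, chronic_conditions, historical_annual_spend)
    ("Patient A", 4, 12000),   # group with full access to care
    ("Patient B", 4, 6500),    # same need, but less was spent on their care
]

SPEND_THRESHOLD = 10000  # cost-based cutoff for the "extra care" program

for name, conditions, spend in patients:
    flagged_by_cost = spend >= SPEND_THRESHOLD   # what the algorithm did
    flagged_by_need = conditions >= 3            # what it was supposed to capture
    print(f"{name}: cost-based flag={flagged_by_cost}, need-based flag={flagged_by_need}")

# Same underlying need, different outcome: the cost proxy reproduces the
# historical spending gap instead of measuring who actually needs care.
```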
Abrams isn't saying we should smash the computers and go back to the Stone Age. She's a pragmatist. She’s saying we need oversight. We need a "Bill of Rights" for the digital age. Without it, coded justice remains a warning rather than a reality.
The Push for Legislative Guardrails
You can't just ask tech companies to "be better." It doesn't work. The profit motive is too strong, and the "move fast and break things" culture is baked into Silicon Valley. Stacey Abrams has been vocal about the need for federal regulation. We need laws that require companies to audit their algorithms for bias before they’re deployed in public spaces.
The Algorithmic Accountability Act is one such attempt. It would require large companies to assess their automated decision systems for impacts on privacy, bias, and civil rights. Abrams’ perspective adds a layer of urgency because she connects it directly to the survival of democracy. If people can’t trust the systems that govern their lives, from the ballot box to the courtroom, the whole thing starts to unravel.
What We Get Wrong About AI Fairness
A lot of people think "fairness" means making the AI colorblind. "Just take the race data out!" they say. In practice, that often backfires. If you ignore race, the model will simply find "proxy" variables that correlate with race (think neighborhood or shopping habits, as in the redlining example above) and use those instead. To fix the bias, you have to acknowledge it. You have to actively balance the data. You have to tell the machine: "Hey, the historical data is skewed, so you need to weight these factors differently."
It’s a bit like fixing a tilted house. You don’t just pretend the floor is level; you have to actively prop up the side that’s sinking.
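One concrete version of "propping up the sinking side" is the reweighing idea described by Kamiran and Calders: keep the sensitive attribute visible, then weight the training examples so that group membership and the historical outcome no longer move together. This is just one technique among several, and the sketch below uses synthetic data; real fairness work layers auditing and domain judgment on top of it.

```python
# Sketch of the "reweighing" idea (Kamiran & Calders): instead of dropping the
# sensitive attribute, weight each (group, outcome) combination so that group
# membership and the historical outcome look independent in the training data.
# All data here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
group = rng.binomial(1, 0.3, n)                          # 1 = historically disadvantaged group
label = rng.binomial(1, np.where(group == 1, 0.2, 0.6))  # skewed historical outcomes

p_group = {g: np.mean(group == g) for g in (0, 1)}
p_label = {y: np.mean(label == y) for y in (0, 1)}

weights = np.empty(n)
for g in (0, 1):
    for y in (0, 1):
        mask = (group == g) & (label == y)
        expected = p_group[g] * p_label[y]   # what independence would imply
        weights[mask] = expected / mask.mean()

# Under these weights, both groups end up with the same positive rate,
# so a model trained with them can't simply replay the historical skew.
for g in (0, 1):
    m = group == g
    print(f"group {g}: raw positive rate {label[m].mean():.2f}, "
          f"weighted positive rate {np.average(label[m], weights=weights[m]):.2f}")
```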
The Path Toward Truly Just Tech
So, what do we do? It’s easy to feel helpless when the "algorithm" feels like a god-like force. But code is written by people, and people can be held accountable. Abrams emphasizes that "Coded Justice" is a choice we have to make. It involves:
- Transparency: We should have a right to know when an algorithm is making a decision about us. No more "secret sauce" explanations for why your insurance premium went up.
- Recourse: If a machine denies you a loan, there needs to be a human being you can talk to who has the power to overrule the machine.
- Representative Data: We need to stop using "convenience data" (the easiest data to find) and start using "representative data" (data that actually reflects the people a system will affect); a rough sketch of that kind of check follows this list.
- Diverse Engineering: More women, more people of color, and more people from different socioeconomic backgrounds need to be building these tools.
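As an illustration of the representative-data point above, here is a small sketch of the kind of sanity check a team could run before training: compare the dataset's demographic mix against a reference distribution such as census figures. The group names and numbers are placeholders, not real data.

```python
# Sketch: a quick representativeness check comparing a training set's
# demographic mix against a reference distribution (e.g. census figures).
# Both sets of numbers here are invented placeholders.

reference_share = {   # who the system will actually serve
    "group_a": 0.60,
    "group_b": 0.25,
    "group_c": 0.15,
}

dataset_counts = {    # who happens to be in the "convenient" training data
    "group_a": 8200,
    "group_b": 1500,
    "group_c": 300,
}

total = sum(dataset_counts.values())
for group, target in reference_share.items():
    actual = dataset_counts[group] / total
    status = "under" if actual < target else "over"
    print(f"{group}: dataset {actual:.1%} vs reference {target:.1%} ({status}-represented)")
```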
[Image showing the importance of diverse datasets in training AI models]
Actionable Steps for the Rest of Us
You don't need to be a software engineer to participate in this. In fact, some of the most important work happens at the community level.
Check your local laws regarding facial recognition. Many cities—like San Francisco and Boston—have already banned or restricted its use by police. If your city hasn't, that's a conversation to start with your local representatives.
Support organizations like the Algorithmic Justice League or Fair Fight. These groups are on the front lines, doing the technical audits and the legal legwork that exposes where the code is failing us.
Demand "Explainable AI." When a company uses automated decision-making, ask for the "why." If they can't give you a clear, human-readable answer, that's a red flag. We shouldn't accept "the computer said so" as a valid legal or financial justification.
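For a sense of what a "clear, human-readable answer" can look like, here is a toy sketch of a per-decision explanation from a simple linear scoring model. The model, feature names, weights, and applicant values are all invented; the point is only the shape of the answer, not how any real lender works.

```python
# Sketch: what a human-readable "why" can look like for one decision from a
# simple linear scoring model. Model, features, and values are invented.

model_weights = {
    "debt_to_income_ratio": -3.0,
    "years_of_credit_history": 0.4,
    "recent_missed_payments": -1.5,
}
applicant = {
    "debt_to_income_ratio": 0.45,
    "years_of_credit_history": 2,
    "recent_missed_payments": 1,
}
intercept = 1.0

score = intercept + sum(model_weights[f] * applicant[f] for f in model_weights)
print(f"decision score: {score:.2f} (approved if above 0)")
for feature, weight in model_weights.items():
    print(f"  {feature}: contributed {weight * applicant[feature]:+.2f}")
# An answer at this level (which factors mattered, and by how much) is the
# minimum "why" worth accepting; "the computer said so" is not.
```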
Read more about the intersections of race and tech. Books like Algorithms of Oppression by Safiya Umoja Noble or Race After Technology by Ruha Benjamin are great deep-dives that complement the themes in Coded Justice by Stacey Abrams.
Ultimately, technology is a tool. Like a hammer, it can be used to build a house or it can be used to break one down. Coded justice is about making sure we’re building something that actually has room for everyone. It's about ensuring that as we move into an increasingly automated future, we don't leave our humanity—or our civil rights—behind in the analog past.