Chapter 1

What is Data Capitalism?

Big companies gain more and more power as the everyday person loses out.


Chapter 2

Slavery, the Origin Story

Since slavery, corporations have used data to maintain racial inequality.


Chapter 3

Stories of Resistance

Data capitalism today has a long history, but people are fighting back.


Epilogue

Take Action

Learn what you can do, read more, and share this site.


This Data for Black Lives project is based on Data Capitalism and Algorithmic Racism by Data for Black Lives and Demos.


Can we overcome data capitalism?

Yes! Getting here took intentional actions, decisions, and policies. So we can use intentional actions, decisions, and policies to get us out.

The fight has already begun.

Here are three stories about how organizations and individuals are breaking the loop between corporate profits and our data.

Choose one of the stories below to learn how workers refused to be tracked by Amazon, how people took Facebook to court over racial discrimination in ad targeting, or how journalists uncovered racist algorithms in the car insurance market.

Workers refusing to be tracked by Amazon.

Adrienne Williams, a former delivery driver for Amazon, has been raising the alarm about Amazon’s invasive employee tracking practices since Spring 2020.

Amazon requires delivery drivers like Adrienne to download and continuously run a smartphone app called Mentor. Mentor tracks drivers and scores them against undisclosed criteria. Numerous drivers say the scores can be faulty and difficult to contest. Yet this single score determines a great deal, including pay, for the predominantly Black and brown field and customer support workers.

Former Amazon delivery driver, Adrienne Williams, holding a protest sign that reads, “I should be able to pay rent if I work full time.”
A phone screen overlaying a sharecropper’s ledger and the Mentor app. Screen reads “risky driving score”.

Amazon’s constant tracking of their workers evolved from centuries of dehumanizing practices, practices that began in slavery. There is no doubt that slavery was incomparable in its brutality. Yet these practices continue to serve as a blueprint.

For example, after slavery, many recently freed slaves became sharecroppers. Though sharecroppers were technically free, sharecropping created a new system of entrapment.

Landowners would track sharecroppers' every activity, from how much they produced to how much they sold. Then they would add in extra expenses. Sometimes sharecroppers were not told what those expenses were until it came time to settle up, keeping them guessing and unable to control their circumstances. These calculations often left sharecroppers with more debt than profit, indebted to the landowner in perpetuity.

A Black sharecropper from 1937 plowing behind an Amazon delivery truck with an upside down Amazon arrow.

Today, apps like Mentor help big companies find ways to underpay productive workers. Mentor is glitchy, dinging drivers for things completely outside their control, like getting a call from a family member, even if they did not answer it. Drivers say low scores can result in disciplinary action, being taken off the work schedule, losing access to bonuses, and being removed from optimal delivery routes.

All of this is intentional. Amazon’s business model is based on controlling their workers, using data to supervise workers’ schedules down to every minute. It is no surprise that Amazon workers describe feeling completely dehumanized.

RWDSU union members and five U.S. House Representatives standing with a pro-union “Stronger Together” sign. In the background, an Amazon fulfillment building.

Amazon workers like Adrienne Williams are fed up. In May 2020, Adrienne organized a protest outside Amazon’s warehouses in Richmond, CA to demand change.

Adrienne is not alone. Amazon workers across the country are organizing. Over the past few years, Amazon workers from Minneapolis, Minnesota to Bessemer, Alabama have been leading the movement to demand a seat at the table.


In the fight against data capitalism, these efforts, led by Black workers, redistribute power into the hands of the people.

A fist symbolizing black power overlaid on a green and orange circle.

Taking Facebook to court over racist ad targeting.

A judge’s gavel and block, engraved with the Facebook logo, sitting on top of page eighty-two of the Fair Housing Act.

A key part of Facebook’s business model is selling targeted ads. In theory, this means companies advertise to only those consumers who are most likely to buy their product. In practice, Facebook’s filters allow advertisers to act upon the same racial prejudices in advertising that have existed throughout our history.

When people discovered this, they were outraged. Several organizations and individuals filed class action lawsuits against Facebook for discriminatory advertising practices.

Racial discrimination in advertising is not new. For roughly a century after the Civil War, Jim Crow laws and segregation practices allowed businesses to advertise in explicitly racist ways, like posting "whites only" signs in their windows. The size and scope of Facebook's advertising reach make it dangerous on a different scale.

A sign from the 1930s reading “Help Wanted White Only” inside a Facebook ad frame. Outside the frame, a Black person looks towards the sign.

Facebook’s platform allowed advertisers to exclude specific people from seeing an ad based on “ethnic affinity,” effectively a category for race. What’s more, the ethnic affinity category included options only for people of color, like “African American,” “Asian,” or “Hispanic.” White people were the unnamed default.

Regardless of whether you tell Facebook your race or ethnicity, Facebook collects your data (pictures, likes, groups) and uses a predictive algorithm to guess your race.

In March of 2019, Facebook settled the class action lawsuits about discrimination in advertising. In the settlement, they agreed to no longer allow advertisers to filter ads based on race, gender, or age.

Legal action has made a difference, but for lasting change we need oversight and regulation to prevent companies from profiting off of racism.

A fist symbolizing black power overlaid on a blue and yellow circle.

Uncovering racist algorithms in the car insurance market.

In 2017, journalists speaking to residents of Chicago started to notice an alarming trend.

In East Garfield Park, a primarily Black neighborhood, a Black resident was being charged $190.69 a month by Geico for car insurance. Across town in Lakeview, a whiter, more commercial part of Chicago, another Chicagoan paid $54.67—even though the vehicle crime rate in Lakeview is higher than in East Garfield Park.

A redlined map of Chicago from 1940 with a gecko, evocative of the Geico gecko, sitting on top. A circle in red marker is drawn over the map and gecko.

Car insurance companies are not transparent about the algorithms they use, but investigators at ProPublica and Consumer Reports got their hands on enough car insurance data to show that the discrepancies in prices between Black and white neighborhoods are too wide to be explained by objective risk factors alone.

A Google map of Chicago with a rusty car overlaid on top. A red marker line divides the map in two diagonally; the bottom left side is in black and white while the top right side is in color. On the map, East Garfield Park is circled in red marker.

The reporting by ProPublica and Consumer Reports showed that zip codes were a factor in the algorithms that determine insurance rates. We know that zip code data is laced with the legacy of redlining and segregation.

From the 1930s to 1970s, banks used redlining to deny Black people loans, marking Black neighborhoods in red to designate them as too risky for loans. Denying Black communities access to loans in the past led to the racial wealth gap and continued residential segregation that exists today. Using zip code to determine insurance rates perpetuates the legacy of redlining’s racism.

Because car insurance companies do not willingly share their algorithms with regulators or the public, it is hard to demand changes that address the algorithms' racist outcomes. We don't know whether zip codes are the only variable producing racist results or whether other factors contribute to these disparities.

A flashlight shines a light on a gecko, evocative of the Geico gecko, holding a red sharpie.

ProPublica and Consumer Reports have called out the link between corporate profits and our data, bringing us one step closer to breaking that link.

By knowing these algorithms exist and how they affect us, we can demand more transparency and democratic control over their application and impact.

A fist symbolizing black power overlaid on a green and yellow circle.
Photograph from the 1963 March on Washington, where marchers are seemingly marching out of a computer screen with the words “taking our power back” in a search bar.

Reading this website is just the beginning.

Questioning the link between corporate profits and our data is the first step. By reading this website, you have already taken it!

Fighting back requires collective effort: staying conscious, curious, and active in the struggle against data capitalism. Help others learn about data capitalism by sharing this website.

What else can I do?

This is a big problem and unfortunately there aren’t simple answers. Systemic problems require systemic change.

What’s next must be defined by all of us. When we share the website, the next step is the conversation we have with each other.

What can we all do together? What are the small things? What are the wildest ideas we can imagine? What would it look like for us to have control over our data and the algorithms in our lives?

Looking for more specific solutions?

Read our full report. The report gives even more examples of how data capitalism and its racist history affect us today, and includes dozens of technical and policy ideas.

About

This website is based on the report Data Capitalism and Algorithmic Racism by Data for Black Lives and Demos.

Data for Black Lives is a movement of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people.

Website content, design, and illustrations created by Akina Younge, Deepra Yusuf, Elyse Voegeli, and Jon Truong. In creating this site, we aimed to meet AA accessibility standards so that the site can be enjoyed by everyone.

Report design, style, and illustrations created by David Perrin, Senior Designer at Demos.

Special thanks to all the users and Data for Black Lives staff who helped test and improve this website throughout its development.

Love this project? Stay in touch! Data for Black Lives can be reached at info@d4bl.org.

Photos used in illustrations are from the National Archives, Library of Congress, Wikimedia Creative Commons, Flickr Commons, Pexels, Unsplash, Ketut Subiyanto, J. P. Moquette, Jernej Furman, Virginie Goubier, Norma Mortenson, Florida State University Digital Repository’s Florida Manuscript Materials R.F. Van Brunt Store Day Book, the Dorothea Lange Farm Security Administration collections, Tony Webster, Terri Sewell, Tingey Injury Law Firm, govinfo, Ferris State University’s Jim Crow Museum, CLAIN Dominique, Kara Zelasko, Dan Gold, Kelly Sikkema, Steve Johnson, and Michaela Kliková.

Data for Black Lives logo. Links to the Data for Black Lives site.