A recent Radiolab episode considered the ethical implications of self-driving cars. These vehicles are usually lauded for how safe they’ll be: human error is removed, and precise programs will safely coordinate the high-speed movement of literal tons of metal and glass.
But a complication arises: even with this technology, it’s not possible to avoid all loss of life, so cars need to be programmed to minimize deaths when preventing them isn’t an option. Sometimes, for example, they’ll have to “decide” whether to run into a tree (likely killing the driver) or into a pedestrian, or whether to slam into a bus full of kids or into a four-door sedan. And because these programs will be written by humans, the developers have to do some moral math: how many kids outweigh one adult? What about a family vs. a group of friends? Locals vs. out-of-towners? Someone who looks healthy or someone who looks ill? And take it further: A CEO or a janitor? A woman or a man? A black person or a white person? Right now, there are no industry standards for how to write this decision-making into self-driving technology. But in speaking to Radiolab, Carnegie Mellon professor Raj Rajkumar made clear just who should not have the ultimate say:
We do not think that any programmer should be given this major burden of deciding who survives and who gets killed. I think these are very fundamental, deep issues that society has to decide at large. I don’t think a programmer eating pizza and sipping Coke should be making that call.
Sara Wachter-Boettcher would agree. In her book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, Wachter-Boettcher looks at who is designing the websites, apps, and products we use every day in every area of our lives, and what that means for users of tech. The industry, she writes, is run by just one group of people: young, privileged white men.
This is not exactly news. Rajkumar’s pizza-and-Coke programmer pretty much nails the stereotype of “tech bros” (sometimes referred to as “brogrammers”), a stereotype Wachter-Boettcher confirms. She describes a culture dominated by a group of highly educated, high-achieving men whose employment in the tech industry gives them license to do whatever they want. The result is “a group of mostly white guys from mostly the same places [that] believes it deserves to be at the top.” She describes a hyper-sexualized male environment that devalues the contributions of women. Recent reporting details drug-fueled parties and rampant sexual harassment.
Wachter-Boettcher notes that tech companies have made a show of increasing diversity, but that little has come of these efforts. She explains that tech companies blame “the pipeline,” claiming that not enough women or people of color apply, and so it’s not their fault that their hires aren’t diverse. But the country’s top universities graduate black and Hispanic computer science and computer engineering students at twice the rate that they’re hired by tech companies. One black woman who hasn’t been able to find a tech job laments, “Instead of putting in the effort to look for us, Facebook is ignoring the fact that we even exist.”
This is bad enough. But Wachter-Boettcher is concerned with something else: how this lack of diversity affects the tech products we use, and how our societal biases are reinforced by their use. Having worked in the tech industry herself, she observes, “The more I started paying attention to how tech products are designed, the more I started noticing how often they’re full of blind spots, biases, and outright ethical blunders—and how often those oversights can exacerbate unfairness and leave vulnerable people out.” Because the industry is dominated by one very particular type of person, its designers find it easy to overlook the needs and concerns of other groups of people.