Ethics in Tech: Moral Algorithms & Self-Driving Cars

[Image: Interior of a self-driving car with a smart dashboard and windshield while a person reads a book]

Nassim Parvin, Associate Professor of Digital Media in Tech’s School of Literature, Media, and Communication and Director of Design and Social Justice Studio

In the early 20th century, when children were injured by cars, the ensuing court cases ruled overwhelmingly in favor of car owners and drivers, blaming the incidents on the mothers' negligence.

“Well, of course it’s the mother’s fault!” says Nassim Parvin sarcastically. Parvin is an associate professor of Digital Media in Georgia Tech’s School of Literature, Media, and Communication, where she also directs the Design and Social Justice Studio.

“But you can draw a straight line from those decisions to the fact that it’s too dangerous for children to play in the streets now.” Cars and car companies have been given a lot of power over our streets and public spaces, and we continue to cede that power today with the advent of autonomous vehicles. Self-driving cars may soon have the right to make life-and-death decisions, such as braking to avoid a pedestrian even when doing so puts the people in the car at risk. A car makes these decisions through moral algorithms programmed into its software. And according to Parvin, that gives the car too much power in too critical a scenario.

“Ethical situations are, by nature, ambiguous, but moral algorithms depend on certainty and clear rules to make decisions. They shouldn’t handle these ambiguous cases, such as the literal life-or-death situations of self-driving cars, or things like who should get health insurance or who is eligible for a loan, which also affects people’s lives.”
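To make that concern concrete, here is a minimal, entirely hypothetical sketch in Python (not drawn from Parvin's work or from any real vehicle's software) of the kind of rule-based decision function the quote describes. The Scene fields, the 0.8 threshold, and the action names are invented for illustration; the point is that code like this must collapse an ambiguous situation into fixed rules and a single discrete output.

```python
# Hypothetical sketch of a rule-based "moral algorithm" for illustration only.
# It shows how clear rules and hard thresholds stand in for ethical judgment.

from dataclasses import dataclass

@dataclass
class Scene:
    pedestrian_ahead: bool          # a perception system's best guess, not a certainty
    occupants_at_risk_if_brake: bool
    confidence: float               # how sure the sensors are, from 0.0 to 1.0

def choose_action(scene: Scene) -> str:
    """Return exactly one discrete action, even when the inputs are ambiguous."""
    if scene.pedestrian_ahead and scene.confidence > 0.8:
        # A hard-coded threshold: 0.79 and 0.81 confidence lead to different
        # life-and-death outcomes.
        return "emergency_brake"
    if scene.pedestrian_ahead and scene.occupants_at_risk_if_brake:
        # The rule must still resolve to one answer; the ambiguity is erased,
        # not addressed.
        return "swerve"
    return "continue"

print(choose_action(Scene(pedestrian_ahead=True,
                          occupants_at_risk_if_brake=True,
                          confidence=0.79)))   # prints "swerve"
```

In this toy example, a sensor reading of 0.79 versus 0.81 confidence produces different outcomes for the same street scene, which is the kind of manufactured certainty Parvin argues these systems cannot avoid.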

But are there bigger questions here? What if we invested in more public transportation options? What if we traded parking lots and driving lanes for sidewalks, bike lanes, and green spaces that have been shown to improve the physical and civic health of our communities? Can we imagine a future in which kids actually play in the streets?

“It’s a failure of ethical imagination if we say the question about self-driving cars is just a matter of life and death at the intersection,” Parvin says. “You have to think about what it’s like to live in a city where at any moment, you can be the target of a killing algorithm. Is that a city we want to live in? That’s the ethical question.”