Thinking About Algorithmic Transparency
Jennifer A. Stark, Ph.D.
Nicolas Diakopoulos, Ph.D.
Blind Acceptance, "Not our fault"
What are Engineers told? What information are they given?
Find the algorithm or organisation at fault
Find that the algorithm reveals embedded societal disparities
Typically the two are tightly related
Difficult to identify a solution
"Because with Uber there is no destination discrimination — no refusals based on what you look like or where you live."
– Uber Under the Hood, Medium
– Via Uber's Developer API, late 2016
Average UberX estimated wait times
Percent time spent surging (Proportion)
Average Surge price multiplier per census tract
- During surge only
- During non-surge
- Just before surge (trigger)
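The "percent time spent surging" metric above can be sketched as a simple aggregation over repeated API samples: for each census tract, the share of sampled time points with a surge multiplier above 1.0. A minimal illustration with made-up tract IDs and multipliers (the real analysis polled Uber's Developer API per tract):

```python
from collections import defaultdict

def percent_time_surging(samples):
    """For each census tract, compute the proportion of sampled
    time points whose surge multiplier exceeded 1.0."""
    counts = defaultdict(lambda: [0, 0])  # tract -> [surging, total]
    for tract, multiplier in samples:
        counts[tract][1] += 1
        if multiplier > 1.0:
            counts[tract][0] += 1
    return {t: surging / total for t, (surging, total) in counts.items()}

# Hypothetical samples: (census tract ID, surge multiplier at that moment)
samples = [
    ("tract_001", 1.0), ("tract_001", 1.4), ("tract_001", 1.0), ("tract_001", 1.0),
    ("tract_002", 2.1), ("tract_002", 1.8), ("tract_002", 1.0), ("tract_002", 1.3),
]
print(percent_time_surging(samples))  # tract_001: 0.25, tract_002: 0.75
```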
UberX Overall Average Wait Times
What information can we gather to reproduce wait times across D.C.?
Information to Explore - Census
Population => density
Poverty => %
Median Household Income
Race / Ethnicity => dichotomised to % POC
Run a multiple regression analysis.
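The regression step could look like the following sketch: ordinary least squares relating tract-level average wait time to the census variables listed above. All numbers are illustrative, not the study's data, and numpy's `lstsq` stands in for whatever statistical package was actually used:

```python
import numpy as np

# Hypothetical tract-level predictors: each row is one census tract.
# Columns: population density, % poverty, median income ($1000s), % POC
X = np.array([
    [12.0,  8.0, 95.0, 20.0],
    [ 9.5, 15.0, 60.0, 45.0],
    [ 4.2, 25.0, 40.0, 70.0],
    [ 7.8, 18.0, 55.0, 55.0],
    [11.1, 10.0, 85.0, 30.0],
    [ 6.0, 22.0, 48.0, 62.0],
])
y = np.array([3.2, 4.5, 6.8, 5.1, 3.6, 6.0])  # average wait time (minutes)

# Add an intercept column, then fit ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(dict(zip(["intercept", "density", "poverty", "income", "poc"], coef)))
```

The fitted coefficients then indicate how each census variable is associated with wait time once the others are held constant.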
“Cycles, Systems, & Loops”
Transient Population (where people go)
Unbanked (no credit card = barrier to the Uber app)
Demand (Proxy: Taxi data)
Risk - Proxy: Violent Crime => 5yrs, density
- Proxy: Non-violent Crime => 5yrs, density
- Proxy: Google Places API => density
Spatial Regression - spatial adjacency
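One way to account for spatial adjacency in the regression is a spatially lagged variable: a row-standardised adjacency matrix W averages each tract's neighbours, and W·y can enter the model as an extra predictor. A toy sketch with an invented 4-tract adjacency:

```python
import numpy as np

# Hypothetical adjacency among 4 census tracts (1 = shares a border).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

# Row-standardise: each row sums to 1, so W @ y is the mean over neighbours.
W = A / A.sum(axis=1, keepdims=True)

y = np.array([3.2, 4.5, 6.8, 5.1])  # e.g. average wait time per tract
spatial_lag = W @ y                  # neighbourhood average for each tract
print(spatial_lag)
```

A tract whose wait time differs sharply from its spatial lag suggests something local (not just its neighbourhood) is driving the difference.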
Wait times reflect questionable ethics (bad Uber!?)
- Ineffective control of drivers
Wait times reveal disadvantaged communities (bad Government!?)
Uber policy? – Change algorithm? Change driver-incentives? Make data available?
Government policy? – Ban Uber? Regulate Uber? Invest in those communities so they can participate in the technological future?
Public attitude? – informed decisions
And also …
Encourage better documentation and code commenting (good for future you and colleagues)
Build trust with readers
Catalyse new projects