These chapters (actually they were last week’s) cover employment. Here’s Bryan’s prompt.
On the hiring side, I’m not sure whether algorithmic arbitrariness or human arbitrariness is worse. I have a sense that, distinct from the expected biases (ethnicity, gender, geography/wealth), algorithms might bias for similarity. That is, they bias against candidates who have the broader skills to do the job but whose previous job titles or majors aren’t a close word-for-word match for the job description. Of course a human might be just as likely to have that bias, but a human who wanted to think “outside the box” could at least be metacognitively aware of it.
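A toy sketch of that similarity bias, with invented names and data (this is an illustration, not how any real screening product works): a scorer that counts word-for-word overlap with the job description will rank a near-verbatim title match above a candidate who describes the same skills in different vocabulary.

```python
def keyword_overlap_score(job_description: str, resume: str) -> float:
    """Fraction of job-description words that also appear in the resume."""
    jd_words = set(job_description.lower().split())
    resume_words = set(resume.lower().split())
    return len(jd_words & resume_words) / len(jd_words)

job = "data analyst sql reporting dashboards"

# Candidate A: previous title matches the posting almost verbatim.
title_match = "data analyst sql reporting experience"
# Candidate B: broader skills, but different vocabulary for the same work.
skills_match = "statistician built database queries and visualization tools"

# The verbatim title match scores far higher, even though candidate B
# may be equally or better qualified for the actual work.
print(keyword_overlap_score(job, title_match))
print(keyword_overlap_score(job, skills_match))
```

Nothing in this sketch understands skills; it only rewards shared surface vocabulary, which is exactly the failure mode described above.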
I found the next chapter, “Sweating Bullets,” more alarming. The core of the problem is that, outside of widget counts for a factory worker or sales volume, the link between what an individual worker does and an institutional KPI is often tenuous. My instinct is that bad algorithms built on second- or third-order proxies make this much worse than a human-based system with safeguards (such as a 360 evaluation).
Did anyone else find the sociometric badge used in the call center (132) seriously creepy?
As to one of Bryan’s questions about whether boycotts can provide a meaningful check on this sort of thing: it seems to me they might work in the public sector, where transparency can be enforced via FOIA, but I have little hope for the private sphere. Boycotts sound good, but they are rarely well enough organized or sustained to provoke real change.
Notes and Quotes
“…we’ve seen time and again that mathematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education. It’s up to society whether to use that intelligence to reject and punish them — or to reach out to them with the resources they need.” (118)
“The root of the trouble, as with so many other WMDs, is the modeler’s choice of objectives. The model is optimized for efficiency and profitability, not for justice or the good of the ‘team.’ This is, of course, the nature of capitalism.” (129-130)
I was struck the other day by how similar three things are: Cory Doctorow’s whuffie system (from Down and Out in the Magic Kingdom), the rating system in the Black Mirror episode “Nosedive,” and the Chinese social credit system I described in last week’s post.