The entire concept of using statistical algorithms to 'predict crime' is wrong. It's just a kind of stereotyping.
What needs to happen is a consideration of the social-justice outcomes if 'profiling algorithms' become widely used. Just as in any complicated system, you cannot simply assume reasonable-looking rules will translate to desirable emergent properties.
It is ethically imperative to aim to eliminate disparities and social inequalities between races, even if, and this is what is usually left unsaid, judgments become less accurate in the process.
Facts becoming common knowledge can harm people, even if they are true. Increasingly accurate profiling will have bad effects at the macro scale, and keep marginalized higher-crime groups permanently marginalized. If it were legal to use all the information to hand, it would be totally rational for employers to discriminate against certain groups on the basis of a higher group risk of crime, and that would result in those groups being marginalized even further. We should avoid this kind of societal positive feedback loop.
If you accept that government should want to avoid a segregated society, where some groups of people form a permanent underclass, you should avoid any algorithm that results in an increased differential arrest rate for those groups, even if that arrest rate is warranted by actual crimes committed.
"The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible. Reliance on the affect heuristic is common in politically charged arguments. The positions we favor have no cost and those we oppose have no benefits. We should be able to do better."
–Daniel Kahneman, Nobel laureate, in Thinking, Fast and Slow, chapter 16
> It is ethically imperative to aim to eliminate disparities and social inequalities between races, even if, and this is what is usually left unsaid, judgments become less accurate in the process.
Why? Why is it 'imperative' to be wrong?
> Facts becoming common knowledge can harm people, even if they are true.
Well, they can harm people who, statistically speaking, are more likely to be bad.
If anything, I see accurate statistical profiling being helpful to black folks. Right now, based on FBI arrest data, a random black man is 6.2 times as likely to be a murderer as a random white man; a good statistical profiling algorithm would be able to look at an individual black man and see that he's actually a married, college-educated middle-class recent immigrant from Africa, who lives in a low-crime area — and say that he's less likely than a random white man to be a murderer.
Perhaps it could even look at an individual black man, the son of a single mother from the projects, and see that he's actually not like others whom those phrases would describe, because of other factors the algorithm takes into account.
> If you accept that government should want to avoid a segregated society, where some groups of people form a permanent underclass, you should avoid any algorithm that results in an increased differential arrest rate for those groups, even if that arrest rate is warranted by actual crimes committed.
That statement implies that we should avoid the algorithm 'arrest anyone who has committed a crime, and no-one else,' because that algorithm will necessarily result in increased differential arrest rates. On the contrary, I think that algorithm is obviously ideal, and thus any heuristic which leads to rejecting it should itself be rejected.
> a random black man is 6.2 times as likely to be a murderer as a random white man
But I bet the likelihood of a random man, or random person, being a murderer is so low that "6.2 times" doesn't really tell you much about the underlying data.
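The point about relative vs. absolute risk can be made concrete with a toy calculation. The base rate below is a made-up round number, not an actual FBI statistic; only the 6.2 ratio comes from the thread. The sketch shows how a large relative risk can still leave the absolute probability tiny for everyone:

```python
# Hypothetical base rate: assumed probability that a random white man
# is a murderer. This number is illustrative, NOT real data.
base_rate_white = 0.00005

# Relative risk ratio quoted upthread.
relative_risk = 6.2

# Implied absolute probability for the higher-risk group.
base_rate_black = base_rate_white * relative_risk

print(f"P(murderer), group A: {base_rate_white:.4%}")   # 0.0050%
print(f"P(murderer), group B: {base_rate_black:.4%}")   # 0.0310%
print(f"P(not a murderer), group B: {1 - base_rate_black:.4%}")
```

Even at 6.2x the (assumed) baseline, the absolute probability is about 0.03%, i.e. roughly 99.97% of the higher-risk group are not murderers — which is why a relative-risk headline says little about the underlying data.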
> If anything, I see accurate statistical profiling being helpful to black folks.
But that's the thing... These statistical models aren't accurate, and we know that they aren't. The algorithms used for sentencing or predictive policing are just reflections of the creators, or more likely, the current legal system.
> It is ethically imperative to aim to eliminate disparities and social inequalities between races
Agreed 100%
> even if...judgments become less accurate in the process.
Absolutely not. You are actually advocating that because someone may come from a "marginalized higher-crime group" that they not be punished accurately should they commit a crime?
> If you accept that government should want to avoid a segregated society, where some groups of people form a permanent underclass
I agree that society should want that, but I don't agree that the government should act as a force to impose social justice upon society.
> you should avoid any algorithm that results in an increased differential arrest rate for those groups, even if that arrest rate is warranted by actual crimes committed.
Literally saying that someone from a "marginalized higher-crime group" should get away with criminal activity.
How is it "accurate" to punish someone not based on their own actions, but because they share traits with a particular group of people?
Prejudice doesn't become acceptable just because it's based on a complicated algorithm and a lot of data. Judge people for what they did instead of pre-judging them because they seem similar to a group you don't like.
The problem with stereotyping isn't stereotyping per se; it's that it makes people lazy, so they don't bother to take new information into account when it becomes available.